author | Logan Bauman <logan_bauman@brown.edu> | 2022-05-06 23:32:17 -0400
---|---|---
committer | Logan Bauman <logan_bauman@brown.edu> | 2022-05-06 23:32:17 -0400
commit | d802c988a57d6afe4fca979384ba377ecc7edb66 (patch) |
tree | bf604e5da1bee0f2bf1ef16cc67df9a61dede2fa /hyperparameters.py |
parent | 6e5f2d1a62f4f3bf0e87829082b2120ca440ddf0 (diff) |
hi
Diffstat (limited to 'hyperparameters.py')
-rw-r--r-- | hyperparameters.py | 6
1 file changed, 3 insertions(+), 3 deletions(-)
```diff
diff --git a/hyperparameters.py b/hyperparameters.py
index b03db017..460543dc 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -9,17 +9,17 @@
 Number of epochs. If you experiment with more complex networks you
 might need to increase this. Likewise if you add regularization that
 slows training.
 """
-num_epochs = 5000
+num_epochs = 1000
 
 """
 A critical parameter that can dramatically affect whether training
 succeeds or fails. The value for this depends significantly on which
 optimizer is used. Refer to the default learning rate parameter
 """
-learning_rate = 3e-2
+learning_rate = 1e-2
 
 momentum = 0.01
 
 alpha = 1e-2
-beta = 1e-5
+beta = 1e-4
```
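For context, here is a minimal sketch (not from this repository) of how `num_epochs`, `learning_rate`, and `momentum` typically interact in an SGD-with-momentum update, since the docstring notes that the right learning rate depends on the optimizer. The quadratic loss, parameter shapes, and optimizer choice are all assumptions for illustration; `alpha` and `beta` are omitted because the diff does not show how they are used.

```python
# Hypothetical usage sketch: how the values in hyperparameters.py might
# drive a plain SGD-with-momentum training loop. The loss here is a toy
# quadratic, not anything from the actual repository.
import numpy as np

import hyperparameters as hp  # num_epochs, learning_rate, momentum, ...

w = np.zeros(3)           # parameters being trained
velocity = np.zeros(3)    # momentum buffer
target = np.array([1.0, -2.0, 0.5])

for epoch in range(hp.num_epochs):
    grad = 2.0 * (w - target)  # gradient of the toy loss ||w - target||^2
    # Classic momentum update: a larger learning_rate takes bigger steps,
    # which is why the docstring warns it can make training succeed or fail.
    velocity = hp.momentum * velocity - hp.learning_rate * grad
    w += velocity
```

With the committed values (learning_rate = 1e-2, momentum = 0.01, num_epochs = 1000), this loop converges smoothly on the toy loss; the previous 3e-2 rate takes larger steps, which on a real network is the kind of setting that can destabilize training.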