author    | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 15:58:27 -0400
committer | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 15:58:27 -0400
commit    | 00991837cc0bbb62b98ab3024ea795a18cf2dde8 (patch)
tree      | c01379070ee0d3ac105dc0c08692e1de9a37b7be /hyperparameters.py
parent    | 266708a7fdcc0c9de5e64970c69e1722cb76e5b6 (diff)
hi
Diffstat (limited to 'hyperparameters.py')
-rw-r--r-- | hyperparameters.py | 4
1 file changed, 2 insertions, 2 deletions
diff --git a/hyperparameters.py b/hyperparameters.py
index 6c82a745..a15d04ac 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -9,14 +9,14 @@
 Number of epochs. If you experiment with more complex networks you
 might need to increase this. Likewise if you add regularization that
 slows training.
 """
-num_epochs = 7000
+num_epochs = 10000
 
 """
 A critical parameter that can dramatically affect whether training
 succeeds or fails. The value for this depends significantly on which
 optimizer is used. Refer to the default learning rate parameter
 """
-learning_rate = 2e-3
+learning_rate = .002
 
 momentum = 0.01
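For context, a minimal sketch of how a training script might consume these constants. This is hypothetical: the diff only touches hyperparameters.py, and a Keras SGD optimizer is assumed purely for illustration since the docstring says the right learning rate depends on which optimizer is used. Note that `2e-3` and `.002` denote the same value, so of the two changes only `num_epochs` affects training behavior.

```python
# Hypothetical usage sketch -- the training script is not part of this
# diff; tf.keras SGD is an assumption, not the repo's confirmed choice.
import tensorflow as tf

import hyperparameters as hp

# SGD with momentum matches the `momentum` constant defined alongside
# the learning rate in hyperparameters.py.
optimizer = tf.keras.optimizers.SGD(
    learning_rate=hp.learning_rate,  # .002 (== 2e-3) after this commit
    momentum=hp.momentum,            # 0.01
)

# num_epochs bounds the length of the training loop, e.g.:
# model.fit(train_data, epochs=hp.num_epochs)
```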