| author | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 15:03:42 -0400 |
|---|---|---|
| committer | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 15:03:42 -0400 |
| commit | f46f67c74b19c1db98b30e3f03c166f043079587 (patch) | |
| tree | f064a8c68ad4b18d8706482aa501755fa3895795 /hyperparameters.py | |
| parent | ec6b7241f0f9886a65465bf2fbfa9f854e0fa2fd (diff) | |
hi
Diffstat (limited to 'hyperparameters.py')
| -rw-r--r-- | hyperparameters.py | 6 |
|---|---|---|

1 file changed, 3 insertions, 3 deletions
diff --git a/hyperparameters.py b/hyperparameters.py
index ec424dfa..6c82a745 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -9,17 +9,17 @@
 Number of epochs. If you experiment with more complex networks you
 might need to increase this. Likewise if you add regularization that
 slows training.
 """
-num_epochs = 5000
+num_epochs = 7000
 
 """
 A critical parameter that can dramatically affect whether training
 succeeds or fails. The value for this depends significantly on which
 optimizer is used. Refer to the default learning rate parameter
 """
-learning_rate = 4e-3
+learning_rate = 2e-3
 
 momentum = 0.01
 
 alpha = 1e-2
-beta = 1e2
+beta = 5e1
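For orientation, the docstring's warning that the learning rate "depends significantly on which optimizer is used" is the standard caveat: a step size tuned for plain SGD usually needs retuning for adaptive optimizers such as Adam. Below is a minimal sketch, assuming a plain SGD-with-momentum update, of where `learning_rate` and `momentum` enter the weight update; the `sgd_momentum_step` helper and the NumPy toy usage are illustrative inventions, not code from this repository.

```python
import numpy as np

import hyperparameters as hp  # the module edited in this commit


def sgd_momentum_step(weights, grad, velocity):
    """One SGD-with-momentum update (hypothetical, for illustration).

    velocity <- momentum * velocity - learning_rate * grad
    weights  <- weights + velocity
    """
    velocity = hp.momentum * velocity - hp.learning_rate * grad
    return weights + velocity, velocity


# Toy usage: one parameter vector, one dummy gradient step.
w = np.zeros(3)
v = np.zeros(3)
w, v = sgd_momentum_step(w, grad=np.ones(3), velocity=v)
```

With `momentum` at 0.01 this update behaves almost like vanilla gradient descent, so the commit's change from 4e-3 to 2e-3 effectively halves the step size.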