| author | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 14:42:18 -0400 |
| --- | --- | --- |
| committer | Logan Bauman <logan_bauman@brown.edu> | 2022-05-07 14:42:18 -0400 |
| commit | 26c962e42ca990f741c3667924543265fc38492f (patch) | |
| tree | 8908430f57f27acd3319e220b2e59e4de95a1f1e /hyperparameters.py | |
| parent | aa4999f22143be058cb73c829783bf5f894c7c0f (diff) | |
hi
Diffstat (limited to 'hyperparameters.py')
-rw-r--r-- | hyperparameters.py | 4 |
1 file changed, 2 insertions, 2 deletions
diff --git a/hyperparameters.py b/hyperparameters.py
index 6c82a745..4760afe4 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -16,10 +16,10 @@
 A critical parameter that can dramatically affect whether training
 succeeds or fails. The value for this depends significantly on which
 optimizer is used. Refer to the default learning rate parameter
 """
-learning_rate = 2e-3
+learning_rate = 4e-3
 momentum = 0.01
 alpha = 1e-2
-beta = 5e1
+beta = 1e2
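For context, a minimal sketch of how values from `hyperparameters.py` might be consumed. The repository's training code is not part of this diff, so the PyTorch-style SGD optimizer, `SimpleModel`, and `build_optimizer` below are assumptions for illustration, not the project's actual setup:

```python
# Hypothetical usage sketch (not from this repo): wiring hyperparameters.py
# into a PyTorch SGD optimizer. SimpleModel and build_optimizer are invented
# names for illustration only.
import torch
import torch.nn as nn

import hyperparameters as hp  # provides learning_rate, momentum, alpha, beta


class SimpleModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(10, 1)

    def forward(self, x):
        return self.linear(x)


def build_optimizer(model: nn.Module) -> torch.optim.Optimizer:
    # The docstring in the diff notes that a good learning rate depends on
    # the optimizer; SGD with momentum is one plausible pairing here.
    return torch.optim.SGD(
        model.parameters(),
        lr=hp.learning_rate,   # 4e-3 after this commit
        momentum=hp.momentum,  # 0.01
    )


if __name__ == "__main__":
    model = SimpleModel()
    optimizer = build_optimizer(model)
    print(optimizer)
```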