author | Michael Foiani <sotech117@michaels-mbp-4.devices.brown.edu> | 2022-05-10 01:32:39 -0400
committer | Michael Foiani <sotech117@michaels-mbp-4.devices.brown.edu> | 2022-05-10 01:32:39 -0400
commit | 8b3745de9f8d411c99ecfc0e3c9b63c7b2a7ac71 (patch)
tree | 44325c93b8bd47422b7528a69c19306961a583d0 /hyperparameters.py
parent | c3a8fff5d9465b362214d84d30d9b1212d58722f (diff)
Add final examples, remove unused data, fix some comments. Good to go :) (HEAD, submission)
Diffstat (limited to 'hyperparameters.py')
-rw-r--r-- | hyperparameters.py | 2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/hyperparameters.py b/hyperparameters.py
index 1d4c0e8d..5d2722a7 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -16,7 +16,7 @@ A critical parameter that can dramatically affect whether
 training succeeds or fails. The value for this depends
 significantly on which optimizer is used. Refer to the
 default learning rate parameter
 """
-learning_rate = .002
+learning_rate = .0002
 """
 Beta_1 is the first hyperparameter for the Adam optimizer.
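The change cuts the base learning rate by a factor of ten, and the surrounding docstring notes that a good value depends on the optimizer. As a rough sketch of how a module like this is typically consumed, the snippet below feeds the value into an Adam optimizer; the use of TensorFlow/Keras and every name other than `hyperparameters.learning_rate` is an assumption for illustration, since the repository's training code is not shown in this diff.

```python
# Hypothetical usage sketch -- TensorFlow/Keras and the training-script
# structure are assumptions; only hyperparameters.learning_rate appears
# in the commit itself.
import tensorflow as tf

import hyperparameters as hp

# Adam rescales each parameter's step using running gradient statistics,
# so its stable base learning rate is usually much smaller than plain
# SGD's -- consistent with this commit's drop from .002 to .0002.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=hp.learning_rate,  # .0002 after this commit
    beta_1=0.9,  # first-moment decay; the file documents a Beta_1 below
)
```

A smaller step like 2e-4 is a common default for Adam; with the previous .002, Adam's adaptive scaling can make early updates large enough to destabilize training, which would match the docstring's warning that this parameter "can dramatically affect whether training succeeds or fails."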