author     Benjamin Fiske <bffiske@gmail.com>    2022-05-04 23:33:40 -0400
committer  Benjamin Fiske <bffiske@gmail.com>    2022-05-04 23:33:40 -0400
commit     98be0e58a000880d7e05e79f977452642eab54c6 (patch)
tree       d0edc4eeb39c77f2b3943de868d9a37494b91608 /hyperparameters.py
parent     9d87471579c80d1c8baff6711c1297dec8f0dcf4 (diff)
hp adjustments
Diffstat (limited to 'hyperparameters.py')
-rw-r--r--   hyperparameters.py   4
1 file changed, 2 insertions, 2 deletions
diff --git a/hyperparameters.py b/hyperparameters.py
index 4f264528..a0068dd1 100644
--- a/hyperparameters.py
+++ b/hyperparameters.py
@@ -9,14 +9,14 @@
 Number of epochs. If you experiment with more complex networks you
 might need to increase this. Likewise if you add regularization that
 slows training.
 """
-num_epochs = 10
+num_epochs = 100
 
 """
 A critical parameter that can dramatically affect whether training
 succeeds or fails. The value for this depends significantly on which
 optimizer is used. Refer to the default learning rate parameter
 """
-learning_rate = 1e-4
+learning_rate = 3e-2
 
 momentum = 0.01
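
The docstring in the diff notes that a workable learning rate depends heavily on which optimizer is used. As an illustration only (not part of this commit), below is a minimal sketch of how these values could be fed to an SGD optimizer. The TensorFlow/Keras setup, the toy model, and the MNIST dataset are assumptions for the example; hyperparameters.py is assumed to be importable as a module.

# Illustrative sketch only -- none of this code is part of the commit above.
# Assumes a TensorFlow/Keras setup and that hyperparameters.py is importable.
import tensorflow as tf
import hyperparameters as hp

# Placeholder model; the real repository's model is not shown here.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# SGD consumes both values changed in this diff; a rate as high as 3e-2 can
# work with plain SGD plus a small momentum term, but would likely be far too
# large for Adam, which is why the docstring ties the rate to the optimizer.
optimizer = tf.keras.optimizers.SGD(
    learning_rate=hp.learning_rate, momentum=hp.momentum)

model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# MNIST used purely as a stand-in dataset for the example.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
model.fit(x_train / 255.0, y_train, epochs=hp.num_epochs)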