"""
Homework 5 - CNNs
CS1430 - Computer Vision
Brown University
"""

"""
Number of epochs. If you experiment with more complex networks, or add
regularization that slows training, you might need to increase this.
"""
num_epochs = 1000
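
# Illustrative usage only (not part of this module): num_epochs is meant
# to be read by the training script. The model and dataset names below
# are hypothetical, assuming a Keras-style fit() call:
#
#   import hyperparameters as hp
#   model.fit(train_data, validation_data=val_data, epochs=hp.num_epochs)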

"""
A critical parameter that can dramatically affect whether training
succeeds or fails. The appropriate value depends significantly on which
optimizer is used; its default learning rate is a good starting point.
"""
learning_rate = .002

"""
Beta_1 is the exponential decay rate for the first moment estimates in
the Adam optimizer.
"""
beta_1 = .99

"""
Epsilon is a small constant added to the denominator of the Adam update
for numerical stability.
"""
epsilon = 1e-1
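
# Illustrative sketch only, assuming the model is trained with
# TensorFlow/Keras (an assumption, not dictated by this file): the three
# Adam values above would typically be wired up like this:
#
#   import tensorflow as tf
#   optimizer = tf.keras.optimizers.Adam(
#       learning_rate=learning_rate, beta_1=beta_1, epsilon=epsilon)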

"""
A critical parameter for style transfer: the weight on the content loss,
which determines how much the generated image is influenced by the
CONTENT image.
"""
alpha = .05

"""
A critical parameter for style transfer: the weight on the style loss,
which determines how much the generated image is influenced by the
STYLE image.
"""
beta = 5
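
# Illustrative sketch only: in Gatys-style transfer, alpha and beta weight
# the content and style loss terms (hypothetical names here) into the
# total objective, roughly:
#
#   total_loss = alpha * content_loss + beta * style_loss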