README.md (9 additions, 7 deletions)
@@ -52,16 +52,18 @@ The flags require a value to be passed as the following argument.
 ./net datafile -lr 0.03

 The following flags are available:
--r : read a previously trained network, the name of which is currently configured to be 'lstm_net.net'.
--lr: learning rate that is to be used during training, see the example above.
--it: the number of iterations used for training (not to be confused with epochs).
--mb: mini batch size.
--dl: decrease the learning rate over time, according to lr(n+1) <- lr(n) / (1 + n/value).
--st: number of iterations between how the network is continously stored during training (.json and .net).
+-r  : read a previously trained network, the name of which is currently configured to be 'lstm_net.net'.
+-lr : learning rate to be used during training; see the example above.
+-it : the number of iterations used for training (not to be confused with epochs).
+-mb : mini batch size.
+-dl : decrease the learning rate over time, according to lr(n+1) <- lr(n) / (1 + n/value).
+-st : number of iterations between each time the network is stored during training (.json and .net).
+-out: number of characters to output directly; note that a network and a datafile must be provided.

 Check std_conf.h to see what default values are used, these are set during compilation.

-./net compiled Feb 14 2019 13:41:44
+./net compiled Feb 14 2019 14:41:42
+
 </pre>

 The -st flag is great. By default the network is stored only when the program is interrupted with Ctrl-C, but using this argument you can let the program train and have it store the network continuously during the training process.
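
As a quick illustration of how these flags combine, the invocations below follow the pattern of the documented example (./net datafile -lr 0.03). They are a sketch only: the data file name and every numeric value are made up, and only flags listed above are used.

<pre>
./net datafile -lr 0.01 -it 20000 -mb 100 -st 1000
./net datafile -lr 0.03 -dl 10000
</pre>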
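
The -dl rule quoted above, lr(n+1) <- lr(n) / (1 + n/value), can be read as a per-iteration recurrence. Below is a minimal C sketch of that recurrence, not code from the repository: the starting rate and the decay value are hypothetical, and the actual implementation (see std_conf.h and the training loop) may apply the schedule differently.

<pre>
#include <stdio.h>

/* Sketch of the -dl decay rule as stated in the README:
 * lr(n+1) = lr(n) / (1 + n/value), where n is the iteration index
 * and 'value' is the number passed after -dl. Values below are made up. */
int main(void)
{
    double lr = 0.03;       /* starting learning rate, as in the -lr example */
    double value = 10000.0; /* hypothetical -dl argument */

    for (unsigned n = 0; n < 5; ++n) {
        printf("iteration %u: lr = %.6f\n", n, lr);
        lr = lr / (1.0 + (double)n / value);
    }
    return 0;
}
</pre>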