Aquila
2.0 prealpha
Cognitive Robotics Architecture
This Aquila-compliant module trains a multiple time-scales recurrent neural network (MTRNN) using the back-propagation through time (BPTT) training algorithm.
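For reference, the core operation of an MTRNN at every time step is a leaky integration of each neuron's membrane potential, where different neuron groups (input-output, fast, slow) use different time constants. The sketch below is illustrative only and is not the module's actual code; it assumes the ioDelta/fastDelta/slowDelta parameters described later act as the per-group time constants tau.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // One forward step of a leaky-integrator network: every neuron decays its
    // membrane potential u and integrates the weighted activations y of all
    // neurons, at a speed set by its time constant tau.
    void mtrnnStep(std::vector<double>& u, std::vector<double>& y,
                   const std::vector<double>& w, const std::vector<double>& tau)
    {
        const size_t n = u.size();
        std::vector<double> uNew(n);
        for (size_t i = 0; i < n; ++i)
        {
            double net = 0.0;
            for (size_t j = 0; j < n; ++j)
                net += w[i * n + j] * y[j];              // weighted sum of activations
            uNew[i] = (1.0 - 1.0 / tau[i]) * u[i]        // decay of the old potential
                    + (1.0 / tau[i]) * net;              // integration of the new input
        }
        for (size_t i = 0; i < n; ++i)
        {
            u[i] = uNew[i];
            y[i] = std::tanh(u[i]);                      // squashing nonlinearity
        }
    }

    int main()
    {
        // Tiny 3-neuron demo: one "input-output", one "fast" and one "slow" neuron.
        std::vector<double> u(3, 0.0), y = {0.5, 0.0, 0.0};
        std::vector<double> w = {0.0, 0.1, 0.1,  0.1, 0.0, 0.1,  0.1, 0.1, 0.0};
        std::vector<double> tau = {1.0, 2.0, 70.0};      // illustrative time constants
        for (int t = 0; t < 5; ++t)
            mtrnnStep(u, y, w, tau);
        std::printf("y = %f %f %f\n", y[0], y[1], y[2]);
        return 0;
    }

BPTT then unrolls this recurrence over a whole sequence and propagates the output error backwards through the unrolled steps to compute the weight gradients.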
Launch this module with the --help parameter to see the list of possible options.
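For example (assuming the module binary is called 'mtrnn', as in the example at the end of this document):

    mtrnn --help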
Input ports:
Output ports:
The module needs a training file, which can be set from:
By default, the trained MTRNN is saved to the 'mtrnn.txt' file. This can be changed from:
Default values for various parameters are loaded from a configuration file, which can be modified at /conf/config.ini.
The file consists of a few sections with the following parameters:
output - output file used for saving the trained MTRNN
show_progress - if 1, training progress will be shown; if 0, it will not
maxThreads 256 - maximum number of GPU threads per block
iterations 10000 - maximum number of iterations
ioDelta 1 - delta_t value of the input-output neurons
fastDelta 1 - delta_t value of the fast neurons
slowDelta 1 - delta_t value of the slow neurons
numFastNeurons 10 - number of fast neurons
numSlowNeurons 10 - number of slow neurons
weightRange 0.025 - initial range of the synaptic weights (in this case from -0.025 to 0.025)
threshold 0.0001 - error threshold that, once reached, will cause the training to stop
learningRate 0.005 - learning rate used during the training
momentum 0.0 - momentum used during the training
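As an illustration only, a config.ini using the defaults listed above might look like the sketch below. The '[mtrnn]' section name, the plain 'key value' layout and the show_progress value are assumptions; check the actual /conf/config.ini shipped with the module for the real layout.

    [mtrnn]
    output          mtrnn.txt
    show_progress   1
    maxThreads      256
    iterations      10000
    ioDelta         1
    fastDelta       1
    slowDelta       1
    numFastNeurons  10
    numSlowNeurons  10
    weightRange     0.025
    threshold       0.0001
    learningRate    0.005
    momentum        0.0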
Linux, OSX and Windows.
The following command launches the training on 3 different GPUs (IDs 0, 1 and 2), dividing the sequences provided in 'training_set.txt' between them:
mtrnn --gpu 0 1 2 --trainingSet training_set.txt --iterations 10000 --learningRate 0.005
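How the sequences are shared between the devices is not specified here; the snippet below sketches one plausible scheme (an even, contiguous split of sequence indices across the selected GPU IDs) and is not the module's actual logic. The number of sequences is a hypothetical value.

    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<int> gpuIds = {0, 1, 2};   // from --gpu 0 1 2
        int numSequences = 9;                  // sequences in training_set.txt (hypothetical)

        int perGpu = numSequences / (int)gpuIds.size();
        int remainder = numSequences % (int)gpuIds.size();
        int start = 0;
        for (size_t g = 0; g < gpuIds.size(); ++g)
        {
            // Spread any remainder over the first few GPUs so the split stays even.
            int count = perGpu + ((int)g < remainder ? 1 : 0);
            std::printf("GPU %d trains sequences %d..%d\n",
                        gpuIds[g], start, start + count - 1);
            start += count;
        }
        return 0;
    }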