class MachineLearningWorkbench::NeuralNetwork::Recurrent
Recurrent Neural Network
Public Instance Methods
activate_layer(nlay)
Activates a layer of the network. Slightly more involved than the feed-forward case, since it must copy the layer's activation from the previous time step into its own inputs, implementing the recurrency. @param nlay [Integer] index of the layer to activate, zero-indexed
# File lib/machine_learning_workbench/neural_network/recurrent.rb, line 33
def activate_layer nlay
  # Mark begin and end of recursion outputs in current state
  begin_recur = nneurs(nlay)
  end_recur = nneurs(nlay) + nneurs(nlay+1)
  # Copy the level's last-time activation to the current input recurrency
  state[nlay][begin_recur...end_recur] = state[nlay+1][0...nneurs(nlay+1)]
  # Activate current layer
  act_fn.call state[nlay].dot layers[nlay]
end
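The slice-copy at the heart of the method can be illustrated with plain Ruby arrays. This is a hypothetical sketch, not the gem's code: the layer's input row is assumed to be laid out as [previous activations, recurrent slots, bias], and the neuron counts are invented for the example.

```ruby
# Toy neuron counts per layer (assumption for illustration only)
nneurs = ->(i) { [2, 3][i] }
nlay = 0

state = [
  [0.5, -0.2, 0.0, 0.0, 0.0, 1.0],  # layer 0 input row: 2 inputs, 3 recurrent slots, bias
  [0.9, 0.1, -0.4]                  # this layer's activation from the last time step
]

# Recurrent slots sit right after the previous layer's activations
begin_recur = nneurs.call(nlay)                          # 2
end_recur   = nneurs.call(nlay) + nneurs.call(nlay + 1)  # 5

# Copy last step's activation into the recurrent input slots
state[nlay][begin_recur...end_recur] = state[nlay + 1][0...nneurs.call(nlay + 1)]

p state[0]  # => [0.5, -0.2, 0.9, 0.1, -0.4, 1.0]
```

Note how the bias slot at the end of the row is untouched: only the recurrent slice is overwritten before the layer is activated.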
layer_row_sizes()
Calculates the size of each row in a layer's weight matrix. Each row holds the inputs for the next level: the previous level's activations (or the network inputs), this level's own last activations (recursion), and a bias. @return [Array<Integer>] per-layer row sizes
# File lib/machine_learning_workbench/neural_network/recurrent.rb, line 12
def layer_row_sizes
  @layer_row_sizes ||= struct.each_cons(2).collect do |prev, rec|
    prev + rec + 1
  end
end
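The computation can be checked by hand with a standalone sketch. Here `struct` is assumed to be the per-layer neuron counts (inputs, hidden, outputs), as in the rest of the gem; the specific numbers are illustrative.

```ruby
# Hypothetical network structure: 2 inputs, 3 hidden neurons, 2 outputs
struct = [2, 3, 2]

# Each weight-matrix row feeds the next level: previous level's activations,
# the next level's last activations (recurrency), plus one bias term
layer_row_sizes = struct.each_cons(2).collect { |prev, rec| prev + rec + 1 }

p layer_row_sizes  # => [6, 6]
```

Compare with a feed-forward network, where the `rec` term is absent and the same structure would give row sizes of `[3, 4]` (previous activations plus bias only).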