class MachineLearningWorkbench::Compressor::DecayingLearningRateVQ
VQ with per-centroid decaying learning rates. Optimized for online training.
Attributes
decay_rate[R]
lrate_min[R]
lrate_min_den[R]
Public Class Methods
new(**opts)
Calls superclass method
# File lib/machine_learning_workbench/compressor/decaying_learning_rate_vq.rb, line 10
def initialize **opts
  puts "Ignoring learning rate: `lrate: #{opts[:lrate]}`" if opts[:lrate]
  @lrate_min = opts.delete(:lrate_min) || 0.001
  @lrate_min_den = opts.delete(:lrate_min_den) || 1
  @decay_rate = opts.delete(:decay_rate) || 1
  super **opts.merge({lrate: nil})
end
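A minimal standalone sketch of the option handling above (without the parent VQ class, so `super` is omitted; the helper name `extract_decay_opts` is hypothetical):

```ruby
# Mirrors the initializer: pull the decay-specific options out of the
# keyword hash, falling back to the same defaults as the constructor.
def extract_decay_opts **opts
  {
    lrate_min:     opts.delete(:lrate_min)     || 0.001,
    lrate_min_den: opts.delete(:lrate_min_den) || 1,
    decay_rate:    opts.delete(:decay_rate)    || 1
  }
end

extract_decay_opts(lrate_min: 0.01)
# => {lrate_min: 0.01, lrate_min_den: 1, decay_rate: 1}
```

Note that `delete` removes the consumed keys, so the remaining `opts` can be forwarded untouched to the parent initializer, as the real constructor does.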
Public Instance Methods
check_lrate(lrate)
Overrides the lrate check from the original VQ: a no-op here, since the learning rate is computed per centroid rather than fixed.
# File lib/machine_learning_workbench/compressor/decaying_learning_rate_vq.rb, line 19
def check_lrate lrate; nil; end
lrate(centr_idx, min_den: lrate_min_den, lower_bound: lrate_min, decay: decay_rate)
Decaying per-centroid learning rate.
@param centr_idx [Integer] index of the centroid
@param lower_bound [Float] minimum learning rate
@note nicely overloads the `attr_reader` of the parent class
# File lib/machine_learning_workbench/compressor/decaying_learning_rate_vq.rb, line 25
def lrate centr_idx, min_den: lrate_min_den, lower_bound: lrate_min, decay: decay_rate
  [1.0/(ntrains[centr_idx]*decay+min_den), lower_bound].max
    .tap { |l| puts "centr: #{centr_idx}, ntrains: #{ntrains[centr_idx]}, lrate: #{l}" }
end
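The schedule above can be sketched standalone: the rate is `1/(ntrains*decay + min_den)`, clipped from below at `lower_bound`. Here `ntrains` is passed as a plain integer rather than read from the per-centroid counter, and the defaults mirror the constructor:

```ruby
# Decayed learning rate for a centroid that has been trained
# `ntrains` times, floored at `lower_bound`.
def decayed_lrate ntrains, min_den: 1, lower_bound: 0.001, decay: 1
  [1.0 / (ntrains * decay + min_den), lower_bound].max
end

decayed_lrate(0)      # first update: full rate, 1.0
decayed_lrate(9)      # tenth update: 0.1
decayed_lrate(10_000) # clipped to lower_bound, 0.001
```

With the default `decay: 1` and `min_den: 1` this is a harmonic 1/(n+1) schedule, so each centroid's estimate converges to the running mean of the vectors assigned to it.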
train_one(vec, eps: nil)
Train on one vector.
@return [Integer] index of trained centroid
# File lib/machine_learning_workbench/compressor/decaying_learning_rate_vq.rb, line 32
def train_one vec, eps: nil
  # NOTE: ignores epsilon if passed
  trg_idx, _simil = most_similar_centr(vec)
  # norm_vec = vec / NLinalg.norm(vec)
  # centrs[trg_idx, true] = centrs[trg_idx, true] * (1-lrate(trg_idx)) + norm_vec * lrate(trg_idx)
  centrs[trg_idx, true] = centrs[trg_idx, true] * (1-lrate(trg_idx)) + vec * lrate(trg_idx)
  trg_idx
end
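The centroid update in `train_one` is a convex combination of the old centroid and the incoming vector. A sketch of just that step, using plain Ruby arrays in place of the gem's numeric arrays (the helper name `update_centroid` is hypothetical):

```ruby
# Move `centroid` toward `vec` by fraction `lrate`:
# new = old * (1 - lrate) + vec * lrate, element-wise.
def update_centroid centroid, vec, lrate
  centroid.zip(vec).map { |c, v| c * (1 - lrate) + v * lrate }
end

update_centroid([0.0, 0.0], [1.0, 2.0], 0.5)  # => [0.5, 1.0]
```

With `lrate` near 1 (a rarely-trained centroid) the centroid jumps almost onto the new vector; as its training count grows the rate decays and the centroid stabilizes.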