class MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::Base
Natural Evolution Strategies base class
Attributes
Public Class Methods
NES object initialization
@param ndims [Integer] number of parameters to optimize
@param obj_fn [#call] any object defining a call method (Proc, lambda, custom class)
@param opt_type [:min, :max] select minimization / maximization of obj_fn
@param rseed [Integer] allows for deterministic execution when provided
@param mu_init [Numeric] value(s) used to initialize the distribution's mean
@param sigma_init [Numeric] value(s) used to initialize the distribution's covariance
@param parallel_fit [boolean] whether the `obj_fn` should be passed all the individuals together. In the canonical case the fitness function scores a single individual; in practical cases, though, it is often easier to delegate the scoring parallelization to the external fitness function. Setting this to `true` makes the algorithm pass _an Array_ of individuals to the fitness function, rather than a single instance.
@param rescale_popsize [Float] scaling for the default population size
@param rescale_lrate [Float] scaling for the default learning rate
@param utilities [NArray] override the default (CMA-ES) utilities; requires `popsize` to be set as well
@param popsize [Integer] override the default population size
@param lrate [Float] override the default learning rate
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 23
def initialize ndims, obj_fn, opt_type, rseed: nil, mu_init: 0, sigma_init: 1,
    parallel_fit: false, rescale_popsize: 1, rescale_lrate: 1, utilities: nil,
    popsize: nil, lrate: nil
  raise ArgumentError, "opt_type: #{opt_type}" unless [:min, :max].include? opt_type
  raise ArgumentError, "obj_fn not callable: #{obj_fn}" unless obj_fn.respond_to? :call
  raise ArgumentError, "utilities only if popsize" if utilities && popsize.nil?
  raise ArgumentError, "wrong sizes" if utilities && utilities.size != popsize
  raise ArgumentError, "minimum popsize 5 for default utilities" if popsize&.<(5) && utilities.nil?
  @ndims, @opt_type, @obj_fn, @parallel_fit = ndims, opt_type, obj_fn, parallel_fit
  @rescale_popsize, @rescale_lrate = rescale_popsize, rescale_lrate # rescale defaults
  @utilities, @popsize, @lrate = utilities, popsize, lrate # if not set, defaults below
  @eye = NArray.eye(ndims)
  rseed ||= Random.new_seed
  # puts "NES rseed: #{s}" # currently disabled
  @rng = Random.new rseed
  @best = [(opt_type==:max ? -1 : 1) * Float::INFINITY, nil]
  @last_fits = []
  initialize_distribution mu_init: mu_init, sigma_init: sigma_init
end
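A minimal construction sketch (not taken from the library docs), assuming a concrete subclass such as XNES, since Base defers initialize_distribution to its subclasses and is not instantiated directly; the sphere objective and seed are illustrative:

# Sphere function: the fitness of one individual is its squared norm
obj_fn = ->(ind) { (ind**2).sum }
nes = MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::XNES.new(
  5, obj_fn, :min, rseed: 1)
# With parallel_fit: true the objective would instead receive all individuals of a
# generation at once and must return their fitnesses in the same order.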
Public Instance Methods
(see cmaes_utilities)
@return [Float] learning rate lower bound
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 77
def cmaes_lrate
  (3+Math.log(ndims)) / (5*Math.sqrt(ndims))
end
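Evaluating the formula above for a mid-sized problem, e.g. ndims = 100 (values rounded):

ndims = 100
(3 + Math.log(ndims)) / (5 * Math.sqrt(ndims))  # => ~0.152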
(see cmaes_utilities)
@return [Integer] population size lower bound
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 83
def cmaes_popsize
  [5, 4 + (3*Math.log(ndims)).floor].max
end
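The same kind of check for the default population size:

ndims = 100
[5, 4 + (3 * Math.log(ndims)).floor].max  # => 17
# For very small problems the lower bound kicks in: ndims = 1 gives max(5, 4) => 5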
Magic numbers from CMA-ES (see `README` for citation)
@return [NArray] scale-invariant utilities
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 62
def cmaes_utilities
  # Algorithm equations are meant for fitness maximization
  # Match utilities with individuals sorted by INCREASING fitness
  raise ArgumentError, "Minimum `popsize` should be 5 (is #{popsize})" if popsize < 5
  log_range = (1..popsize).collect do |v|
    [0, Math.log(popsize.to_f/2 - 1) - Math.log(v)].max
  end
  total = log_range.reduce(:+)
  buf = 1.0/popsize
  vals = log_range.collect { |v| v / total - buf }.reverse
  NArray[vals]
end
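A standalone re-run of the same computation for the minimum popsize of 5 (values rounded, shown here only as an illustration) makes the zero-sum, rank-based shape of the utilities visible, with all positive weight on the single best individual:

popsize = 5
log_range = (1..popsize).collect { |v| [0, Math.log(popsize.to_f/2 - 1) - Math.log(v)].max }
# => [0.405, 0.0, 0.0, 0.0, 0.0]
total = log_range.reduce(:+)
log_range.collect { |v| v / total - 1.0/popsize }.reverse
# => [-0.2, -0.2, -0.2, -0.2, 0.8]  (matched to individuals sorted by increasing fitness)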
(see utils)
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 58
def lrate; @lrate ||= cmaes_lrate * rescale_lrate end
Move standard normal samples to the current distribution
@return [NArray] individuals
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 99
def move_inds inds
  # TODO: can we reduce the transpositions?
  # multi_mu = NMatrix[*inds.rows.times.collect {mu.to_a}, dtype: dtype].transpose
  # (multi_mu + sigma.dot(inds.transpose)).transpose
  mu_tile = mu.tile(inds.shape.first, 1).transpose
  (mu_tile + sigma.dot(inds.transpose)).transpose
end
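In matrix form this is x = mu + sigma * z for every standard normal sample z. A minimal Numo sketch of the equivalent row-wise map, assuming mu is a length-ndims vector, sigma is an [ndims, ndims] scale matrix, and the samples have shape [popsize, ndims]:

require 'numo/narray'
mu    = Numo::DFloat[0.0, 10.0]                # distribution mean
sigma = Numo::DFloat[[1.0, 0.0], [0.0, 0.5]]   # distribution scale matrix
z     = Numo::DFloat[[1.0, 1.0], [-2.0, 0.0]]  # two standard normal samples
z.dot(sigma.transpose) + mu                    # each row becomes mu + sigma.dot(z_row)
# => [[1.0, 10.5], [-2.0, 10.0]]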
(see utils)
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 56
def popsize; @popsize ||= Integer(cmaes_popsize * rescale_popsize) end
Sorted individuals
NOTE: Algorithm equations are meant for fitness maximization. Utilities need to be matched with individuals sorted by INCREASING fitness; the order is then reversed for minimization.
@return standard normal samples sorted by the respective individuals' fitnesses
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 113
def sorted_inds
  # Xumo::NArray implements the Box-Muller, but no random seed (yet)
  samples = standard_normal_samples
  # samples = NArray.new([popsize, ndims]).rand_norm(0,1)
  inds = move_inds(samples)
  fits = parallel_fit ? obj_fn.call(inds) : inds.map(&obj_fn)
  # Quick cure for NaN fitnesses: replace them with the worst possible fitness
  fits = fits.map { |x| x.nan? ? (opt_type==:max ? -1 : 1) * Float::INFINITY : x }
  @last_fits = fits # allows checking for stagnation
  # sorted = [fits.to_a, inds, samples.to_a].transpose.sort_by(&:first)
  # sorted.reverse! if opt_type==:min
  # this_best = sorted.last.take(2)
  # NArray[*sorted.map(&:last)]
  # BUG IN NARRAY SORT!! ruby-numo/numo-narray#97
  # sort_idxs = fits.sort_index
  sort_idxs = fits.size.times.sort_by { |i| fits[i] }.to_na
  sort_idxs = sort_idxs.reverse if opt_type == :min
  this_best = [fits[sort_idxs[-1]], inds[sort_idxs[-1], true]]
  opt_cmp_fn = opt_type==:min ? :< : :>
  @best = this_best if this_best.first.send(opt_cmp_fn, best.first)
  samples[sort_idxs, true]
end
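The sort_by workaround above simply builds the permutation of indices that orders the fitnesses; a standalone illustration on plain Ruby arrays (maximization case):

fits = [3.0, 1.0, 2.0]
sort_idxs = fits.size.times.sort_by { |i| fits[i] }  # => [1, 2, 0], worst to best
# samples[sort_idxs, true] then returns the samples in that order, matching the
# increasing-fitness ordering expected by the utilities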
Box-Muller transform: generates standard (unit) normal distribution samples
@return [Float] a single sample from a standard normal distribution
@note Xumo::NArray implements this but no random seed selection yet
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 44
def standard_normal_sample
  rho = Math.sqrt(-2.0 * Math.log(rng.rand))
  theta = 2 * Math::PI * rng.rand
  tfn = rng.rand > 0.5 ? :cos : :sin
  rho * Math.send(tfn, theta)
end
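A quick sanity check of the transform, written as a standalone sketch rather than library code: many samples drawn with the same formula should have mean close to 0 and variance close to 1:

rng = Random.new(42)
samples = Array.new(100_000) do
  rho   = Math.sqrt(-2.0 * Math.log(rng.rand))
  theta = 2 * Math::PI * rng.rand
  rho * Math.send(rng.rand > 0.5 ? :cos : :sin, theta)
end
mean = samples.sum / samples.size
var  = samples.sum { |x| (x - mean)**2 } / samples.size
# mean ≈ 0.0, var ≈ 1.0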
Samples a standard normal distribution to construct an NArray of popsize multivariate samples of length ndims
@return [NArray] standard normal samples
@note Xumo::NArray implements this but no random seed selection yet
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 91
def standard_normal_samples
  NArray.zeros([popsize, ndims]).tap do |ret|
    ret.each_with_index { |_,*i| ret[*i] = standard_normal_sample }
  end
end
Memoized automatic magic numbers. Initialization options allow rescaling them or overriding them entirely.
NOTE: Doubling popsize and halving lrate often helps
# File lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb, line 54
def utils; @utilities ||= cmaes_utilities end
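For example, the doubling/halving heuristic noted above can be applied through the constructor's rescale options rather than by overriding the values directly (XNES used here as an assumed concrete subclass, obj_fn as any callable objective):

nes = MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::XNES.new(
  5, obj_fn, :min, rescale_popsize: 2, rescale_lrate: 0.5)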