class CooCoo::CostFunctions::SoftMaxCrossEntropy
Combines a SoftMax activation with CrossEntropy. Due to the math, this is more efficient than having a SoftMax layer and computing CrossEntropy separately: the combined derivative with respect to the inputs simplifies to the difference between the output and the target, avoiding the full SoftMax Jacobian.
@see peterroelants.github.io/posts/neural_network_implementation_intermezzo02/
Public Class Methods
call(target, x)
Calls superclass method
CooCoo::CostFunctions::CrossEntropy::call
# File lib/coo-coo/cost_functions.rb, line 82
def self.call(target, x)
  super(target, ActivationFunctions::ShiftedSoftMax.call(x))
end
derivative(target, x)
# File lib/coo-coo/cost_functions.rb, line 86
def self.derivative(target, x)
  x - target
end
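The simplification behind derivative can be checked numerically. Below is a minimal sketch in plain Ruby (not the CooCoo API; softmax, cross_entropy, and the test values are illustrative assumptions) comparing the analytic gradient softmax(z) - target against a central-difference estimate of the cross-entropy-of-softmax loss:

```ruby
# Plain-Ruby sketch, independent of CooCoo: verify that the gradient of
# CrossEntropy(target, SoftMax(z)) with respect to z is softmax(z) - target.

def softmax(z)
  m = z.max                               # shift for numerical stability
  e = z.map { |v| Math.exp(v - m) }
  s = e.sum
  e.map { |v| v / s }
end

def cross_entropy(target, probs)
  -target.zip(probs).sum { |t, p| t * Math.log(p) }
end

z      = [1.0, 2.0, 0.5]                  # example pre-softmax inputs
target = [0.0, 1.0, 0.0]                  # one-hot label (assumed)

# Analytic gradient from the combined SoftMax + CrossEntropy derivative:
analytic = softmax(z).zip(target).map { |p, t| p - t }

# Numerical gradient by central differences:
eps = 1e-6
numeric = z.each_index.map do |i|
  zp = z.dup; zp[i] += eps
  zm = z.dup; zm[i] -= eps
  (cross_entropy(target, softmax(zp)) -
   cross_entropy(target, softmax(zm))) / (2 * eps)
end

analytic.zip(numeric).each do |a, n|
  raise "gradient mismatch" unless (a - n).abs < 1e-5
end
puts "analytic and numeric gradients agree"
```

This is why the class's derivative never calls ShiftedSoftMax at all: once the two operations are fused, the Jacobian terms cancel and only the subtraction remains.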