class Chainer::Functions::Activation::ReLUGrad2
Computes the gradient of the ReLU function.
This function takes two variables b and c and computes f(b, c) = sign(b) * c with backpropagation, where operations are done in an elementwise manner and sign(x) = 1 when x > 0 and 0 otherwise. As the gradient of f with respect to b is 0, we do not backpropagate errors toward b for computational efficiency.
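As an illustration of the rule above, the following sketch evaluates f(b, c) = sign(b) * c on raw Numo arrays; the values are made up and the Chainer variable wrapping used by the library is omitted:

  require 'numo/narray'

  b = Numo::DFloat[-1.0, 0.0, 2.0]
  c = Numo::DFloat[10.0, 20.0, 30.0]

  # sign(b) is 1 where b > 0 and 0 otherwise, so f keeps c only where b is positive.
  mask = (b > 0).cast_to(Numo::DFloat)  # => Numo::DFloat[0, 0, 1]
  f = mask * c                          # => Numo::DFloat[0, 0, 30]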
Public Class Methods
new(b)
# File lib/chainer/functions/activation/relu_grad2.rb, line 13
def initialize(b)
  @b = b.data
end
Public Instance Methods
backward(indexes, gy)
# File lib/chainer/functions/activation/relu_grad2.rb, line 22
def backward(indexes, gy)
  [gy[0] * heaviside(@b)]
end
forward(inputs)
# File lib/chainer/functions/activation/relu_grad2.rb, line 17
def forward(inputs)
  y = inputs[0] * (@b > 0)
  [Utils::Array.force_array(y, y.class)]
end
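For concreteness, the sketch below mirrors what forward computes on raw Numo arrays; the values and local names are made up, with gy standing in for inputs[0] (the array c) and b for the array stored at construction:

  require 'numo/narray'

  b  = Numo::DFloat[0.0, 2.0, -1.0]   # stands in for @b
  gy = Numo::DFloat[0.1, 0.2, 0.3]    # stands in for inputs[0]

  # Entries of gy survive only where b is positive, matching y = inputs[0] * (@b > 0);
  # the boolean mask is cast to a float array here for clarity.
  y = gy * (b > 0).cast_to(gy.class)  # => Numo::DFloat[0, 0.2, 0]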
Private Instance Methods
heaviside(x)
# File lib/chainer/functions/activation/relu_grad2.rb, line 28
def heaviside(x)
  (x > 0).cast_to(x.class)
end
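A comparison such as x > 0 yields a Numo::Bit mask; cast_to converts it to the numeric class of x so that backward can multiply it with gy[0]. A small sketch with made-up values:

  require 'numo/narray'

  x = Numo::SFloat[-0.5, 0.0, 1.5]
  (x > 0)                   # => Numo::Bit[0, 0, 1]
  (x > 0).cast_to(x.class)  # => Numo::SFloat[0, 0, 1]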