class Chainer::Functions::Activation::Tanh

Hyperbolic tangent function.

Public Class Methods

tanh(x)

Elementwise hyperbolic tangent function.

$$ f(x)=\tanh(x). $$

@param [Chainer::Variable or Numo::NArray or Cumo::NArray] x Input variable. A $(s_1, s_2, …, s_N)$-shaped float array.
@return [Chainer::Variable] Output variable. A $(s_1, s_2, …, s_N)$-shaped float array.
@example

> x = Numo::SFloat.new(3).seq(-1, 2)
=> Numo::SFloat#shape=[3]
[-1, 1, 3]
> F = Chainer::Functions::Activation::Tanh
> F.tanh(x).data
=> Numo::SFloat#shape=[3]
[-0.761594, 0.761594, 0.995055]
# File lib/chainer/functions/activation/tanh.rb, line 23
def self.tanh(x)
  self.new.apply([x]).first
end
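
Gradients flow through tanh like any other differentiable function. The following is a minimal sketch of the usual Chainer::Variable backprop workflow (assuming Variable#grad= and Variable#backward behave here as elsewhere in red-chainer; it reuses the same input values and the F shorthand from the example above, and the printed values are approximate):

> x = Chainer::Variable.new(Numo::SFloat.new(3).seq(-1, 2))
> y = F.tanh(x)
> y.grad = Numo::SFloat.ones(3)
> y.backward
> x.grad
=> Numo::SFloat#shape=[3]
[0.419974, 0.419974, 0.00986604]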

Public Instance Methods

backward(indexes, grad_outputs)
# File lib/chainer/functions/activation/tanh.rb, line 35
def backward(indexes, grad_outputs)
  # When cuDNN was used in forward, TanhGrad also needs the original
  # input x; otherwise x is nil and the retained output y suffices.
  if @use_cudnn
    x = get_retained_inputs.first.data
  else
    x = nil
  end

  y = get_retained_outputs.first
  gy = grad_outputs.first
  TanhGrad.new(x).apply([y, gy])
end
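
The gradient follows from the identity $\tanh'(x) = 1 - \tanh(x)^2 = 1 - y^2$, which is why the retained output y is enough and the input does not need to be re-read (outside the cuDNN path). A standalone sketch of the quantity TanhGrad propagates, written directly against Numo (hypothetical helper code, not the TanhGrad class itself):

require 'numo/narray'

x  = Numo::SFloat.new(3).seq(-1, 2)
y  = Numo::NMath.tanh(x)
gy = Numo::SFloat.ones(3)     # upstream gradient
gx = gy * (1 - y**2)          # gradient tanh backward propagates
p gx                          # ~ [0.419974, 0.419974, 0.00986604]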
forward(x)
# File lib/chainer/functions/activation/tanh.rb, line 27
def forward(x)
  # Dispatch to Numo (CPU) or Cumo (GPU) depending on the input array.
  xm = Chainer.get_array_module(x[0])
  y = Utils::Array.force_array(xm::NMath.tanh(x[0]))
  # Retain the output so backward can reuse y instead of recomputing tanh.
  retain_outputs([0])
  @use_cudnn = false
  [y]
end
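
Chainer.get_array_module is what lets the same forward body run on both backends: it returns the Numo module family for CPU arrays and Cumo for GPU arrays, so xm::NMath.tanh resolves to the right implementation. A small sketch of that dispatch (assuming only Numo is installed):

require 'chainer'

xm = Chainer.get_array_module(Numo::SFloat[0.0, 1.0])
p xm                                      # => Numo for a CPU-side array
p xm::NMath.tanh(Numo::SFloat[0.0, 1.0])  # ~ [0, 0.761594]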