Activation
Activation operators behave like unary operators.
A few notes:
- The `inputTensors` and `outputTensors` should contain the same number of tensors.
- Corresponding input and output tensors should have the same shape.
All activation operators can be initialized in the same way:

```swift
let op = SomeActivationOp()
```
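For instance, using the initializers from the list below (every attribute has a default value, so the arguments may be omitted):

```swift
let relu = ReLUOperator()               // same as ReLUOperator(alpha: 0.0)
let elu = ELUOperator(alpha: 1.0)
let softmax = SoftmaxOperator(dim: -1)  // -1 selects the last dim
```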
List
ReLUOperator
\(y = max(x, alpha)\)
Initial: `ReLUOperator(alpha: 0.0)`
- `alpha`: `Float`, default value is `0.0`
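A plain-Swift sketch of the formula above, with a hypothetical `relu` helper standing in for the operator's actual kernel:

```swift
// y = max(x, alpha); with the default alpha = 0.0 this is standard ReLU,
// while a positive alpha acts as a floor on the output.
func relu(_ x: Float, alpha: Float = 0.0) -> Float {
    return max(x, alpha)
}

relu(2.5)               // 2.5
relu(-1.0)              // 0.0
relu(-1.0, alpha: 0.5)  // 0.5
```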
SigmoidOperator
\(y = 1 / (1 + e^{-x})\)
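A one-line sketch of the formula, assuming a hypothetical `sigmoid` helper:

```swift
import Foundation

// y = 1 / (1 + e^(-x)); maps any real input into (0, 1).
func sigmoid(_ x: Float) -> Float {
    return 1.0 / (1.0 + exp(-x))
}

sigmoid(0.0)  // 0.5
```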
SoftplusOperator
\(y = log_{e}(e^x + 1)\)
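A sketch of the same formula as a hypothetical `softplus` helper:

```swift
import Foundation

// y = ln(e^x + 1); a smooth, always-positive approximation of ReLU.
func softplus(_ x: Float) -> Float {
    return log(exp(x) + 1.0)
}

softplus(0.0)  // ln 2 ≈ 0.6931
```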
SoftsignOperator
\(y = x / (1 + abs(x))\)
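A sketch as a hypothetical `softsign` helper:

```swift
// y = x / (1 + |x|); output is bounded in (-1, 1).
func softsign(_ x: Float) -> Float {
    return x / (1.0 + abs(x))
}

softsign(3.0)   //  0.75
softsign(-3.0)  // -0.75
```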
LinearOperator
\(y = x\)
ELUOperator
\[\begin{equation} y= \begin{cases} x, & \text{if}\ x>0 \\ alpha \times (e^{x} - 1), & \text{otherwise} \end{cases} \end{equation}\]

Initial: `ELUOperator(alpha: 1.0)`
- `alpha`: `Float`, default value is `1.0`
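A piecewise sketch of the formula above, using a hypothetical `elu` helper:

```swift
import Foundation

// Identity for positive inputs; alpha * (e^x - 1) otherwise,
// saturating towards -alpha for large negative x.
func elu(_ x: Float, alpha: Float = 1.0) -> Float {
    return x > 0 ? x : alpha * (exp(x) - 1.0)
}

elu(1.0)   //  1.0
elu(-1.0)  // ≈ -0.6321
```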
SELUOperator
\(y = scale \times ELU(x)\)
Initial: `SELUOperator(alpha: 1.673263, scale: 1.050701)`
- `alpha`: `Float`, default value is `1.673263`
- `scale`: `Float`, default value is `1.050701`
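A sketch combining the two defaults above into a hypothetical `selu` helper:

```swift
import Foundation

// SELU = scale * ELU(x) with the fixed defaults listed above.
func selu(_ x: Float,
          alpha: Float = 1.673263,
          scale: Float = 1.050701) -> Float {
    let e = x > 0 ? x : alpha * (exp(x) - 1.0)
    return scale * e
}

selu(1.0)   // ≈  1.0507
selu(-1.0)  // ≈ -1.1113
```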
SoftmaxOperator
\(y = e^{x} / \text{reduce\_sum}(e^{x}, \text{dim})\)
Initial: `SoftmaxOperator(dim: -1)`
- `dim`: `Int`, the dimension along which the sum is reduced. The value should be `>= 0`. Any negative value will automatically be set to `-1`; `-1` is a special value indicating the last dim.
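A plain-Swift sketch of the last-dim (`dim = -1`) case over a 1-D slice, using a hypothetical `softmax` helper:

```swift
import Foundation

// y_i = e^(x_i) / Σ_j e^(x_j); outputs are positive and sum to 1.
func softmax(_ x: [Float]) -> [Float] {
    let exps = x.map { exp($0) }
    let sum = exps.reduce(0, +)
    return exps.map { $0 / sum }
}

softmax([1.0, 2.0, 3.0])  // ≈ [0.0900, 0.2447, 0.6652]
```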
LeakyReLUOperator
\[\begin{equation} y= \begin{cases} alpha \times x, & \text{if}\ x < 0 \\ x, & \text{otherwise} \end{cases} \end{equation}\]

Initial: `LeakyReLUOperator(alpha: 0.3)`
- `alpha`: `Float`, default value is `0.3`
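A piecewise sketch as a hypothetical `leakyReLU` helper:

```swift
// Scales negative inputs by alpha instead of zeroing them.
func leakyReLU(_ x: Float, alpha: Float = 0.3) -> Float {
    return x < 0 ? alpha * x : x
}

leakyReLU(2.0)   //  2.0
leakyReLU(-2.0)  // -0.6
```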
ThresholdedReLUOperator
\[\begin{equation} y= \begin{cases} x, & \text{if}\ x > \text{alpha} \\ 0, & \text{otherwise} \end{cases} \end{equation}\]

Initial: `ThresholdedReLUOperator(alpha: 1.0)`
- `alpha`: `Float`, default value is `1.0`
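A piecewise sketch as a hypothetical `thresholdedReLU` helper:

```swift
// Passes x through only when it exceeds alpha; zero otherwise.
func thresholdedReLU(_ x: Float, alpha: Float = 1.0) -> Float {
    return x > alpha ? x : 0.0
}

thresholdedReLU(2.0)  // 2.0
thresholdedReLU(0.5)  // 0.0
```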