# Activation

Activation operators behave like unary operators.

A few notes:

• The inputTensors and outputTensors should contain the same number of tensors.
• Corresponding input and output tensors should have the same shape.

All activation operators can be initialized the same way, since every attribute has a default value:

```swift
let op = SomeActivationOp()
```

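As a rough mental model (plain Swift, not this framework's API; `elementwise` is a hypothetical helper), an activation maps one scalar function over every element of each input, which is why the tensor counts and shapes must line up:

```swift
// Illustration only: an activation applies a scalar function elementwise,
// so each output keeps the shape of its corresponding input.
func elementwise(_ inputs: [[Float]], _ f: (Float) -> Float) -> [[Float]] {
    return inputs.map { input in input.map(f) }
}
```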

## List

#### ReLUOperator

$$y = \max(x, \text{alpha})$$

Initial: `ReLUOperator(alpha: 0.0)`

• alpha: Float, default value is 0.0
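
A plain-Swift sketch of this formula (illustrative only, not the operator's implementation):

```swift
// y = max(x, alpha); with the default alpha = 0.0 this is the standard ReLU.
func relu(_ x: Float, alpha: Float = 0.0) -> Float {
    return max(x, alpha)
}
// relu(-2.0) == 0.0; relu(3.0) == 3.0
```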

#### SigmoidOperator

$$y = \frac{1}{1 + e^{-x}}$$
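
A plain-Swift sketch of this formula (illustrative only):

```swift
import Foundation

// Logistic sigmoid: squashes any input into (0, 1).
func sigmoid(_ x: Float) -> Float {
    return 1.0 / (1.0 + exp(-x))
}
// sigmoid(0.0) == 0.5
```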

#### SoftplusOperator

$$y = \ln(e^{x} + 1)$$
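
A plain-Swift sketch of this formula (illustrative only):

```swift
import Foundation

// Softplus: a smooth approximation of ReLU.
func softplus(_ x: Float) -> Float {
    return log(exp(x) + 1.0)
}
// softplus(0.0) == log(2.0)
```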

#### SoftsignOperator

$$y = \frac{x}{1 + |x|}$$
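
A plain-Swift sketch of this formula (illustrative only):

```swift
// Softsign: output always lies in (-1, 1).
func softsign(_ x: Float) -> Float {
    return x / (1.0 + abs(x))
}
```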

#### LinearOperator

$$y = x$$

#### ELUOperator

$$y = \begin{cases} x, & \text{if}\ x > 0 \\ \text{alpha} \times (e^{x} - 1), & \text{otherwise} \end{cases}$$

Initial: `ELUOperator(alpha: 1.0)`

• alpha: Float, default value is 1.0
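
A plain-Swift sketch of this piecewise definition (illustrative only):

```swift
import Foundation

// ELU: identity for positive inputs, smooth exponential curve below zero.
func elu(_ x: Float, alpha: Float = 1.0) -> Float {
    return x > 0 ? x : alpha * (exp(x) - 1.0)
}
```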

#### SELUOperator

$$y = \text{scale} \times \text{ELU}(x)$$

Initial: `SELUOperator(alpha: 1.673263, scale: 1.050701)`

• alpha: Float, default value is 1.673263
• scale: Float, default value is 1.050701
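
A plain-Swift sketch composing the ELU formula above with the default constants (illustrative only):

```swift
import Foundation

// SELU: scale * ELU(x), using the listed default alpha and scale.
func selu(_ x: Float, alpha: Float = 1.673263, scale: Float = 1.050701) -> Float {
    let eluValue = x > 0 ? x : alpha * (exp(x) - 1.0)
    return scale * eluValue
}
```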

#### SoftmaxOperator

$$y = e^{x} / \text{reduce\_sum}(e^{x}, \text{dim})$$

Initial: `SoftmaxOperator(dim: -1)`

• dim: Int, the dimension along which the sum is reduced. The value should be >= 0; any negative value is automatically set to -1, a special value indicating the last dimension.
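
A 1-D plain-Swift sketch of this formula (illustrative only; for higher-rank inputs the operator reduces along `dim` instead of the single axis shown here):

```swift
import Foundation

// Softmax: exponentiate, then normalize by the reduced sum,
// so the outputs form a probability distribution.
func softmax(_ xs: [Float]) -> [Float] {
    let exps = xs.map { exp($0) }
    let total = exps.reduce(0, +)
    return exps.map { $0 / total }
}
// softmax([1.0, 2.0, 3.0]).reduce(0, +) == 1.0 (up to rounding)
```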

#### LeakyReLUOperator

$$y = \begin{cases} \text{alpha} \times x, & \text{if}\ x < 0 \\ x, & \text{otherwise} \end{cases}$$

Initial: `LeakyReLUOperator(alpha: 0.3)`

• alpha: Float, default value is 0.3
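
A plain-Swift sketch of this piecewise definition (illustrative only):

```swift
// Leaky ReLU: keeps a small slope (alpha) for negative inputs.
func leakyReLU(_ x: Float, alpha: Float = 0.3) -> Float {
    return x < 0 ? alpha * x : x
}
```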

#### ThresholdedReLUOperator

$$y = \begin{cases} x, & \text{if}\ x > \text{alpha} \\ 0, & \text{otherwise} \end{cases}$$

Initial: `ThresholdedReLUOperator(alpha: 1.0)`

• alpha: Float, default value is 1.0
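
A plain-Swift sketch of this piecewise definition (illustrative only):

```swift
// Thresholded ReLU: x passes through only when it exceeds alpha.
func thresholdedReLU(_ x: Float, alpha: Float = 1.0) -> Float {
    return x > alpha ? x : 0.0
}
```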