ELUOperator
public class ELUOperator: ActivationOperator
Exponential Linear Units.
y = x if x > 0, else y = alpha*(exp(x) - 1)
Reference
Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)
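The forward mapping above can be sketched in plain Swift. This is an illustrative, standalone function (the name eluForward is hypothetical and not part of the operator's API):
Swift
import Foundation

// Hypothetical sketch of the ELU forward mapping:
// y = x if x > 0, else y = alpha * (exp(x) - 1)
func eluForward(_ input: [Float], alpha: Float = 1.0) -> [Float] {
    return input.map { x in
        x > 0 ? x : alpha * (expf(x) - 1.0)
    }
}

// Example: eluForward([-2.0, 0.0, 3.0]) ≈ [-0.8647, 0.0, 3.0]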
-
The alpha hyperparameter; default is 1.0.
Declaration
Swift
public var alpha: Float = 1.0
-
Convenience initializer. Subclasses are required to override this function to assign cpuComputeBlock and metalKernelFuncLabel.
Declaration
Swift
required public convenience init(computationDelegate: OperatorCalculationDelegate? = nil)
Parameters
computationDelegate
The operator's computation delegate; defaults to nil.
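Since computationDelegate defaults to nil, the operator can be created without arguments. A brief usage sketch (assuming the Serrano module is imported):
Swift
import Serrano

// Create an ELUOperator with no computation delegate; alpha keeps its default of 1.0.
let elu = ELUOperator()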
-
Initialize by setting the alpha value.
Declaration
Swift
public convenience init(computationDelegate: OperatorCalculationDelegate? = nil, alpha: Float)
Parameters
computationDelegate
The operator's computation delegate; defaults to nil.
alpha
Value for the alpha hyperparameter.
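A brief usage sketch of this initializer with a custom alpha (assuming the Serrano module is imported):
Swift
import Serrano

// Create an ELUOperator with alpha = 0.5 and no computation delegate.
let elu = ELUOperator(computationDelegate: nil, alpha: 0.5)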
-
Attribute alpha as a ScalarSymbol.
Declaration
Swift
public override func paramSymbols() -> [GraphSymbol]
Return Value
Array of GraphSymbol
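A brief usage sketch; the returned array is expected to contain the ScalarSymbol wrapping alpha (assuming the Serrano module is imported):
Swift
import Serrano

let elu = ELUOperator(computationDelegate: nil, alpha: 0.5)
// paramSymbols() returns the operator's parameter symbols; here, alpha as a ScalarSymbol.
let symbols: [GraphSymbol] = elu.paramSymbols()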
-
Override CPU calculation.
Declaration
Swift
internal override func cpu()
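The operator's actual cpuComputeBlock is not reproduced here; the standalone sketch below only illustrates the kind of in-place, element-wise pass a CPU implementation of ELU performs over a flat buffer of Float values:
Swift
import Foundation

// Illustrative element-wise CPU pass (not the library's compute block).
// Applies y = x > 0 ? x : alpha * (exp(x) - 1) in place.
func eluCPUInPlace(_ buffer: UnsafeMutableBufferPointer<Float>, alpha: Float) {
    for i in buffer.indices {
        let x = buffer[i]
        buffer[i] = x > 0 ? x : alpha * (expf(x) - 1.0)
    }
}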
-
Override GPU calculation because there is a hyperparameter to pass in.
Declaration
Swift
internal override func gpu()
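The framework's Metal dispatch path is not shown here. The sketch below only illustrates one common way a scalar hyperparameter such as alpha can be handed to a Metal compute kernel, via MTLComputeCommandEncoder.setBytes; the buffer index and the surrounding pipeline setup are assumptions:
Swift
import Metal

// Illustrative only: bind a scalar hyperparameter before dispatching a kernel.
// The buffer index (2) is an assumption, not the library's actual kernel layout.
func bindELUAlpha(to encoder: MTLComputeCommandEncoder, alpha: Float) {
    var alpha = alpha
    // setBytes copies small constant data directly into the command stream,
    // so no separate MTLBuffer is needed for the hyperparameter.
    encoder.setBytes(&alpha, length: MemoryLayout<Float>.stride, index: 2)
}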