Elu
Elu - 22
Version
name: Elu
domain: main
since_version: 22
function: True
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 22.
Summary
Elu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the function f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0, is applied to the tensor elementwise.
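The piecewise definition above can be sketched as a plain-Python reference (a minimal sketch of the formula, not the ONNX runtime implementation; the function name is illustrative):

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    # f(x) = alpha * (exp(x) - 1) for x < 0; f(x) = x for x >= 0
    return alpha * (math.exp(x) - 1.0) if x < 0 else x

# Applied elementwise over a tensor's values:
values = [-2.0, -0.5, 0.0, 1.5]
result = [elu(v) for v in values]
```

Non-negative inputs pass through unchanged, while negative inputs are smoothly squashed toward -alpha.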
Function Body
The function definition for this operator.
<
domain: "",
opset_import: ["" : 18]
>
Elu <alpha>(X) => (Y)
{
Alpha = Constant <value_float: float = @alpha> ()
AlphaCast = CastLike (Alpha, X)
Zero = Constant <value: tensor = float {0}> ()
ZeroCast = CastLike (Zero, X)
One = Constant <value: tensor = float {1}> ()
OneCast = CastLike (One, X)
XLessThanZero = Less (X, ZeroCast)
ExpX = Exp (X)
ExpXSubOne = Sub (ExpX, OneCast)
AlphaMulExpXSubOne = Mul (AlphaCast, ExpXSubOne)
Y = Where (XLessThanZero, AlphaMulExpXSubOne, X)
}
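The function body above builds Elu from branch-free tensor primitives (Less, Exp, Sub, Mul, Where) rather than an explicit conditional. A scalar walk-through of the same dataflow (a sketch for illustration; variable names mirror the graph, but the real ops run on whole tensors):

```python
import math

def elu_via_where(x: float, alpha: float = 1.0) -> float:
    # Each step mirrors one node of the ONNX function body.
    x_less_than_zero = x < 0.0            # Less(X, ZeroCast)
    exp_x = math.exp(x)                   # Exp(X)
    exp_x_sub_one = exp_x - 1.0           # Sub(ExpX, OneCast)
    alpha_mul = alpha * exp_x_sub_one     # Mul(AlphaCast, ExpXSubOne)
    # Where(XLessThanZero, AlphaMulExpXSubOne, X): select per element
    return alpha_mul if x_less_than_zero else x
```

Computing both branches and selecting with Where keeps the graph free of control flow, which is what lets the function body lower to plain elementwise kernels.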
Attributes
alpha - FLOAT (default is '1.0'): Coefficient of ELU.
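The alpha coefficient sets the saturation level for negative inputs: f(x) approaches -alpha as x goes to negative infinity. A quick illustration, evaluating the formula directly (a sketch, not an ONNX runtime call):

```python
import math

def elu(x: float, alpha: float) -> float:
    # f(x) = alpha * (exp(x) - 1) for x < 0; f(x) = x for x >= 0
    return alpha * (math.exp(x) - 1.0) if x < 0 else x

# At a strongly negative input the output approaches -alpha;
# each value below differs from -alpha by less than 1e-6.
saturation = {alpha: elu(-20.0, alpha) for alpha in (0.5, 1.0, 2.0)}
```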
Inputs
X (heterogeneous) - T:
1D input tensor
Outputs
Y (heterogeneous) - T:
1D output tensor
Type Constraints
T in (tensor(bfloat16), tensor(double), tensor(float), tensor(float16)): Constrain input and output types to float tensors.
Elu - 6
Version
name: Elu
domain: main
since_version: 6
function: True
support_level: SupportType.COMMON
shape inference: True
This version of the operator has been available since version 6.
Summary
Elu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the function f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0, is applied to the tensor elementwise.
Function Body
The function definition for this operator.
<
domain: "",
opset_import: ["" : 18]
>
Elu <alpha>(X) => (Y)
{
Alpha = Constant <value_float: float = @alpha> ()
AlphaCast = CastLike (Alpha, X)
Zero = Constant <value: tensor = float {0}> ()
ZeroCast = CastLike (Zero, X)
One = Constant <value: tensor = float {1}> ()
OneCast = CastLike (One, X)
XLessThanZero = Less (X, ZeroCast)
ExpX = Exp (X)
ExpXSubOne = Sub (ExpX, OneCast)
AlphaMulExpXSubOne = Mul (AlphaCast, ExpXSubOne)
Y = Where (XLessThanZero, AlphaMulExpXSubOne, X)
}
Attributes
alpha - FLOAT (default is '1.0'): Coefficient of ELU.
Inputs
X (heterogeneous) - T:
1D input tensor
Outputs
Y (heterogeneous) - T:
1D output tensor
Type Constraints
T in (tensor(double), tensor(float), tensor(float16)): Constrain input and output types to float tensors.
Elu - 1
Version
name: Elu
domain: main
since_version: 1
function: False
support_level: SupportType.COMMON
shape inference: False
This version of the operator has been available since version 1.
Summary
Elu takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the function f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0, is applied to the tensor elementwise.
Attributes
alpha - FLOAT (default is '1.0'): Coefficient of ELU, defaults to 1.0.
consumed_inputs - INTS:
Legacy optimization attribute.
Inputs
X (heterogeneous) - T:
1D input tensor
Outputs
Y (heterogeneous) - T:
1D output tensor
Type Constraints
T in (tensor(double), tensor(float), tensor(float16)): Constrain input and output types to float tensors.