(l-onnx-doc-LeakyRelu)=

# LeakyRelu

(l-onnx-op-leakyrelu-16)=

## LeakyRelu - 16

### Version

- **name**: [LeakyRelu (GitHub)](https://github.com/onnx/onnx/blob/main/docs/Operators.md#LeakyRelu)
- **domain**: `main`
- **since_version**: `16`
- **function**: `True`
- **support_level**: `SupportType.COMMON`
- **shape inference**: `True`

This version of the operator has been available **since version 16**.

### Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one
output data (Tensor<T>) where the function `f(x) = alpha * x for x < 0`,
`f(x) = x for x >= 0`, is applied to the data tensor elementwise.

#### Function Body

The function definition for this operator.

```
<
  domain: "",
  opset_import: ["" : 16]
>
LeakyRelu (X) => (Y)
{
   Alpha = Constant <value_float: float = @alpha> ()
   AlphaCast = CastLike (Alpha, X)
   Zero = Constant <value: tensor = float {0}> ()
   ZeroCast = CastLike (Zero, X)
   XLessThanZero = Less (X, ZeroCast)
   AlphaMulX = Mul (AlphaCast, X)
   Y = Where (XLessThanZero, AlphaMulX, X)
}
```

### Attributes

* **alpha - FLOAT** (default is `'0.01'`):

  Coefficient of leakage.

### Inputs

- **X** (heterogeneous) - **T**:

  Input tensor

### Outputs

- **Y** (heterogeneous) - **T**:

  Output tensor

### Type Constraints

* **T** in (
  `tensor(bfloat16)`,
  `tensor(double)`,
  `tensor(float)`,
  `tensor(float16)`
  ):

  Constrain input and output types to float tensors.

```{toctree}
text_diff_LeakyRelu_6_16
```

(l-onnx-op-leakyrelu-6)=

## LeakyRelu - 6

### Version

- **name**: [LeakyRelu (GitHub)](https://github.com/onnx/onnx/blob/main/docs/Operators.md#LeakyRelu)
- **domain**: `main`
- **since_version**: `6`
- **function**: `False`
- **support_level**: `SupportType.COMMON`
- **shape inference**: `True`

This version of the operator has been available **since version 6**.

### Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one
output data (Tensor<T>) where the function `f(x) = alpha * x for x < 0`,
`f(x) = x for x >= 0`, is applied to the data tensor elementwise.

### Attributes

* **alpha - FLOAT** (default is `'0.01'`):

  Coefficient of leakage.

### Inputs

- **X** (heterogeneous) - **T**:

  Input tensor

### Outputs

- **Y** (heterogeneous) - **T**:

  Output tensor

### Type Constraints

* **T** in (
  `tensor(double)`,
  `tensor(float)`,
  `tensor(float16)`
  ):

  Constrain input and output types to float tensors.

```{toctree}
text_diff_LeakyRelu_1_16
text_diff_LeakyRelu_1_6
```

(l-onnx-op-leakyrelu-1)=

## LeakyRelu - 1

### Version

- **name**: [LeakyRelu (GitHub)](https://github.com/onnx/onnx/blob/main/docs/Operators.md#LeakyRelu)
- **domain**: `main`
- **since_version**: `1`
- **function**: `False`
- **support_level**: `SupportType.COMMON`
- **shape inference**: `False`

This version of the operator has been available **since version 1**.

### Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one
output data (Tensor<T>) where the function `f(x) = alpha * x for x < 0`,
`f(x) = x for x >= 0`, is applied to the data tensor elementwise.

### Attributes

* **alpha - FLOAT** (default is `'0.01'`):

  Coefficient of leakage; defaults to 0.01.

* **consumed_inputs - INTS**:

  Legacy optimization attribute.

### Inputs

- **X** (heterogeneous) - **T**:

  Input tensor

### Outputs

- **Y** (heterogeneous) - **T**:

  Output tensor

### Type Constraints

* **T** in (
  `tensor(double)`,
  `tensor(float)`,
  `tensor(float16)`
  ):

  Constrain input and output types to float tensors.
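
### Example

As an unofficial illustration of the formula in the summaries above, the sketch below computes LeakyRelu with NumPy and cross-checks it against a single-node opset-16 ONNX model evaluated with `onnx.reference.ReferenceEvaluator` (assumed available, i.e. onnx >= 1.13). The `leaky_relu` helper, graph name, and chosen `alpha` are illustrative, not part of the specification.

```python
import numpy as np
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator  # assumes onnx >= 1.13


def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Elementwise f(x) = alpha * x for x < 0, f(x) = x for x >= 0."""
    return np.where(x < 0, alpha * x, x)


# Single-node model using LeakyRelu from the default domain, opset 16.
node = helper.make_node("LeakyRelu", inputs=["X"], outputs=["Y"], alpha=0.2)
graph = helper.make_graph(
    [node],
    "leakyrelu_example",  # illustrative graph name
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [None])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [None])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 16)])

x = np.array([-2.0, -0.5, 0.0, 1.5], dtype=np.float32)
expected = leaky_relu(x, alpha=0.2)  # approximately [-0.4, -0.1, 0.0, 1.5]
got = ReferenceEvaluator(model).run(None, {"X": x})[0]
np.testing.assert_allclose(got, expected, rtol=1e-6)
```

The `alpha=0.2` value is picked only to make the leakage visible on the negative entries; omitting the attribute falls back to the default `0.01`.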