# Mish

## Mish - 22

### Version

• name: Mish

• domain: main

• since_version: 22

• function: True

• support_level: SupportType.COMMON

• shape inference: True

This version of the operator has been available since version 22.

### Summary

Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Applies the Mish activation function element-wise to the input tensor X:

mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
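
A minimal NumPy sketch, not part of the operator specification, showing the formula in action; `np.logaddexp(0, x)` evaluates softplus(x) = ln(1 + e^x) stably for large |x|:

```python
import numpy as np

def mish(x: np.ndarray) -> np.ndarray:
    # mish(x) = x * tanh(softplus(x)), with softplus computed as logaddexp(0, x)
    return x * np.tanh(np.logaddexp(0.0, x))

x = np.array([-2.0, 0.0, 2.0], dtype=np.float32)
print(mish(x))  # approximately [-0.2525, 0.0, 1.9440]
```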


#### Function Body

The function definition for this operator.

```
<
  domain: "",
  opset_import: ["" : 22]
>
Mish (X) => (Y)
{
  Softplus_X = Softplus (X)
  TanHSoftplusX = Tanh (Softplus_X)
  Y = Mul (X, TanHSoftplusX)
}
```
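
As a sketch of how this function body behaves in practice, the single-op model below is built with the standard `onnx.helper` API and run through `onnx.reference.ReferenceEvaluator` (this assumes an onnx release recent enough to support opset 22):

```python
import numpy as np
import onnx
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator

# Single-node graph: Y = Mish(X)
node = helper.make_node("Mish", ["X"], ["Y"])
graph = helper.make_graph(
    [node],
    "mish_demo",
    [helper.make_tensor_value_info("X", TensorProto.FLOAT, [3])],
    [helper.make_tensor_value_info("Y", TensorProto.FLOAT, [3])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 22)])
onnx.checker.check_model(model)

x = np.array([-2.0, 0.0, 2.0], dtype=np.float32)
(y,) = ReferenceEvaluator(model).run(None, {"X": x})
print(y)  # matches x * tanh(ln(1 + e^x)) element-wise
```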


### Inputs

• X (heterogeneous) - T:

Input tensor

### Outputs

• Y (heterogeneous) - T:

Output tensor

### Type Constraints

• T in ( tensor(bfloat16), tensor(double), tensor(float), tensor(float16) ):

Constrain the input X and output Y to float tensors.

## Mish - 18

### Version

• name: Mish

• domain: main

• since_version: 18

• function: True

• support_level: SupportType.COMMON

• shape inference: True

This version of the operator has been available since version 18.

### Summary

Mish: A Self Regularized Non-Monotonic Neural Activation Function.

Applies the Mish activation function element-wise to the input tensor X:

mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))


#### Function Body

The function definition for this operator.

```
<
  domain: "",
  opset_import: ["" : 18]
>
Mish (X) => (Y)
{
  Softplus_X = Softplus (X)
  TanHSoftplusX = Tanh (Softplus_X)
  Y = Mul (X, TanHSoftplusX)
}
```


### Inputs

• X (heterogeneous) - T:

Input tensor

### Outputs

• Y (heterogeneous) - T:

Output tensor

### Type Constraints

• T in ( tensor(double), tensor(float), tensor(float16) ):

Constrain the input X and output Y to float tensors.