Mish - 18 vs 22

The next section compares an older version of the same operator to a newer one after both definitions are converted into markdown text. Green means an addition in the newer version, red means a deletion. Anything else is unchanged. Two short code sketches follow the diff.

Files changed (1)
  1. Mish18 → Mish22 +2 -2
Mish18 → Mish22 RENAMED
@@ -1 +1 @@
  Mish: A Self Regularized Non-Monotonic Neural Activation Function.
  Perform the linear unit element-wise on the input tensor X using formula:
  mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^{x}))
  #### Function Body
  The function definition for this operator.
  <
  domain: "",
- opset_import: ["" : 18]
+ opset_import: ["" : 22]
  >
  Mish (X) => (Y)
  {
  Softplus_X = Softplus (X)
  TanHSoftplusX = Tanh (Softplus_X)
  Y = Mul (X, TanHSoftplusX)
  }
  ### Inputs
  - **X** (heterogeneous) - **T**:
  Input tensor
  ### Outputs
  - **Y** (heterogeneous) - **T**:
  Output tensor
  ### Type Constraints
- * **T** in ( tensor(double), tensor(float), tensor(float16) ):
+ * **T** in ( tensor(bfloat16), tensor(double), tensor(float), tensor(float16) ):
  Constrain input X and output types to float tensors.
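
For readers who want to check the arithmetic, here is a minimal NumPy sketch of what the function body computes, following the same Softplus → Tanh → Mul decomposition. The numerically stable form of softplus used below is an implementation choice, not part of the operator definition.

```python
import numpy as np

def softplus(x: np.ndarray) -> np.ndarray:
    # Numerically stable softplus: ln(1 + e^x) = max(x, 0) + log1p(e^(-|x|))
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

def mish(x: np.ndarray) -> np.ndarray:
    # Mirrors the function body:
    #   Softplus_X    = Softplus(X)
    #   TanHSoftplusX = Tanh(Softplus_X)
    #   Y             = Mul(X, TanHSoftplusX)
    return x * np.tanh(softplus(x))

x = np.linspace(-4.0, 4.0, 9, dtype=np.float32)
print(mish(x))  # mish(0) == 0; mish(x) approaches x for large positive x
```

Because tanh(softplus(x)) lies in (0, 1) for all finite x, the output is a smoothly gated copy of the input, which is what makes Mish a non-monotonic but self-regularizing activation.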
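The two changes in the diff, the opset bump to 22 and the new tensor(bfloat16) entry in T, can be exercised together. Below is a hedged sketch using the standard onnx Python helpers (make_node, make_graph, make_model, make_opsetid); it assumes an onnx release recent enough to ship opset 22, and the graph and tensor names are arbitrary placeholders.

```python
import onnx
from onnx import TensorProto, helper

# A one-node graph whose bfloat16 input/output is only permitted by
# the widened T constraint of Mish-22.
node = helper.make_node("Mish", inputs=["X"], outputs=["Y"])
graph = helper.make_graph(
    [node],
    "mish_bf16",  # arbitrary graph name
    [helper.make_tensor_value_info("X", TensorProto.BFLOAT16, [None])],
    [helper.make_tensor_value_info("Y", TensorProto.BFLOAT16, [None])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 22)])

# full_check=True runs shape inference, which validates the node's
# types against the operator's type constraints.
onnx.checker.check_model(model, full_check=True)
```

With opset_imports pinned to 18 instead, the same bfloat16 graph would not satisfy Mish-18's narrower T constraint of double, float, and float16.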