mish

paddle.fluid.layers.nn.mish(x, threshold=20, name=None) [source]

This operator implements the Mish activation function. Refer to the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function (https://arxiv.org/abs/1908.08681).

The formula is as follows if threshold is None or negative:

\[ out = x \ast \tanh(\ln(1 + e^{x})) \]

The formula is as follows if threshold is set to a positive value:

\[ out = \begin{cases} x \ast \tanh(x), & \text{if } x > \text{threshold} \\ x \ast \tanh(e^{x}), & \text{if } x < -\text{threshold} \\ x \ast \tanh(\ln(1 + e^{x})), & \text{otherwise} \end{cases} \]
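The piecewise form only swaps in the asymptotes of softplus for numerical stability: softplus(x) ≈ x for large positive x and softplus(x) ≈ e^x for large negative x, while the exact ln(1 + e^x) is used in between. The snippet below is a small NumPy sketch of these formulas for reference; mish_reference is an illustrative helper introduced here, not part of the Paddle API or the operator's actual kernel.

import numpy as np

def mish_reference(x, threshold=20.0):
    # Illustrative reference for the formulas above; not the Paddle kernel.
    x = np.asarray(x, dtype=np.float64)
    if threshold is None or threshold <= 0:
        # No approximation: out = x * tanh(ln(1 + e^x))
        return x * np.tanh(np.log1p(np.exp(x)))
    # Replace softplus(x) by its asymptotes outside [-threshold, threshold]:
    #   softplus(x) ~= x    for x >  threshold
    #   softplus(x) ~= e^x  for x < -threshold
    # (np.exp may emit an overflow warning for very large inputs; the
    #  np.where still selects the correct branch afterwards.)
    softplus = np.log1p(np.exp(x))
    softplus = np.where(x > threshold, x, softplus)
    softplus = np.where(x < -threshold, np.exp(x), softplus)
    return x * np.tanh(softplus)

print(mish_reference([-1.0, 0.0, 1.0, 2.0]))  # approximately [-0.3034  0.  0.8651  1.944]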
Parameters
  • x (Variable) – Input feature, multi-dimensional Tensor. The data type should be float16, float32 or float64.

  • threshold (float|None) – Threshold for the softplus in the Mish operator. If threshold is set to a positive value, an approximate value of softplus is used when the absolute value of the input is greater than threshold. If threshold is None or negative, no approximation is used. Default: 20.

  • name (str, optional) – The default value is None. Normally there is no need for the user to set this property. For more information, please refer to Name.

Returns

The output Tensor with the same shape and data type as the input x.

Return type

Variable

Examples:

import paddle.fluid as fluid
import numpy as np

DATATYPE = 'float32'

# A [1, 1, 4] input holding the values 1, 2, 3, 4.
x_data = np.array([i for i in range(1, 5)]).reshape([1, 1, 4]).astype(DATATYPE)

x = fluid.data(name="x", shape=[None, 1, 4], dtype=DATATYPE)
y = fluid.layers.mish(x)

place = fluid.CPUPlace()
# place = fluid.CUDAPlace(0)  # uncomment to run on GPU
exe = fluid.Executor(place)
out, = exe.run(feed={'x': x_data}, fetch_list=[y.name])
print(out)  # approximately [[0.8651, 1.9440, 2.9865, 3.9974]]
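Continuing the example, the printed values can be cross-checked against the exact formula above with NumPy. This is only a quick sanity check, not part of the original snippet, and the tolerance is chosen arbitrarily:

import numpy as np

# out = x * tanh(ln(1 + e^x)) for the inputs above (all well inside the threshold).
expected = x_data * np.tanh(np.log1p(np.exp(x_data)))
print(np.allclose(out, expected, atol=1e-5))  # expected to print True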