silu

paddle.nn.functional.silu(x, name=None) [source]

SiLU (Sigmoid Linear Unit) activation.

\[silu(x) = \frac{x}{1 + e^{-x}}\]
Parameters
  • x (Tensor) – The input Tensor with data type float32 or float64.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.

Returns

A Tensor with the same data type and shape as x.

Examples

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
out = F.silu(x)  # [0.731059, 1.761594, 2.857722, 3.928055]
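
As a quick sanity check (not part of the official example above), the output can be compared against the elementwise form x * sigmoid(x), which the formula above is equivalent to. This sketch assumes paddle.nn.functional.sigmoid and paddle.allclose behave as in current Paddle releases:

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])

# silu(x) = x / (1 + e^{-x}) = x * sigmoid(x), so the two results
# should agree up to floating-point tolerance.
out = F.silu(x)
reference = x * F.sigmoid(x)

print(paddle.allclose(out, reference))  # expected: True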