selu
paddle.nn.functional.selu(x, scale=1.0507009873554805, alpha=1.6732632423543772, name=None) [source]
selu activation
$$\mathrm{selu}(x) = scale \ast \begin{cases} x, & \text{if } x > 0 \\ alpha \ast e^{x} - alpha, & \text{if } x \le 0 \end{cases}$$

Parameters
x (Tensor) – The input Tensor with data type float32 or float64.
scale (float, optional) – The value of scale (must be greater than 1.0) for selu. Default is 1.0507009873554804934193349852946.
alpha (float, optional) – The value of alpha (must be no less than zero) for selu. Default is 1.6732632423543772848170429916717. A short sketch with custom scale and alpha follows this parameter list.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
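As a hedged illustration of the two tunable parameters, the sketch below overrides the defaults with arbitrary in-range values; the outputs in the comments are computed by hand from the formula above.

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[-1.0, 0.5]])
# Custom scale/alpha; scale must stay above 1.0 and alpha must be non-negative.
out = F.selu(x, scale=2.0, alpha=1.0)
# selu(-1.0) = 2.0 * (1.0 * e**-1 - 1.0) ≈ -1.264241
# selu(0.5)  = 2.0 * 0.5                 = 1.0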
Returns

A Tensor with the same data type and shape as x.
Examples
import paddle
import paddle.nn.functional as F
import numpy as np

x = paddle.to_tensor(np.array([[0.0, 1.0], [2.0, 3.0]]))
out = F.selu(x)
# [[0.      , 1.050701],
#  [2.101402, 3.152103]]
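As a cross-check of the example output, here is a minimal NumPy sketch of the formula above; selu_ref is a hypothetical helper written for this page, not part of Paddle.

import numpy as np

scale = 1.0507009873554805
alpha = 1.6732632423543772

def selu_ref(x):
    # scale * x for positive entries, scale * (alpha * e**x - alpha) otherwise
    return scale * np.where(x > 0, x, alpha * np.exp(x) - alpha)

print(selu_ref(np.array([[0.0, 1.0], [2.0, 3.0]])))
# [[0.         1.05070099]
#  [2.10140197 3.15210296]]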