leaky_relu
paddle.nn.functional.leaky_relu(x, negative_slope=0.01, name=None)
leaky_relu activation. The calculation formula is:
\[\begin{split}leaky\_relu(x)=
    \left\{
        \begin{array}{rcl}
            x, & & if \ x >= 0 \\
            negative\_slope * x, & & otherwise \\
        \end{array}
    \right.\end{split}\]
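The piecewise definition above can be reproduced with a plain elementwise select; the snippet below is a minimal sketch using paddle.where (an informal check of the formula, not how the operator is implemented internally):

>>> import paddle
>>> x = paddle.to_tensor([-2., 0., 1.])
>>> # keep x where it is non-negative, otherwise scale it by the slope (0.01 here)
>>> paddle.where(x >= 0, x, 0.01 * x)  # same values as F.leaky_relu(x): [-0.02, 0., 1.]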
Parameters
x (Tensor) – The input Tensor with data type float32, float64.
negative_slope (float, optional) – Slope of the activation function at \(x < 0\) . Default is 0.01.
name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.
Returns
A Tensor with the same data type and shape as x.
Examples
>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([-2., 0., 1.])
>>> out = F.leaky_relu(x)
>>> print(out)
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [-0.02000000,  0.        ,  1.        ])
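A non-default slope can also be passed through the negative_slope argument; the value 0.2 below is purely illustrative:

>>> out = F.leaky_relu(x, negative_slope=0.2)
>>> # every negative element is multiplied by 0.2, so -2. maps to -0.4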