LeakyReLU
class paddle.nn.LeakyReLU(negative_slope: float = 0.01, name: Optional[str] = None) [source]
Leaky ReLU activation. Creates a callable object of LeakyReLU that computes the Leaky ReLU of the input x.
$$\mathrm{LeakyReLU}(x) = \begin{cases} x, & \text{if } x \ge 0 \\ \text{negative\_slope} \times x, & \text{otherwise} \end{cases}$$

Parameters
negative_slope (float, optional) – Slope of the activation function at x < 0. Default is 0.01.
name (str|None, optional) – Name for the operation. Default is None. For more information, please refer to Name.
Shape:
input: Tensor with any shape.
output: Tensor with the same shape as input.
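For illustration, the piecewise definition above can be reproduced elementwise with paddle.where; the following is a minimal reference sketch (the helper name leaky_relu_reference is hypothetical, not part of the API):

import paddle

def leaky_relu_reference(x, negative_slope=0.01):
    # Elementwise piecewise formula: x where x >= 0, negative_slope * x otherwise.
    return paddle.where(x >= 0, x, negative_slope * x)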
Examples
>>> import paddle
>>> m = paddle.nn.LeakyReLU()
>>> x = paddle.to_tensor([-2.0, 0, 1])
>>> out = m(x)
>>> print(out)
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [-0.02000000,  0.        ,  1.        ])
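A second sketch with a non-default slope (negative_slope=0.1 is an arbitrary choice for illustration):

>>> import paddle
>>> m = paddle.nn.LeakyReLU(negative_slope=0.1)
>>> x = paddle.to_tensor([-2.0, 0.0, 1.0])
>>> out = m(x)  # negative inputs are scaled by 0.1: [-0.2, 0., 1.]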
forward(x: Tensor) → Tensor
Defines the computation performed at every call. Should be overridden by all subclasses.
Parameters
*inputs (tuple) – unpacked tuple arguments
**kwargs (dict) – unpacked dict arguments
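In practice, forward is invoked by calling the layer object itself; a minimal sketch of the calling convention (assuming the usual paddle.nn.Layer dispatch):

>>> import paddle
>>> m = paddle.nn.LeakyReLU()
>>> x = paddle.to_tensor([-1.0, 2.0])
>>> y1 = m(x)          # calling the layer dispatches to forward
>>> y2 = m.forward(x)  # equivalent direct call for this layer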
extra_repr() → str
Extra representation of this layer. You can provide a custom implementation in your own layer.
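A minimal sketch of a custom layer overriding extra_repr (the class name MyLeaky and its fields are hypothetical); the returned string then appears in the layer's printed representation, e.g. when calling print(MyLeaky(0.2)):

import paddle

class MyLeaky(paddle.nn.Layer):
    def __init__(self, negative_slope=0.01):
        super().__init__()
        self.negative_slope = negative_slope

    def forward(self, x):
        return paddle.nn.functional.leaky_relu(x, negative_slope=self.negative_slope)

    def extra_repr(self):
        # Shown inside the layer's printed representation.
        return f'negative_slope={self.negative_slope}'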