ELU
class paddle.nn.ELU(alpha=1.0, name=None)
ELU Activation.
\[ELU(x)= \left\{ \begin{array}{lcl} x, & & \text{if } \ x > 0 \\ \alpha * (e^{x} - 1), & & \text{if } \ x \le 0 \end{array} \right.\]
Parameters
alpha (float, optional) – The ‘alpha’ value of the ELU formulation. Default is 1.0.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
Shape:
input: Tensor with any shape.
output: Tensor with the same shape as input.
Examples
>>> import paddle

>>> x = paddle.to_tensor([[-1., 6.], [1., 15.6]])
>>> m = paddle.nn.ELU(0.2)
>>> out = m(x)
>>> print(out)
Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
       [[-0.12642412,  6.        ],
        [ 1.        , 15.60000038]])
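As a quick sanity check (not part of the original example), the first entry of the output can be reproduced from the formula above using only the standard-library math module: with alpha = 0.2 and x = -1.0, the x <= 0 branch applies.

>>> import math

>>> # ELU branch for x <= 0: alpha * (e^x - 1), with alpha = 0.2, x = -1.0
>>> 0.2 * (math.exp(-1.0) - 1.0)
-0.12642411176571153

The float32 tensor above rounds this value to -0.12642412.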
forward(x)
Defines the computation performed at every call. Should be overridden by all subclasses.
Parameters
*inputs (tuple) – unpacked tuple arguments
**kwargs (dict) – unpacked dict arguments
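A minimal sketch of overriding forward in a subclass of paddle.nn.Layer (ScaledELU is a hypothetical layer used only for illustration, not part of Paddle): the layer defines its computation in forward, and calling the layer instance dispatches to it.

>>> import paddle

>>> class ScaledELU(paddle.nn.Layer):  # hypothetical example layer
...     def __init__(self, alpha=1.0, scale=2.0):
...         super().__init__()
...         self.elu = paddle.nn.ELU(alpha)
...         self.scale = scale
...     def forward(self, x):  # runs on every call of the layer instance
...         return self.scale * self.elu(x)

>>> m = ScaledELU(alpha=0.2)
>>> out = m(paddle.to_tensor([-1.0, 1.0]))  # dispatches to forward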
extra_repr()
Extra representation of this layer. You can override it to provide a custom representation for your own layer.
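A minimal sketch of a custom extra_repr (VerboseELU is a hypothetical layer; it assumes paddle.nn.functional.elu, the functional form of this activation). The string returned by extra_repr is included when the layer is printed.

>>> import paddle

>>> class VerboseELU(paddle.nn.Layer):  # hypothetical example layer
...     def __init__(self, alpha=1.0):
...         super().__init__()
...         self.alpha = alpha
...     def forward(self, x):
...         return paddle.nn.functional.elu(x, self.alpha)
...     def extra_repr(self):  # shown when the layer is printed
...         return f'alpha={self.alpha}'

>>> print(VerboseELU(alpha=0.2))
VerboseELU(alpha=0.2)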