SELU
class paddle.nn.SELU(scale=1.0507009873554805, alpha=1.6732632423543772, name=None)
SELU Activation
\[SELU(x) = scale * \begin{cases} x, & \text{if } x > 0 \\ alpha * e^{x} - alpha, & \text{if } x \le 0 \end{cases}\]

Parameters
scale (float, optional) – The value of scale (must be greater than 1.0) for SELU. Default is 1.0507009873554804934193349852946.
alpha (float, optional) – The value of alpha (must be no less than zero) for SELU. Default is 1.6732632423543772848170429916717.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
Shape:
input: Tensor with any shape.
output: Tensor with the same shape as input.
Examples
import paddle

x = paddle.to_tensor([[0.0, 1.0], [2.0, 3.0]])
m = paddle.nn.SELU()
out = m(x)
print(out)
# [[0, 1.050701], [2.101402, 3.152103]]
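To see how these numbers follow from the formula, here is a minimal sketch (not part of the official docs) that evaluates SELU on a single scalar with plain Python math, assuming the default scale and alpha shown in the signature above.

import math

scale = 1.0507009873554805
alpha = 1.6732632423543772

def selu_scalar(x):
    # scale * x                       if x > 0
    # scale * (alpha * e**x - alpha)  if x <= 0
    if x > 0:
        return scale * x
    return scale * (alpha * math.exp(x) - alpha)

print(selu_scalar(1.0))   # 1.0507009873554805, matching the 1.050701 printed above
print(selu_scalar(-1.0))  # approximately -1.111331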
forward(x)
Defines the computation performed at every call. Should be overridden by all subclasses.
Parameters
*inputs (tuple) – unpacked tuple arguments
**kwargs (dict) – unpacked dict arguments
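Since forward defines the computation run on every call, a subclass of paddle.nn.Layer only needs to override it; calling the layer instance dispatches to this method. The following sketch is a hypothetical illustration (the ScaledSELU class is not part of paddle):

import paddle

class ScaledSELU(paddle.nn.Layer):
    # Hypothetical layer: applies SELU, then scales the result by a constant.
    def __init__(self, factor=2.0):
        super().__init__()
        self.factor = factor
        self.selu = paddle.nn.SELU()

    def forward(self, x):
        # layer(x) dispatches here; all computation is defined in forward.
        return self.factor * self.selu(x)

layer = ScaledSELU()
out = layer(paddle.to_tensor([-1.0, 1.0]))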
extra_repr()
Extra representation of this layer. You can provide a custom implementation of it in your own layer.
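A minimal sketch of how extra_repr can be used (the Clip layer below is hypothetical, not a paddle API): the returned string is embedded in the layer's printed representation, which makes constructor arguments visible when the layer is printed.

import paddle

class Clip(paddle.nn.Layer):
    # Hypothetical layer that clips inputs to the range [-limit, limit].
    def __init__(self, limit=1.0):
        super().__init__()
        self.limit = limit

    def forward(self, x):
        return paddle.clip(x, -self.limit, self.limit)

    def extra_repr(self):
        # Shown inside the parentheses when the layer is printed.
        return 'limit={}'.format(self.limit)

print(Clip(0.5))  # expected to print something like: Clip(limit=0.5)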