leaky_relu
- paddle.fluid.layers.nn.leaky_relu(x, alpha=0.02, name=None)
LeakyRelu Activation Operator.
\(out = \max(x, \alpha * x)\)
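For reference, the formula above can be written as a small NumPy sketch; the helper name leaky_relu_ref is illustrative only and is not part of the Paddle API:

import numpy as np

def leaky_relu_ref(x, alpha=0.02):
    # Elementwise out = max(x, alpha * x): positive entries pass through
    # unchanged, negative entries are scaled by alpha.
    return np.maximum(x, alpha * x)

# leaky_relu_ref(np.array([-1.0, 2.0]), alpha=0.1) -> array([-0.1, 2.0])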
- Parameters
x (Variable) – A LoDTensor or Tensor representing the pre-activation values. Its data type must be float32 or float64.
alpha (float, optional) – Slope of the activation function for x < 0. Default is 0.02.
name (str, optional) – The default value is None. Normally there is no need for users to set this property. For more information, please refer to Name.
- Returns
A LoDTensor or Tensor with the same data type and shape as x
- Return type
Variable
Examples
import paddle

x = paddle.to_tensor([[-1, 2], [3, -4]], dtype='float32')
y = paddle.fluid.layers.leaky_relu(x, alpha=0.1)
print(y)
# [[-0.1, 2], [3, -0.4]]
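Note that paddle.fluid.layers.leaky_relu belongs to the legacy fluid namespace. In Paddle 2.x releases the same activation is also exposed through the functional API; the sketch below assumes paddle.nn.functional.leaky_relu with its negative_slope argument playing the role of alpha, so verify against the release you are using:

import paddle

x = paddle.to_tensor([[-1, 2], [3, -4]], dtype='float32')
# Assumed Paddle 2.x equivalent of the fluid call above (negative_slope ~= alpha).
y = paddle.nn.functional.leaky_relu(x, negative_slope=0.1)
print(y)
# Expected: [[-0.1, 2], [3, -0.4]]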