PReLU
class paddle.nn.PReLU(num_parameters=1, init=0.25, weight_attr=None, data_format='NCHW', name=None) [source]
PReLU Activation. The calculation formula is as follows:

\[PReLU(x) = \max(0, x) + weight * \min(0, x)\]

where x is the input Tensor.
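The formula can be reproduced with plain element-wise ops. Below is a minimal sketch (not part of the API), assuming the default scalar weight of 0.25:

>>> import paddle
>>> x = paddle.to_tensor([-2.0, 0.0, 3.0])
>>> zero = paddle.zeros_like(x)
>>> # max(0, x) + weight * min(0, x), computed by hand
>>> manual = paddle.maximum(x, zero) + 0.25 * paddle.minimum(x, zero)
>>> print(manual.numpy())
[-0.5  0.   3. ]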
- Parameters
-
num_parameters (int, optional) – Number of weights to learn. The supported values are 1 (a single parameter alpha is shared across all input channels) or the number of channels (a separate alpha is learned for each input channel). Default is 1.
init (float, optional) – Initial value of the learnable weight. Default is 0.25.
weight_attr (ParamAttr, optional) – The parameter attribute for the learnable weight. Default is None. For more information, please refer to ParamAttr.
data_format (str, optional) – Data format that specifies the layout of the input. It may be “NC”, “NCL”, “NCHW”, “NCDHW”, “NLC”, “NHWC” or “NDHWC”. Default: “NCHW”.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
- Shape:
-
input: Tensor with any shape. Default dtype is float32.
output: Tensor with the same shape as input.
Examples
>>> import paddle

>>> data = paddle.to_tensor([[[[-2.0,  3.0, -4.0,  5.0],
...                            [ 3.0, -4.0,  5.0, -6.0],
...                            [-7.0, -8.0,  8.0,  9.0]],
...                           [[ 1.0, -2.0, -3.0,  4.0],
...                            [-5.0,  6.0,  7.0, -8.0],
...                            [ 6.0,  7.0,  8.0,  9.0]]]])
>>> m = paddle.nn.PReLU(1, 0.25)
>>> out = m(data)
>>> print(out)
Tensor(shape=[1, 2, 3, 4], dtype=float32, place=Place(cpu), stop_gradient=False,
       [[[[-0.50000000,  3.        , -1.        ,  5.        ],
          [ 3.        , -1.        ,  5.        , -1.50000000],
          [-1.75000000, -2.        ,  8.        ,  9.        ]],
         [[ 1.        , -0.50000000, -0.75000000,  4.        ],
          [-1.25000000,  6.        ,  7.        , -2.        ],
          [ 6.        ,  7.        ,  8.        ,  9.        ]]]])
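To learn a separate alpha per channel instead, a hedged sketch (assuming a 4-D NCHW input with 2 channels, so num_parameters is set to the channel count):

>>> import paddle
>>> x = paddle.rand([1, 2, 3, 4])
>>> # one learnable alpha per input channel
>>> m = paddle.nn.PReLU(num_parameters=2, init=0.1)
>>> out = m(x)
>>> print(out.shape)
[1, 2, 3, 4]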
forward(x)
-
Defines the computation performed at every call. Should be overridden by all subclasses.
- Parameters
-
*inputs (tuple) – Unpacked tuple arguments.
**kwargs (dict) – Unpacked dict arguments.
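As noted above, forward should be overridden by subclasses. A minimal sketch of a custom Layer (the Scale class and its factor argument are hypothetical names used only for illustration):

>>> import paddle
>>> class Scale(paddle.nn.Layer):
...     def __init__(self, factor=2.0):
...         super().__init__()
...         self.factor = factor
...     def forward(self, x):
...         # runs on every layer(x) call
...         return x * self.factor
...
>>> layer = Scale(3.0)
>>> print(layer(paddle.to_tensor([1.0, -2.0])).numpy())
[ 3. -6.]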
extra_repr()
-
Extra representation of this layer. You can override it to provide a customized representation for your own layer.
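A minimal sketch of a custom extra_repr (MyLayer and its size attribute are hypothetical), assuming the Layer's printed representation folds the returned string into the parentheses:

>>> import paddle
>>> class MyLayer(paddle.nn.Layer):
...     def __init__(self, size=10):
...         super().__init__()
...         self.size = size
...     def extra_repr(self):
...         # appears inside MyLayer(...) when the layer is printed
...         return f'size={self.size}'
...
>>> print(MyLayer())
MyLayer(size=10)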