ThresholdedReLU
class paddle.nn.ThresholdedReLU(threshold=1.0, name=None) [source]
Thresholded ReLU Activation
\[ThresholdedReLU(x) = \begin{cases} x, & \text{if } x > threshold \\ 0, & \text{otherwise} \end{cases}\]

Parameters
threshold (float, optional) – The threshold value for ThresholdedReLU. Default is 1.0.
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
Shape:
input: Tensor with any shape.
output: Tensor with the same shape as input.
Examples
>>> import paddle
>>> x = paddle.to_tensor([2., 0., 1.])
>>> m = paddle.nn.ThresholdedReLU()
>>> out = m(x)
>>> print(out)
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
[2., 0., 0.])
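A minimal sketch of the same elementwise rule written out with paddle.where, plus a non-default threshold. The printed outputs assume a CPU place as in the example above and are shown only for illustration.

>>> import paddle
>>> x = paddle.to_tensor([2., 0., 1.])
>>> # Elementwise rule from the formula above: keep x where x > threshold, otherwise 0.
>>> manual = paddle.where(x > 1.0, x, paddle.zeros_like(x))
>>> print(manual)
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
[2., 0., 0.])
>>> # Non-default threshold: 1. > 0.5, so the last element is kept this time.
>>> m = paddle.nn.ThresholdedReLU(threshold=0.5)
>>> print(m(x))
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
[2., 0., 1.])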
forward(x)
Defines the computation performed at every call. Should be overridden by all subclasses.
Parameters
*inputs (tuple) – unpacked tuple arguments
**kwargs (dict) – unpacked dict arguments
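For context, a small sketch (not part of the API reference) showing that calling the layer instance dispatches to forward:

>>> import paddle
>>> m = paddle.nn.ThresholdedReLU()
>>> x = paddle.to_tensor([0.5, 1.5])
>>> out = m(x)  # calling the layer invokes forward(x) under the hood
>>> # out is [0., 1.5]: 0.5 <= 1.0 is zeroed, 1.5 > 1.0 is kept.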
extra_repr()
Extra representation of this layer; you can provide a custom implementation in your own layer.
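A minimal sketch of overriding extra_repr in a custom layer. ScaleLayer is a hypothetical name used only for illustration, and the printed repr is the expected form rather than guaranteed output.

>>> import paddle
>>> class ScaleLayer(paddle.nn.Layer):
...     # Hypothetical layer used only to illustrate extra_repr.
...     def __init__(self, scale=2.0):
...         super().__init__()
...         self.scale = scale
...     def forward(self, x):
...         return x * self.scale
...     def extra_repr(self):
...         # The returned string is included in this layer's repr.
...         return 'scale={}'.format(self.scale)
...
>>> print(ScaleLayer())
ScaleLayer(scale=2.0)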