relu
paddle.nn.functional.relu(x, name=None)
[Source code]
relu activation (Rectified Linear Unit). The formula is:
\[\mathrm{relu}(x) = \max(0, x)\]
where \(x\) is the input Tensor.
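Element-wise, the formula simply clips negative values to zero while leaving non-negative values unchanged. A minimal pure-Python sketch (independent of Paddle) illustrates the computation:

```python
def relu(values):
    # relu(x) = max(0, x), applied element-wise
    return [max(0.0, v) for v in values]

print(relu([-2.0, 0.0, 1.0]))  # [0.0, 0.0, 1.0]
```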
Code example
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([-2, 0, 1], dtype='float32')
out = F.relu(x)
print(out)  # values: [0., 0., 1.]