brelu

paddle.fluid.layers.nn.brelu(x, t_min=0.0, t_max=24.0, name=None) [source]

BRelu Activation Operator.

\(out = \min(\max(x, t_{min}), t_{max})\)
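As a plain NumPy reference (a sketch for illustration only, not part of the Paddle API), the formula is simply an elementwise clip of the input between t_min and t_max:

import numpy as np

def brelu_ref(x, t_min=0.0, t_max=24.0):
    # Elementwise clip: out = min(max(x, t_min), t_max)
    return np.minimum(np.maximum(x, t_min), t_max)

print(brelu_ref(np.array([-1.0, 6.0, 30.0])))
# [ 0.  6. 24.]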

Parameters
  • x (Variable) – The input is a multi-dimensional Tensor. The data type can be float32 or float64.

  • t_min (float, optional) – The lower bound of BRelu. Default: 0.0.

  • t_max (float, optional) – The upper bound of BRelu. Default: 24.0.

  • name (str|None) – The default value is None. Normally there is no need for users to set this property. For more information, please refer to Name.

Returns

The output is a multi-dimensional Tensor with the same shape and data type as the input x.

Return type

Variable

Examples:

import paddle.fluid as fluid
import numpy as np

# Run brelu in dygraph (imperative) mode.
input_brelu = np.array([[-1, 6], [1, 15.6]])
with fluid.dygraph.guard():
    x = fluid.dygraph.to_variable(input_brelu)
    y = fluid.layers.brelu(x, t_min=1.0, t_max=10.0)
    print(y.numpy())
    # [[ 1.  6.]
    #  [ 1. 10.]]
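
The operator can also be used in a static-graph program. The following is a minimal sketch assuming Paddle 2.x, where static mode must be enabled explicitly; the feed name 'x' and the placeholder shape are chosen for illustration.

import numpy as np
import paddle
import paddle.fluid as fluid

paddle.enable_static()  # assumption: Paddle 2.x, where fluid static graphs require this call

# Declare a placeholder and apply brelu in the default static program.
x = fluid.data(name='x', shape=[-1, 2], dtype='float32')
y = fluid.layers.brelu(x, t_min=1.0, t_max=10.0)

exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

data = np.array([[-1.0, 6.0], [1.0, 15.6]], dtype='float32')
out, = exe.run(fluid.default_main_program(), feed={'x': data}, fetch_list=[y])
print(out)
# [[ 1.  6.]
#  [ 1. 10.]]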