layer_norm¶
- paddle.static.nn.layer_norm(input, scale=True, shift=True, begin_norm_axis=1, epsilon=1e-05, param_attr=None, bias_attr=None, act=None, name=None) [source]
-
- Api_attr
-
Static Graph
Layer Normalization Layer
The API implements the function of the Layer Normalization Layer and can be applied to mini-batch input data. Refer to Layer Normalization.
The formula is as follows:
\mu = \frac{1}{H}\sum_{i=1}^{H} x_i

\sigma = \sqrt{\frac{1}{H}\sum_{i=1}^{H}(x_i - \mu)^2 + \epsilon}

y = f\left(\frac{g}{\sigma}(x - \mu) + b\right)

x: the vector representation of the summed inputs to the neurons in that layer.
H: the number of hidden units in a layer.
epsilon: the small value added to the variance to prevent division by zero.
g: the trainable scale parameter.
b: the trainable bias parameter.
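As an illustration of the formula only (not part of the API), the following is a minimal NumPy sketch that normalizes a single vector of H summed inputs, assuming the default identity activation and the default initial values g = 1 and b = 0:

import numpy as np

# Minimal sketch of the formula above for one vector x with H hidden units.
H = 4
x = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
g = np.ones(H, dtype=np.float32)   # trainable gain, initialized to 1
b = np.zeros(H, dtype=np.float32)  # trainable bias, initialized to 0
epsilon = 1e-05

mu = x.mean()                                       # mu = (1/H) * sum(x_i)
sigma = np.sqrt(((x - mu) ** 2).mean() + epsilon)   # epsilon added under the square root
y = g / sigma * (x - mu) + b                        # f is the identity when act is None
print(y)  # approximately [-1.342, -0.447, 0.447, 1.342]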
- Parameters
-
input (Tensor) – A multi-dimension Tensor, and the data type is float32 or float64.
scale (bool, optional) – Whether to learn the adaptive gain g after normalization. Default: True.
shift (bool, optional) – Whether to learn the adaptive bias b after normalization. Default: True.
begin_norm_axis (int, optional) – The normalization will be performed along dimensions from begin_norm_axis to rank(input). Default: 1.
epsilon (float, optional) – The small value added to the variance to prevent division by zero. Default: 1e-05.
param_attr (ParamAttr, optional) – The parameter attribute for the learnable gain g. If scale is False, param_attr is omitted. If scale is True and param_attr is None, a default ParamAttr would be added as the scale. The param_attr is initialized as 1 if it is added. Default: None.
bias_attr (ParamAttr, optional) – The parameter attribute for the learnable bias b. If shift is False, bias_attr is omitted. If shift is True and bias_attr is None, a default ParamAttr would be added as the bias. The bias_attr is initialized as 0 if it is added. Default: None.
act (str, optional) – Activation to be applied to the output of layer normalization. Default: None.
name (str, optional) – The default value is None. Normally there is no need for the user to set this property. For more information, please refer to Name.
- Returns
-
A Tensor indicating the normalized result; its data type and dimensions are the same as input.
- Return type
-
Tensor
Examples
import paddle

paddle.enable_static()

x = paddle.static.data(name='x', shape=[8, 32, 32], dtype='float32')
output = paddle.static.nn.layer_norm(input=x, begin_norm_axis=1)
print(output.shape)  # [8, 32, 32]
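As a further hedged sketch (not from the original page), the example below normalizes only over the last dimension by setting begin_norm_axis=2 and passes an explicit ParamAttr for the gain; the parameter name 'ln_scale' and the relu activation are illustrative choices, and the output shape still matches the input:

import paddle

paddle.enable_static()

x = paddle.static.data(name='x2', shape=[8, 32, 32], dtype='float32')
# Illustrative ParamAttr for the learnable gain g; the name is an arbitrary choice.
scale_attr = paddle.ParamAttr(name='ln_scale',
                              initializer=paddle.nn.initializer.Constant(1.0))
# Normalize over dimensions from begin_norm_axis (here the last axis) to rank(input).
output = paddle.static.nn.layer_norm(input=x, begin_norm_axis=2,
                                     param_attr=scale_attr, act='relu')
print(output.shape)  # [8, 32, 32]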