remove_weight_norm¶
Removes weight normalization from the given layer.
Parameters¶
layer (Layer) - The layer from which weight normalization should be removed.
name (str, optional) - Name of the weight parameter. Default: weight.
Returns¶
Layer, the layer with the weight normalization hook removed.
Code Example¶
>>> import paddle
>>> from paddle.nn import Conv2D
>>> from paddle.nn.utils import weight_norm, remove_weight_norm
>>> paddle.seed(2023)
>>> conv = Conv2D(3, 5, 3)
>>> wn = weight_norm(conv)
>>> print(conv.weight_g)
Parameter containing:
Tensor(shape=[5], dtype=float32, place=Place(cpu), stop_gradient=False,
[1.35883713, 1.32126212, 1.56303072, 1.20874095, 1.22893476])
>>> remove_weight_norm(conv)
>>> # The following is the effect after removing the weight norm:
>>> # print(conv.weight_g)
>>> # AttributeError: 'Conv2D' object has no attribute 'weight_g'
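The example above uses the default parameter name. As a minimal additional sketch (using a hypothetical Linear layer for illustration), the optional name argument selects which parameter the weight normalization hook was registered on; after removal, the decomposed weight_g / weight_v parameters are gone and the layer exposes the original parameter again:

>>> import paddle
>>> from paddle.nn import Linear
>>> from paddle.nn.utils import weight_norm, remove_weight_norm
>>> linear = Linear(4, 8)
>>> linear = weight_norm(linear, name='weight')           # registers weight_g and weight_v
>>> linear = remove_weight_norm(linear, name='weight')    # restores the original 'weight' parameter
>>> print(hasattr(linear, 'weight_g'))
False
>>> print(linear.weight.shape)
[4, 8]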