remove_weight_norm
Remove weight normalization from the given layer. Weight normalization reparameterizes the layer's weight into a magnitude parameter (e.g. weight_g) and a direction parameter (e.g. weight_v); this function recomputes the plain weight parameter and removes the reparameterization hook.
Parameters
layer (Layer) - The layer from which to remove weight normalization.
name (str, optional) - Name of the weight parameter. Default: weight. A usage sketch with an explicit name is given after the code example below.
Returns
Layer, the layer with the weight normalization hook removed.
Code Example
import paddle
from paddle.nn import Conv2D
from paddle.nn.utils import weight_norm, remove_weight_norm

conv = Conv2D(3, 5, 3)
wn = weight_norm(conv)
print(conv.weight_g)
# Parameter containing:
# Tensor(shape=[5], dtype=float32, place=Place(gpu:0), stop_gradient=False,
#        [0., 0., 0., 0., 0.])

# remove_weight_norm returns the layer itself, with the hook removed.
print(remove_weight_norm(conv))
# Conv2D(3, 5, kernel_size=[3, 3], data_format=NCHW)

# The following is the effect after removing the weight norm:
# print(conv.weight_g)
# AttributeError: 'Conv2D' object has no attribute 'weight_g'
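
The sketch below illustrates passing the name argument explicitly; it assumes a paddle.nn.Linear layer and the weight_norm/remove_weight_norm pair shown above, and the printed values are illustrative rather than captured output.

import paddle
from paddle.nn import Linear
from paddle.nn.utils import weight_norm, remove_weight_norm

# Apply weight normalization to the parameter named 'weight';
# this creates the reparameterized parameters weight_g and weight_v.
linear = Linear(4, 8)
weight_norm(linear, name='weight')
print(hasattr(linear, 'weight_g'), hasattr(linear, 'weight_v'))
# True True

# Pass the same name so the hook registered on 'weight' is found and removed.
remove_weight_norm(linear, name='weight')
print(hasattr(linear, 'weight_g'))
# False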