cosine_decay
- paddle.fluid.layers.learning_rate_scheduler.cosine_decay(learning_rate, step_each_epoch, epochs) [source]
-
Applies cosine decay to the learning rate.
When training a model, it is often recommended to lower the learning rate as training progresses. By using this function, the learning rate will be decayed following a cosine decay strategy.
\[decayed\_lr = learning\_rate * 0.5 * \left(\cos\left(epoch * \frac{\pi}{epochs}\right) + 1\right)\]
- Parameters
-
learning_rate (Variable|float) – The initial learning rate.
step_each_epoch (int) – The number of steps in an epoch.
epochs (int) – The number of epochs.
- Returns
-
The decayed learning rate.
- Return type
-
Variable
Examples
import paddle.fluid as fluid

base_lr = 0.1
lr = fluid.layers.cosine_decay(
    learning_rate=base_lr, step_each_epoch=10000, epochs=120)
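To make the decay formula above concrete, here is a minimal pure-Python sketch (illustrative only, not the fluid implementation) that derives the current epoch from a global step counter and evaluates the cosine expression:

```python
import math

def cosine_decay_value(learning_rate, step_each_epoch, epochs, global_step):
    """Evaluate the cosine decay formula for a given global step.

    The current epoch index is derived from the global step, then
    plugged into: lr * 0.5 * (cos(epoch * pi / epochs) + 1).
    """
    epoch = global_step // step_each_epoch
    return learning_rate * 0.5 * (math.cos(epoch * math.pi / epochs) + 1)

# With base_lr=0.1 and epochs=120: the rate starts at 0.1 at epoch 0,
# reaches 0.05 at the halfway point (cos(pi/2) = 0), and approaches 0
# near the final epoch.
```

Note that the decay depends only on the epoch index, so the learning rate stays constant within each epoch and drops once per epoch along the cosine curve.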