
CosineAnnealingLR

class torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max, eta_min=0.0, last_epoch=-1)[source]

Set the learning rate of each parameter group using a cosine annealing schedule.

The learning rate is updated recursively using

\eta_{t+1} = \eta_{\min} + (\eta_t - \eta_{\min}) \cdot \frac{1 + \cos\left(\frac{(T_{cur}+1)\pi}{T_{max}}\right)}{1 + \cos\left(\frac{T_{cur}\pi}{T_{max}}\right)}

This implements a recursive approximation of the closed-form schedule proposed in SGDR: Stochastic Gradient Descent with Warm Restarts

\eta_t = \eta_{\min} + \frac{1}{2}(\eta_{\max} - \eta_{\min})\left(1 + \cos\left(\frac{T_{cur}\pi}{T_{max}}\right)\right)

where

  • \eta_t is the learning rate at step t

  • T_{cur} is the number of epochs since the last restart

  • T_{max} is the maximum number of epochs in a cycle

Note

Although SGDR includes periodic restarts, this implementation performs cosine annealing without restarts, so T_{cur} = t and increases monotonically with each call to step().
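To see how the recursive update tracks the closed form, the following sketch prints both values side by side. The toy model, base rate eta_max=0.1, eta_min=0.001, and T_max=10 are illustrative assumptions, not defaults of the API:

>>> import math
>>> import torch
>>> from torch.optim import SGD
>>> from torch.optim.lr_scheduler import CosineAnnealingLR
>>> model = torch.nn.Linear(2, 2)  # toy model; the layer shape is illustrative
>>> eta_max, eta_min, T_max = 0.1, 0.001, 10  # assumed hyperparameters
>>> optimizer = SGD(model.parameters(), lr=eta_max)
>>> scheduler = CosineAnnealingLR(optimizer, T_max=T_max, eta_min=eta_min)
>>> for t in range(T_max + 1):
...     # closed-form SGDR value with T_cur = t
...     closed = eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(t * math.pi / T_max))
...     print(f"t={t:2d}  scheduler={scheduler.get_last_lr()[0]:.6f}  closed-form={closed:.6f}")
...     optimizer.step()
...     scheduler.step()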

Parameters
  • optimizer (Optimizer) – Wrapped optimizer.

  • T_max (int) – Maximum number of iterations.

  • eta_min (float) – Minimum learning rate. Default: 0.

  • last_epoch (int) – The index of the last epoch. Default: -1.

Example

>>> num_epochs = 100
>>> scheduler = CosineAnnealingLR(optimizer, T_max=num_epochs)
>>> for epoch in range(num_epochs):
...     train(...)
...     validate(...)
...     scheduler.step()
[Figure: learning-rate curve produced by CosineAnnealingLR]
get_last_lr()[source]

Return the last learning rate computed by the current scheduler.

Return type

list[float]
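A common use of get_last_lr() is logging the current rate during training; a minimal sketch, reusing the optimizer and scheduler set up in the example above:

>>> for epoch in range(3):
...     optimizer.step()
...     scheduler.step()
...     print(f"epoch {epoch}: lr = {scheduler.get_last_lr()[0]:.6f}")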

get_lr()[source]

Retrieve the learning rate of each parameter group.

Return type

list[float]

load_state_dict(state_dict)[source]

Load the scheduler's state.

Parameters

state_dict (dict) – scheduler state. Should be an object returned from a call to state_dict().

state_dict()[source]

Return the state of the scheduler as a dict.

It contains an entry for every variable in self.__dict__ which is not the optimizer.

Return type

dict[str, Any]
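Together, state_dict() and load_state_dict() support checkpointing the schedule; a minimal sketch, where the file name is an illustrative placeholder:

>>> torch.save({"scheduler": scheduler.state_dict()}, "ckpt.pt")  # file name is illustrative
>>> ckpt = torch.load("ckpt.pt")
>>> scheduler.load_state_dict(ckpt["scheduler"])  # resumes the schedule where it left off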

step(epoch=None)[source]

Perform a step.
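step() can also be called once per batch rather than once per epoch; in that case T_max should count total iterations rather than epochs. A sketch under that assumption, where dataloader and train_step are hypothetical placeholders:

>>> steps_per_epoch = len(dataloader)  # dataloader is a placeholder
>>> scheduler = CosineAnnealingLR(optimizer, T_max=num_epochs * steps_per_epoch)
>>> for epoch in range(num_epochs):
...     for batch in dataloader:
...         train_step(batch)  # placeholder: forward/backward pass
...         optimizer.step()
...         scheduler.step()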