
PyTorch - How to get learning rate during training?

While training, I'd like to know the value of learning_rate. What should I do?

Here is my code:

my_optimizer = torch.optim.SGD(my_model.parameters(), 
                               lr=0.001, 
                               momentum=0.99, 
                               weight_decay=2e-3)

Thank you.

For a single parameter group, as in the example you've given, you can use this function and call it during training to get the current learning rate:

def get_lr(optimizer):
    # with a single parameter group, this returns that group's current lr
    for param_group in optimizer.param_groups:
        return param_group['lr']
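
For example (a minimal sketch; the model, the dummy loss, and the loop below are only placeholders), you can call it inside your training loop:

import torch

my_model = torch.nn.Linear(10, 1)
my_optimizer = torch.optim.SGD(my_model.parameters(),
                               lr=0.001,
                               momentum=0.99,
                               weight_decay=2e-3)

for epoch in range(3):
    # placeholder training step
    loss = my_model(torch.randn(16, 10)).sum()
    loss.backward()
    my_optimizer.step()
    my_optimizer.zero_grad()
    print(f"epoch {epoch}: lr = {get_lr(my_optimizer)}")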

Alternatively, you may use an lr_scheduler along with your optimizer and simply call the built-in lr_scheduler.get_lr() method.

Here is an example:

my_optimizer = torch.optim.Adam(my_model.parameters(),
                                lr=0.001,
                                weight_decay=0.002)

my_lr_scheduler = torch.optim.lr_scheduler.StepLR(my_optimizer,
                                                   step_size=50,
                                                   gamma=0.1)

# train
...
my_optimizer.step()
my_lr_scheduler.step()

# get learning rate
my_lr = my_lr_scheduler.get_lr()
# or
my_lr = my_lr_scheduler.optimizer.param_groups[0]['lr']

The added benefit of using an lr_scheduler is that it gives you more control over how the learning rate changes over time (step decay, exponential decay, etc.). For the lr_scheduler arguments, refer to the PyTorch docs.
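
As a small illustration (a sketch only; the scheduler, gamma value, and dummy loss are arbitrary choices, not from the original answer), swapping in a different scheduler changes how the lr evolves:

import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)  # lr *= 0.9 each epoch

for epoch in range(5):
    # dummy training step so optimizer.step() precedes scheduler.step()
    loss = model(torch.randn(8, 10)).sum()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()
    print(epoch, scheduler.get_last_lr())  # e.g. [0.09], [0.081], ...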

Use optimizer.param_groups[-1]['lr']
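
For example (a minimal sketch with two hypothetical parameter groups), param_groups is a list, so [-1] reads the learning rate of the last group:

import torch

model = torch.nn.Sequential(torch.nn.Linear(10, 10), torch.nn.Linear(10, 1))
optimizer = torch.optim.SGD([
    {'params': model[0].parameters(), 'lr': 0.01},
    {'params': model[1].parameters(), 'lr': 0.001},
], momentum=0.9)

print(optimizer.param_groups[0]['lr'])   # 0.01  (first group)
print(optimizer.param_groups[-1]['lr'])  # 0.001 (last group)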

As of PyTorch 1.13.0, one can access the learning rate via the method scheduler.get_last_lr(). This method can be found in the schedulers' base class LRScheduler (see their code). It actually returns the attribute scheduler._last_lr of the base class, as Zahra has mentioned, but calling the method should be preferred.
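
For example (a minimal sketch; the model and scheduler settings are just placeholders):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

# ... training loop: optimizer.step() then scheduler.step() ...

print(scheduler.get_last_lr())  # a list with one lr per parameter group, e.g. [0.1]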
