
How to adjust the learning rate after N epochs?

I am using Hugging Face's Trainer. How do I adjust the learning rate after N epochs? For example, I have the initial learning rate set to lr=2e-6, and I would like to change it to lr=1e-6 after the first epoch and keep it there for the rest of training.

I tried this so far:

from torch.optim import AdamW
from transformers import get_linear_schedule_with_warmup

optimizer = AdamW(model.parameters(),
                  lr = 2e-5,
                  eps = 1e-8
                  )

epochs = 5
batch_number = len(small_train_dataset) / 8   # steps per epoch at batch size 8
total_steps = batch_number * epochs

# linear decay from the initial lr down to 0 over total_steps, no warmup
scheduler = get_linear_schedule_with_warmup(optimizer,
                                            num_warmup_steps = 0,
                                            num_training_steps = total_steps,
                                            last_epoch = -1
                                            )

I know about https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.LambdaLR.html#torch.optim.lr_scheduler.LambdaLR, but that drops the learning rate every epoch, which is not what I want. I want it to drop once after the first epoch and then stay there for the rest of training.
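In other words, the schedule I am after is just a one-time step down (a sketch of the target behavior only, the desired_lr name is just for illustration):

desired_lr = lambda epoch: 2e-6 if epoch == 0 else 1e-6   # drop once after the first epoch, then stay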

Try using a scheduler like this:

scheduler = get_constant_schedule_with_warmup(optimizer, num_warmup_steps = 0)

This will increase your lr from 0 up to the initial_lr specified in your optimizer over num_warmup_steps, after which it stays constant.
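If you need the exact one-time drop described in the question (2e-6 for the first epoch, 1e-6 afterwards), one way is a plain LambdaLR with a step-function multiplier, handed to the Trainer through its optimizers argument. A minimal sketch, assuming model, training_args and small_train_dataset exist as in the question; the Trainer steps the scheduler once per optimizer step, so the multiplier is keyed on steps rather than epochs:

import math
from torch.optim import AdamW
from torch.optim.lr_scheduler import LambdaLR
from transformers import Trainer

optimizer = AdamW(model.parameters(), lr = 2e-6, eps = 1e-8)

# steps per epoch, assuming per_device_train_batch_size = 8 and no gradient accumulation
steps_per_epoch = math.ceil(len(small_train_dataset) / 8)

# multiplier is 1.0 during the first epoch, 0.5 afterwards: 2e-6 -> 1e-6
scheduler = LambdaLR(optimizer,
                     lr_lambda = lambda step: 1.0 if step < steps_per_epoch else 0.5)

trainer = Trainer(model = model,
                  args = training_args,                 # a TrainingArguments instance
                  train_dataset = small_train_dataset,
                  optimizers = (optimizer, scheduler))  # hand both to the Trainer

trainer.train()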
