
MultiLR

A method for assigning separate learning rate schedulers to different parameter groups in a model. Pull requests are welcome.

Usage

Pass the optimizer and a list of lambda functions, one per parameter group; each lambda takes the optimizer and returns the scheduler for that group.

scheduler = MultiLR(
    optimizer,
    [lambda opt: torch.optim.lr_scheduler.StepLR(opt, step_size=10, gamma=0.5),
     lambda opt: torch.optim.lr_scheduler.LinearLR(opt, start_factor=0.25, total_iters=10)])
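To make the idea concrete, here is a minimal, hypothetical sketch of how a MultiLR-style wrapper could dispatch one scheduler per parameter group. It is not the repository's actual implementation; the class and attribute names (MultiLR, scheduler_fns, _GroupView) and the stub optimizer/schedulers are illustrative stand-ins so the example runs without torch.

```python
# Hypothetical sketch, not the real kardasbart/MultiLR code.

class _GroupView:
    """Presents a single parameter group as if it were a whole optimizer,
    so an unmodified scheduler adjusts only that group's lr."""
    def __init__(self, optimizer, index):
        self.param_groups = [optimizer.param_groups[index]]

class MultiLR:
    def __init__(self, optimizer, scheduler_fns):
        # One scheduler factory is required per parameter group.
        assert len(scheduler_fns) == len(optimizer.param_groups)
        # Each factory sees a view containing only its own group.
        self.schedulers = [fn(_GroupView(optimizer, i))
                           for i, fn in enumerate(scheduler_fns)]

    def step(self):
        # Advance every per-group scheduler in lockstep.
        for scheduler in self.schedulers:
            scheduler.step()

# --- Demo with stub objects in place of torch ---

class _FakeOptimizer:
    def __init__(self, lrs):
        self.param_groups = [{"lr": lr} for lr in lrs]

class _HalveLR:
    """Stand-in scheduler: halves the lr on every step."""
    def __init__(self, opt):
        self.opt = opt
    def step(self):
        for group in self.opt.param_groups:
            group["lr"] *= 0.5

class _ConstantLR:
    """Stand-in scheduler: leaves the lr unchanged."""
    def __init__(self, opt):
        self.opt = opt
    def step(self):
        pass

opt = _FakeOptimizer([0.1, 0.01])
scheduler = MultiLR(opt, [_HalveLR, _ConstantLR])
scheduler.step()
print([g["lr"] for g in opt.param_groups])  # → [0.05, 0.01]
```

The key design point is that each scheduler mutates the shared group dict through its view, so a single `scheduler.step()` applies every group's own schedule; with real torch schedulers the wrapper would additionally have to satisfy their expectations about the optimizer object it hands them.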
