This repository contains the code for the paper: ActCooLR -- Understanding and Controlling Learning Rates by Tracking Activation Pattern Changes
- Make sure you have PyTorch (1.11.0) installed. This code has been tested with Python 3.9.9.
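If PyTorch is not installed yet, one way to get the tested version is a pinned pip install (a sketch only; whether this exact wheel and its CUDA build are available depends on your platform):
pip install --user torch==1.11.0
python -c "import torch; print(torch.__version__)"   # should print 1.11.0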
- Install the requirements:
pip install --user -r requirements.txt
- Make sure that visdom is running on your machine:
mkdir logs
visdom -env_path ./logs
Go to http://localhost:8097 to open the visdom dashboard.
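To verify that the server is reachable before launching experiments, a quick check against visdom's default port (8097, assumed here) is:
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8097
An HTTP status of 200 indicates the dashboard is up.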
The following sections give all steps required to reproduce the experiments shown in the figures of the paper.
Table Plots
- Run
./exp.sh --gpu 0 -d run table ""
- You can observe the progress using
./exp.sh --stats table ""
- Multiple GPUs can be used in parallel: just start additional processes with
./exp.sh --gpu 1 ...
(see the sketch below).
- Observe training and obtain the data directly from visdom (go to http://localhost:8097).
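A possible way to run the table experiments on two GPUs at once (a sketch only; it assumes exp.sh coordinates the pending runs itself, so that several processes can share the workload):
./exp.sh --gpu 0 -d run table "" &    # first worker, GPU 0
./exp.sh --gpu 1 -d run table "" &    # second worker, GPU 1
wait                                  # block until both background runs finish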
LR Sensitivity Plots
- Run
./exp.sh --gpu 0 -d run paper-plot-lrsensitivity EXPNAME
The experiment names are listed in ./experiments/paper-plot-lrsensitivity.sh.
- You can observe the progress using
./exp.sh --stats paper-plot-lrsensitivity EXPNAME
- Multiple GPUs can be used in parallel: just start additional processes with
./exp.sh --gpu 1 ...
(one process per experiment name; see the sketch below).
- Observe training and obtain the data directly from visdom (go to http://localhost:8097).
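For example, two LR-sensitivity experiments can be run side by side, one per GPU (a sketch only; EXP_A and EXP_B are hypothetical placeholders, take the real names from ./experiments/paper-plot-lrsensitivity.sh):
./exp.sh --gpu 0 -d run paper-plot-lrsensitivity EXP_A &   # replace EXP_A with a real experiment name
./exp.sh --gpu 1 -d run paper-plot-lrsensitivity EXP_B &   # replace EXP_B with a real experiment name
wait                                                       # block until both runs finish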