Code for reproducing the experiments in the paper:
Daulbaev, T., Katrutsa, A., Markeeva, L., Gusak, J., Cichocki, A., & Oseledets, I. (2020). Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs. Advances in Neural Information Processing Systems, 33. [arxiv] [bibtex]
This code is based on the following repositories:
- https://github.com/rtqichen/torchdiffeq
- https://github.com/rtqichen/ffjord
- https://github.com/amirgholami/anode
- https://github.com/juliagusak/neural-ode-norm
Install the package:
python3 setup.py install
To apply IRDM, create an odeint_chebyshev function with the same interface as odeint_adjoint, as follows:
from functools import partial
from interpolated_torchdiffeq import odeint_chebyshev_func

n_nodes = 10  # number of Chebyshev grid points
odeint_chebyshev = partial(odeint_chebyshev_func, n_nodes=n_nodes)
# ... then use odeint_chebyshev wherever you would use odeint from torchdiffeq
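As a rough usage sketch (the ODEFunc module below is a hypothetical right-hand side added for illustration, and the call signature is assumed to match torchdiffeq's odeint_adjoint, as stated above):

import torch
import torch.nn as nn
from functools import partial
from interpolated_torchdiffeq import odeint_chebyshev_func

class ODEFunc(nn.Module):
    # hypothetical ODE right-hand side f(t, y), for illustration only
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Linear(dim, dim)
    def forward(self, t, y):
        return self.net(y)

odeint_chebyshev = partial(odeint_chebyshev_func, n_nodes=10)

func = ODEFunc(dim=2)
y0 = torch.randn(8, 2)              # batch of initial states
t = torch.tensor([0.0, 1.0])        # integration time points
yt = odeint_chebyshev(func, y0, t)  # assumed to mirror odeint_adjoint(func, y0, t)
loss = yt[-1].pow(2).mean()
loss.backward()  # gradients w.r.t. the parameters of func via the interpolated backward pass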
Code for the experiments is located in subfolders of ./experiments. Please see the README files in these subfolders for instructions.
For logging, we use Weights & Biases. Specify --wandb_name to enable wandb logging in any of the scripts.
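For example (train.py is a placeholder for whichever experiment script you run):

python3 train.py --wandb_name my_experiment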
Feel free to ask questions via the authors' emails.