
Denoising Task Routing for Diffusion Models

This repository contains the official PyTorch implementation of the ICLR 2024 paper "Denoising Task Routing for Diffusion Models". In this repository, we release code for an improved version of DiT with task routing. To gain a better understanding of the MTL (multi-task learning) perspective on diffusion models, please visit the project page of ANT.

DTR creates different pathways for each denoising task (step).

DTR is:

💡 Simple yet Effective (significant performance gains!)

💡 Boosts convergence speed

💡 No additional parameters

💡 Plug-and-Play (Easily integrates into any diffusion architecture)

💡 Minimal Code (implemented in roughly 10 lines; see the sketch below)
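
The routing itself is cheap to express. Below is a minimal, illustrative sketch of the idea, assuming a sliding-window channel mask per timestep; `make_routing_mask` is a hypothetical helper for illustration, not this repository's API, and the exact mask construction in the paper may differ.

```python
import torch

def make_routing_mask(t: int, num_timesteps: int, channels: int,
                      sharing_ratio: float = 0.8) -> torch.Tensor:
    """Illustrative per-task channel mask (hypothetical helper, not the repo's API).

    Activates a contiguous window of roughly `sharing_ratio * channels` channels
    whose start offset slides with the timestep, so nearby denoising tasks share
    most channels while distant ones diverge. No learnable parameters are added.
    """
    active = max(1, round(sharing_ratio * channels))
    start = round((channels - active) * t / max(1, num_timesteps - 1))
    mask = torch.zeros(channels)
    mask[start:start + active] = 1.0
    return mask

# Inside a block's forward pass, such a mask would simply gate activations:
# h = h * make_routing_mask(t, T, h.shape[-1], sharing_ratio).to(h.device)
```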


Generated sample (golden retriever) from 256×256 DiT-L/2 + DTR (cfg = 2.0).


Updates

  • 2024.02.16: Upload Project Page.
  • 2023.12.26: Initial Release.

Todo

  • Project Pages
  • Upload checkpoints

Setup

This codebase uses PyTorch; configuration is managed with Hydra.

We use eight 80GB A100 GPUs for all experiments.

python3 -m pip install -r requirements.txt

Training DiT with DTR

We provide an example training script for ImageNet.

torchrun --nnodes=1 --nproc_per_node=8 train.py general.data_path='<PATH_TO_DATASET>'

You can also modify the DiT model, optimization type, sharing ratio, etc.

torchrun --nnodes=1 --nproc_per_node=8 train.py \
general.data_path='<PATH_TO_DATASET>' \
general.loss_weight_type="uw" \
models.name="DiT-L/2" \
models.routing.sharing_ratio=0.8
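
The overrides above use Hydra's dotted syntax. Below is a minimal sketch of how such overrides compose; the `configs` directory and top-level `config` name are assumptions for illustration, not necessarily this repository's layout.

```python
from hydra import compose, initialize

# Illustrative only: config_path and config_name are assumed,
# not necessarily this repository's actual config layout.
with initialize(version_base=None, config_path="configs"):
    cfg = compose(
        config_name="config",
        overrides=[
            "general.data_path=<PATH_TO_DATASET>",
            "general.loss_weight_type=uw",
            "models.name=DiT-L/2",
            "models.routing.sharing_ratio=0.8",
        ],
    )
print(cfg.models.routing.sharing_ratio)  # 0.8
```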

Sampling DiT with DTR

After training, checkpoints and log files are saved according to the configuration, so you must run the sampling script with the same configuration used for training. You can additionally adjust the number of sampled images and the classifier-free guidance scale.

torchrun --nnodes=1 --nproc_per_node=8 sample_ddp.py \
general.loss_weight_type="uw \
models.name="DiT-L/2" \
models.routing.sharing_ratio=0.8 \
eval.cfg_scale=1.5 \
eval.num_fid_samples=50000
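
For reference, `eval.cfg_scale` follows the standard classifier-free guidance formulation. The sketch below shows the usual combination rule; it is illustrative, not the repository's exact sampling code.

```python
import torch

def apply_cfg(eps_cond: torch.Tensor, eps_uncond: torch.Tensor,
              cfg_scale: float) -> torch.Tensor:
    # Standard classifier-free guidance: push the conditional prediction
    # away from the unconditional one by the guidance scale.
    return eps_uncond + cfg_scale * (eps_cond - eps_uncond)

# e.g. with eval.cfg_scale=1.5:
# eps = apply_cfg(eps_cond, eps_uncond, cfg_scale=1.5)
```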

Please refer to the example scripts for detailed instructions on how to reproduce our results. In these scripts, we enumerate the configurations that can be modified as needed.

Results

With DiT-L/2, our DTR is compatible with MTL optimization techniques specifically designed for diffusion models.

Quantitative Results (guidance scale = 1.5)

| Optimization (+ DTR) | FID-50K | Inception Score | Precision | Recall |
|----------------------|---------|-----------------|-----------|--------|
| Vanilla              | 12.59   | 134.60          | 0.73      | 0.49   |
| Vanilla + DTR        | 8.90    | 156.48          | 0.77      | 0.51   |
| Min-SNR              | 9.58    | 179.98          | 0.78      | 0.47   |
| Min-SNR + DTR        | 8.24    | 186.02          | 0.79      | 0.50   |
| ANT-UW               | 5.85    | 206.68          | 0.84      | 0.46   |
| ANT-UW + DTR         | 4.61    | 208.76          | 0.84      | 0.48   |

Results w.r.t. guidance scale


BibTeX

@article{park2023denoising,
  title={Denoising Task Routing for Diffusion Models},
  author={Park, Byeongjun and Woo, Sangmin and Go, Hyojun and Kim, Jin-Young and Kim, Changick},
  journal={arXiv preprint arXiv:2310.07138},
  year={2023}
}

Acknowledgments

This codebase borrows most notably from DiT and ANT.
