The official implementation of "Locally Hierarchical Auto-Regressive Modeling for Image Generation"
- Tackgeun You, Saehoon Kim, Chiheon Kim, Doyup Lee, Bohyung Han (NeurIPS 2022)
We have tested our code in the following environment:
- Python 3.7.10
- PyTorch 1.10.0
- torchvision 0.10.0
- CUDA 11.3
- Ubuntu 18.04
Please run the following command to install the necessary dependencies:
pip install -r requirements.txt
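After installation, an optional sanity check (a suggestion, not part of the repository) can confirm that the installed versions match the tested environment above:

```python
# Optional sanity check: print installed versions to compare against the
# tested environment (Python 3.7.10, PyTorch 1.10.0, torchvision 0.10.0, CUDA 11.3).
import sys
import torch
import torchvision

print("Python        :", sys.version.split()[0])
print("PyTorch       :", torch.__version__)
print("torchvision   :", torchvision.__version__)
print("CUDA (build)  :", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
```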
- Implementation of HQ-VAE and HQ-Transformer
- Pretrained checkpoints of HQ-VAE and HQ-Transformer
- Training pipeline of HQ-VAE
- Image generation and evaluation pipeline for HQ-VAE and HQ-Transformer
Refer to the Jupyter notebook script.
Experiment commands and configurations are described in the experiment commands document. We provide pretrained checkpoints of HQ-VAE and HQ-Transformer to reproduce the main results in the paper.
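As a minimal sketch of inspecting a downloaded checkpoint before use (the file path below is a placeholder, not an actual path in this repository; the Jupyter notebook above shows the actual loading and generation code):

```python
# Minimal sketch: peek inside a pretrained checkpoint before building the model.
# NOTE: the path below is a placeholder; point it at a downloaded HQ-VAE or
# HQ-Transformer checkpoint.
import torch

state = torch.load("path/to/checkpoint.pt", map_location="cpu")

# Checkpoints saved with torch.save are usually dictionaries; listing the
# top-level keys shows whether the weights sit under e.g. "state_dict".
if isinstance(state, dict):
    print(list(state.keys())[:10])
else:
    print(type(state))
```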
@inproceedings{you2022hqtransformer,
  title={Locally Hierarchical Auto-Regressive Modeling for Image Generation},
  author={You, Tackgeun and Kim, Saehoon and Kim, Chiheon and Lee, Doyup and Han, Bohyung},
  booktitle={Proceedings of the International Conference on Neural Information Processing Systems},
  year={2022}
}
- MIT License.
Our implementation is based on rq-vae-transformer and minDALL-E. Our transformer-related implementation is inspired by minGPT. We appreciate the authors of VQGAN for making their code available to the public.