
HDR-Plenoxels (ECCV 2022)

This repository is the official implementation of the ECCV 2022 paper, HDR-Plenoxels: Self-Calibrating High Dynamic Range Radiance Fields.

HDR-Plenoxels is an end-to-end method for learning HDR radiance fields from only LDR images taken with varying camera settings, without additional camera information (e.g., exposure values). We design the tone-mapping module based on a physical camera pipeline, and we also release a multi-view dataset captured under varying camera conditions.

(Demo video: HDR-Plenoxels.mov)

Setup

This code is based on the official Plenoxels implementation, so the setup follows the Plenoxels repository (summarized below).

First create the virtualenv; we recommend using conda:

conda env create -f environment.yml
conda activate plenoxel
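
As an optional sanity check (assuming the environment installs PyTorch with CUDA support), you can verify that PyTorch sees your GPU before building the extension:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"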

Then clone the repo and install the library at the root (svox2), which includes a CUDA extension.

If your CUDA toolkit is older than 11, then you will need to install CUB as follows: conda install -c bottler nvidiacub. Since CUDA 11, CUB is shipped with the toolkit.
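
If you are unsure which toolkit you have, check the version first; a minimal sketch (assumes nvcc is on your PATH):

# Print the CUDA toolkit release; install CUB only if it is below 11.0
nvcc --version
conda install -c bottler nvidiacub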

To install the main library, simply run the following in the repo root directory:

pip install .
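
To confirm that the svox2 library and its CUDA extension installed correctly, a plain import check is usually enough (the printed path is only informational):

python -c "import svox2; print(svox2.__file__)"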

Prepare Datasets

We provide our HDR training dataset in LLFF format, and the dataset format will be auto-detected.

Please get the synthetic and real LLFF datasets from this link.
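
For reference, a standard LLFF-format scene directory looks roughly like the sketch below (scene and file names are placeholders; the exact contents of our HDR dataset may differ slightly):

<data_root>/
└── <scene_name>/
    ├── images/              # multi-view LDR images with varying exposure / white balance
    │   ├── 0000.png
    │   └── ...
    └── poses_bounds.npy     # camera poses and depth bounds in the LLFF convention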

Voxel Optimization (Training)

For training a single scene, see opt/hdr_opt.py.

You can train on both our synthetic and real HDR datasets. Inside opt/, run the shell scripts below.

To train on the synthetic dataset, run the following shell scripts.

# Plenoxels + static dataset
./shell/syn/train_mid.sh

# Plenoxels + varying dataset
./shell/syn/train_mix.sh

# HDR-Plenoxels + varying dataset
./shell/syn/train_tone.sh

To train on the real dataset, run the following shell scripts.

# Plenoxels + static dataset
./shell/real/train_mid.sh

# Plenoxels + varying dataset
./shell/real/train_mix.sh

# HDR-Plenoxels + varying dataset
./shell/real/train_tone.sh

We do not provide pretrained checkpoints.
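
The shell scripts encode the data paths and configs we used. If you need to adapt a run (e.g., to your own data location or a specific GPU), inspecting and re-running a script is the simplest route; a minimal sketch (the script contents and any forwarded flags are assumptions, so check the scripts and opt/hdr_opt.py for the actual options):

cd opt
cat shell/syn/train_tone.sh                       # see which data path and config the script uses
CUDA_VISIBLE_DEVICES=0 ./shell/syn/train_tone.sh  # pin the run to a single GPU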

Evaluation

  • Use opt/shell/render/render_hdr.sh for rendering HDR radiance fields.

  • Use opt/shell/render/render_ldr.sh for rendering LDR radiance fields, which are the final output.

  • Saving all frames is very slow; add --no_imsave to skip it (see the example below).
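
For example (whether the wrapper scripts forward extra arguments such as --no_imsave is an assumption; if not, append the flag to the python command inside the script):

cd opt
./shell/render/render_hdr.sh              # render the HDR radiance field
./shell/render/render_ldr.sh --no_imsave  # final LDR renders, skipping per-frame image saving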

Metric

Inside opt/, run

CUDA_VISIBLE_DEVICES=0 python hdr_calc_metrics.py

  • This reports PSNR, SSIM, and LPIPS scores on the right-half novel views.
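
To keep a copy of the scores alongside the console output (metrics.txt is an arbitrary file name):

CUDA_VISIBLE_DEVICES=0 python hdr_calc_metrics.py | tee metrics.txt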

Citation

@inproceedings{jun2022hdr,
    title     = {HDR-Plenoxels: Self-Calibrating High Dynamic Range Radiance Fields},
    author    = {Jun-Seong, Kim and Yu-Ji, Kim and Ye-Bin, Moon and Oh, Tae-Hyun},
    booktitle = {ECCV},
    year      = {2022},
}
