Zehao Yu · Torsten Sattler · Andreas Geiger
Paper | arXiv | Project Page
Gaussian Opacity Fields (GOF) enables geometry extraction directly from 3D Gaussians by identifying the level set of the opacity field. Our regularization improves surface reconstruction, and we use Marching Tetrahedra for adaptive and compact mesh extraction.
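For intuition, the sketch below shows the core Marching Tetrahedra step on a single tetrahedron: the opacity field is evaluated at the four vertices, and surface points are linearly interpolated along edges where the values cross a level. This is an illustrative toy, not our implementation: the `opacity_at` field, the 0.5 level value, and the tensor layout are all placeholders.

```python
import torch

LEVEL = 0.5  # illustrative level-set value, not necessarily the one used by GOF

def opacity_at(points: torch.Tensor) -> torch.Tensor:
    # Placeholder opacity field: a smooth blob centered at the origin.
    return torch.exp(-points.norm(dim=-1) ** 2)

# The 6 edges of a tetrahedron, as index pairs into its 4 vertices.
EDGES = torch.tensor([[0, 1], [0, 2], [0, 3], [1, 2], [1, 3], [2, 3]])

def level_crossings(tet_vertices: torch.Tensor) -> torch.Tensor:
    """Return interpolated surface points on edges where the field crosses LEVEL."""
    values = opacity_at(tet_vertices)            # (4,) opacity at the vertices
    a, b = EDGES[:, 0], EDGES[:, 1]
    va, vb = values[a], values[b]
    crossing = (va - LEVEL) * (vb - LEVEL) < 0   # sign change along the edge
    # Linear interpolation weight of the crossing point on each crossing edge.
    t = (LEVEL - va[crossing]) / (vb[crossing] - va[crossing])
    pa, pb = tet_vertices[a[crossing]], tet_vertices[b[crossing]]
    return pa + t.unsqueeze(-1) * (pb - pa)

tet = torch.tensor([[0.0, 0.0, 0.0],
                    [1.5, 0.0, 0.0],
                    [0.0, 1.5, 0.0],
                    [0.0, 0.0, 1.5]])
print(level_crossings(tet))  # three points near where opacity == LEVEL
```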
- [2024.06.10]: 🔥 Improved training speed by 2x with merged operations. The 6 scenes in the TNT dataset can be trained in ~24 mins and the bicycle scene in the Mip-NeRF 360 dataset in ~45 mins. Please pull the latest code and reinstall with `pip install submodules/diff-gaussian-rasterization` to use it.
Clone the repository and create an anaconda environment using
```shell
git clone git@github.com:autonomousvision/gaussian-opacity-fields.git
cd gaussian-opacity-fields

conda create -y -n gof python=3.8
conda activate gof

pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 -f https://download.pytorch.org/whl/torch_stable.html
conda install cudatoolkit-dev=11.3 -c conda-forge

pip install -r requirements.txt

pip install submodules/diff-gaussian-rasterization
pip install submodules/simple-knn/

# tetra-nerf for triangulation
cd submodules/tetra-triangulation
conda install cmake
conda install conda-forge::gmp
conda install conda-forge::cgal
cmake .
# you can specify your own cuda path
# export CPATH=/usr/local/cuda-11.3/targets/x86_64-linux/include:$CPATH
make
pip install -e .
```
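After installation, you can sanity-check the environment with a short snippet. This is a minimal sketch; it assumes the rasterizer extension module is named `diff_gaussian_rasterization` after the submodule installed above.

```python
import torch

# CUDA must be visible, since the rasterizer and triangulation are CUDA extensions.
assert torch.cuda.is_available(), "CUDA is not available"
print(torch.__version__, torch.version.cuda)  # expect 1.12.1 / 11.3

# Module name assumed from the submodule installed above.
import diff_gaussian_rasterization  # noqa: F401
print("diff-gaussian-rasterization imported successfully")
```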
Please download the Mip-NeRF 360 dataset from the official website, the NeRF-Synthetic dataset from the NeRF's official Google Drive, the preprocessed DTU dataset from 2DGS, and the preprocessed Tanks and Temples dataset from here. To evaluate the geometry reconstruction, you need to download the ground truth point clouds from the DTU dataset and save them to `dtu_eval/Offical_DTU_Dataset`. For the Tanks and Temples dataset, you need to download the ground truth point clouds, alignments, and cropfiles and save them to `eval_tnt/TrainingSet`, e.g. `eval_tnt/TrainingSet/Caterpillar/Caterpillar.ply`.
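Before running the evaluations, it can help to verify that the ground-truth files are in the expected locations. A minimal sketch using the paths above:

```python
from pathlib import Path

# Paths taken from the dataset instructions above.
assert Path("dtu_eval/Offical_DTU_Dataset").is_dir(), "DTU ground truth missing"
assert Path("eval_tnt/TrainingSet/Caterpillar/Caterpillar.ply").is_file(), "TNT ground truth missing"
print("evaluation ground truth found")
```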
Run the full training and evaluation with:

```shell
# you might need to update the data path in the script accordingly

# NeRF-synthetic dataset
python scripts/run_nerf_synthetic.py

# Mip-NeRF 360 dataset
python scripts/run_mipnerf360.py

# Tanks and Temples dataset
python scripts/run_tnt.py

# DTU dataset
python scripts/run_dtu.py
```
We use the same data format as 3DGS; please follow the instructions here to prepare your dataset. Then you can train your model and extract a mesh (we use the Tanks and Temples dataset as an example):
```shell
# training
# -r 2 for using downsampled images with factor 2
# --use_decoupled_appearance to enable decoupled appearance modeling if your images have changing lighting conditions
python train.py -s TNT_GOF/TrainingSet/Caterpillar -m exp_TNT/Caterpillar -r 2 --use_decoupled_appearance

# extract the mesh after training
python extract_mesh.py -m exp_TNT/Caterpillar --iteration 30000

# you can open the extracted mesh with MeshLab or use the following script based on open3d
python mesh_viewer.py exp_TNT/Caterpillar/test/ours_30000/fusion/mesh_binary_search_7.ply
```
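If you want to adapt the viewer, a few lines of open3d are enough to load and display the extracted mesh. This is a minimal sketch of such a viewer; `mesh_viewer.py` itself may do more.

```python
import open3d as o3d

# Load the extracted mesh and display it; normals are needed for shaded rendering.
mesh = o3d.io.read_triangle_mesh(
    "exp_TNT/Caterpillar/test/ours_30000/fusion/mesh_binary_search_7.ply")
mesh.compute_vertex_normals()
o3d.visualization.draw_geometries([mesh])
```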
This project is built upon 3DGS and Mip-Splatting. Regularizations and some visualizations are taken from 2DGS. Tetrahedra triangulation is taken from Tetra-NeRF. Marching Tetrahedra is adapted from the Kaolin Library. Evaluation scripts for the DTU and Tanks and Temples datasets are taken from DTUeval-python and TanksAndTemples, respectively. We thank all the authors for their great work and repos.
If you find our code or paper useful, please cite
```bibtex
@article{Yu2024GOF,
  author  = {Yu, Zehao and Sattler, Torsten and Geiger, Andreas},
  title   = {Gaussian Opacity Fields: Efficient High-quality Compact Surface Reconstruction in Unbounded Scenes},
  journal = {arXiv:2404.10772},
  year    = {2024},
}
```
If you find the regularizations useful, please kindly cite
```bibtex
@inproceedings{Huang2DGS2024,
  title     = {2D Gaussian Splatting for Geometrically Accurate Radiance Fields},
  author    = {Huang, Binbin and Yu, Zehao and Chen, Anpei and Geiger, Andreas and Gao, Shenghua},
  publisher = {Association for Computing Machinery},
  booktitle = {SIGGRAPH 2024 Conference Papers},
  year      = {2024},
  doi       = {10.1145/3641519.3657428}
}
```