
NSDP

Paper | arXiv | Video | Project Page

This is the repository that contains source code for the paper:

Neural Shape Deformation Priors (NeurIPS 2022 Spotlight).

  • For the task of shape manipulation, NSDP learns shape deformations via canonicalization.
  • We propose Transformer-based Deformation Networks (TDNets) for local deformation field prediction.
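
  Concretely, the two points above describe a two-stage pipeline: the source shape is first deformed backward into a learned canonical pose, and then deformed forward into the target pose implied by sparse handle displacements. The sketch below is purely illustrative; all names (backward_net, forward_net, source_points, ...) are hypothetical and do not correspond to modules in this repository.

    # Illustrative sketch of deformation via canonicalization (hypothetical names).
    import torch

    def deform(source_points: torch.Tensor,          # (N, 3) source surface points
               handles: torch.Tensor,                # (H, 3) user-selected handle locations
               handle_displacements: torch.Tensor,   # (H, 3) target offsets of the handles
               backward_net, forward_net) -> torch.Tensor:
        # Stage 1: the backward deformation field maps the source pose to the canonical pose.
        canonical_points = backward_net(source_points, handles, handle_displacements)
        # Stage 2: the forward deformation field maps the canonical pose to the target pose.
        target_points = forward_net(canonical_points, handles, handle_displacements)
        return target_points
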
  • Install all dependencies

    • Download the latest conda here.

    • To create a conda environment with all the required packages, run the following command:

    conda env create -f environment.yml
    

    The above command creates a conda environment with the name nsdp.

    • Compile the external dependencies inside the external directory by executing:
    conda activate nsdp
    ./build_external.sh
    

    The external dependencies are PyMarchingCubes, gaps and Eigen.

    • NSDP uses farthest point sampling (FPS) to downsample the input. Run
    pip install pointnet2_ops_lib/.
    

    in order to install the CUDA implementation of FPS. Credits for this go to Erik Wijmans's GitHub repository, from which the code was copied for convenience.
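
    As a quick check that the extension built correctly, the snippet below runs FPS on a random point cloud. It assumes a CUDA-capable GPU and uses the pointnet2_ops API (furthest_point_sample / gather_operation) from that repository; if your installed version differs, adjust the calls accordingly.

    import torch
    from pointnet2_ops import pointnet2_utils

    # A random batch of point clouds: (batch, num_points, 3), float32, on the GPU.
    xyz = torch.rand(2, 8192, 3, device="cuda")

    # Indices of the 1024 farthest points per cloud: (batch, 1024).
    idx = pointnet2_utils.furthest_point_sample(xyz, 1024)

    # Gather the sampled coordinates; gather_operation expects (batch, 3, num_points).
    sampled = pointnet2_utils.gather_operation(xyz.transpose(1, 2).contiguous(), idx)
    print(sampled.shape)  # torch.Size([2, 3, 1024])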

    Data Preparation

    In our paper, we mainly use the DeformingThings4D dataset. First download the raw dataset, and then convert the .anime files to mesh .obj files by running the script:

    cd ./preprocess
    bash convert_deform4d_anime_to_mesh.sh
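
    If you prefer to convert a single .anime file in Python rather than through the shell script, the sketch below follows the binary layout documented for DeformingThings4D (frame, vertex, and triangle counts as int32, then vertices, triangle indices, and per-frame vertex offsets). Treat the layout and the example file name as assumptions, not as part of this codebase.

    import numpy as np

    def read_anime(path):
        # Parse a DeformingThings4D .anime file (layout as documented upstream).
        with open(path, "rb") as f:
            nf = int(np.fromfile(f, dtype=np.int32, count=1)[0])  # number of frames
            nv = int(np.fromfile(f, dtype=np.int32, count=1)[0])  # number of vertices
            nt = int(np.fromfile(f, dtype=np.int32, count=1)[0])  # number of triangles
            verts = np.fromfile(f, dtype=np.float32, count=nv * 3).reshape(nv, 3)
            faces = np.fromfile(f, dtype=np.int32, count=nt * 3).reshape(nt, 3)
            # Per-vertex offsets of frames 1..nf-1, relative to the first frame.
            offsets = np.fromfile(f, dtype=np.float32, count=(nf - 1) * nv * 3)
        return verts, faces, offsets.reshape(nf - 1, nv, 3)

    verts, faces, offsets = read_anime("bear_attack.anime")  # hypothetical file name
    frame_5 = verts + offsets[4]  # all frames share the same faces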
    

    We normalize the meshes, sample surface point cloud trajectories (i.e. point clouds with one-to-one correspondences), and sample spatial point trajectories in 3D space (i.e. spatial points with one-to-one correspondences). To do so, run the script:

    bash preprocess_deform4d_seq.sh
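
    The "trajectories" are obtained by sampling surface points once (a face index plus barycentric coordinates per point) and reusing the same sample locations in every frame, so one-to-one correspondences come for free. Below is a minimal sketch with trimesh under the assumption that all meshes of a sequence share connectivity; the helper name is hypothetical and not what preprocess_deform4d_seq.sh actually calls.

    import numpy as np
    import trimesh

    def sample_surface_trajectories(meshes, num_points=4096, seed=0):
        # Sample corresponding surface points across a sequence of same-topology meshes.
        rng = np.random.default_rng(seed)
        ref = meshes[0]
        # Choose faces proportionally to their area, then barycentric weights per face.
        face_idx = rng.choice(len(ref.faces), size=num_points, p=ref.area_faces / ref.area)
        bary = rng.dirichlet(np.ones(3), size=num_points)         # (num_points, 3)

        frames = []
        for mesh in meshes:
            tri = mesh.vertices[mesh.faces[face_idx]]             # (num_points, 3, 3)
            frames.append(np.einsum("pi,pij->pj", bary, tri))     # barycentric interpolation
        return np.stack(frames)                                   # (num_frames, num_points, 3)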
    

    We also use the animations of unseen identities from the Deformation Transfer dataset as a test set. To prepare the processed dataset, you can run:

    bash preprocess_deformtransfer_seq.sh
    

    To evaluate with the user-specified handles used in the interactive editing applications, we use meshes from TOSCA and a dog reconstructed with the method of BARC. You can directly download the TOSCA_animal and dog_barc_recon meshes we prepared, and then run the scripts below to obtain the normalized meshes:

    bash preprocess_nocorr_tosca.sh
    bash preprocess_nocorr_dogrec.sh
    

    Pretrained Models

    We provide the pretrained models of the forward and backward deformation networks, as well as the pretrained weights of the whole model after end-to-end fine-tuning. Unzip the downloaded archive:

    unzip pretrained.zip
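
    To sanity-check a downloaded checkpoint before pointing a config at it, you can inspect it with torch.load; the file name below is only a guess at the archive contents, so replace it with an actual path after unzipping.

    import torch

    ckpt = torch.load("pretrained/forward.pt", map_location="cpu")  # hypothetical path
    if isinstance(ckpt, dict):
        print(list(ckpt.keys())[:10])  # e.g. a state_dict or a wrapper with metadata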
    

    Training from Scratch

    The training is composed of two stages. In the first stage, we train the forward and backward deformation networks separately using the scripts:

    python train.py config/deform4d/forward.yaml --with_wandb_logger
    python train.py config/deform4d/backward.yaml --with_wandb_logger
    

    In the second stage, we train the forward and backward deformation networks together to learn shape deformations between two arbitrary non-rigid poses. The pretrained forward/backward deformation models need to be loaded by modifying config['training']['weight_forward_file'] and config['training']['weight_backward_file'] in config/deform4d/arbitrary.yaml; a small helper for this is sketched below.
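
    The snippet below sets those two keys programmatically; the key names come from the paragraph above, while the checkpoint paths are placeholders that should point at your own stage-one outputs (or the provided pretrained files).

    import yaml

    cfg_path = "config/deform4d/arbitrary.yaml"
    with open(cfg_path) as f:
        cfg = yaml.safe_load(f)

    # Placeholder paths; replace with the actual stage-one checkpoints.
    cfg["training"]["weight_forward_file"] = "out/deform4d/forward/model_best.pt"
    cfg["training"]["weight_backward_file"] = "out/deform4d/backward/model_best.pt"

    with open(cfg_path, "w") as f:
        yaml.safe_dump(cfg, f)

    Then launch the stage-two training: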

    python train.py config/deform4d/arbitrary.yaml --with_wandb_logger
    

    Evaluation

    To evaluate the pretrained model on unseen motions (S1) and unseen identities (S2) of DeformingThings4D, as well as the unseen identities used in Deformation Transfer, you can run:

    python test.py config/deform4d/arbitrary.yaml
    python test.py config/deform4d/arbitrary_unseen_iden.yaml
    python test.py config/deform4d/arbitrary_unseen_iden.yaml
    

    Then, you will get both quantitative and qualitative results.

    User-specified Handle Displacements

    To evaluate our approach on user-specified handles of unseen identities, you can run:

    python run.py config/tosca/head.yaml
    python run.py config/tosca/tail.yaml
    python run.py config/tosca/behindrightfoot.yaml
    python run.py config/tosca/frontleftfoot.yaml
    python run.py config/dogrec/head.yaml
    python run.py config/dogrec/tail.yaml
    python run.py config/dogrec/behindrightfoot.yaml
    python run.py config/dogrec/frontleftfoot.yaml
    

    If you find NSDP useful for your work, please cite:

    @inproceedings{tang2022neural,
        title={Neural Shape Deformation Priors},
        author={Tang, Jiapeng and Markhasin, Lev and Wang, Bi and Thies, Justus and Nie{\ss}ner, Matthias},
        booktitle={Advances in Neural Information Processing Systems},
        year={2022},
    }
    

    Contact Jiapeng Tang for questions, comments, and bug reports.