- Follow INSTALL.md to install all required libraries.
- TensorFlow
- Waymo-open-dataset devkit
```bash
conda activate centerformer
pip install waymo-open-dataset-tf-2-6-0==1.4.3
```
```
# For Waymo Dataset
└── WAYMO_DATASET_ROOT
       ├── tfrecord_training
       ├── tfrecord_validation
       ├── tfrecord_testing
```
Convert the tfrecord data to pickle files.
```bash
# train set
CUDA_VISIBLE_DEVICES=-1 python det3d/datasets/waymo/waymo_converter.py --record_path 'WAYMO_DATASET_ROOT/tfrecord_training/*.tfrecord' --root_path 'WAYMO_DATASET_ROOT/train/'
# validation set
CUDA_VISIBLE_DEVICES=-1 python det3d/datasets/waymo/waymo_converter.py --record_path 'WAYMO_DATASET_ROOT/tfrecord_validation/*.tfrecord' --root_path 'WAYMO_DATASET_ROOT/val/'
# testing set
CUDA_VISIBLE_DEVICES=-1 python det3d/datasets/waymo/waymo_converter.py --record_path 'WAYMO_DATASET_ROOT/tfrecord_testing/*.tfrecord' --root_path 'WAYMO_DATASET_ROOT/test/'
```
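The three converter invocations differ only in the split names (`tfrecord_training` maps to `train`, and so on). As a sketch, they can be generated programmatically; `converter_cmd` is a hypothetical helper for illustration, not part of the repo:

```python
# Hypothetical helper (not part of the repo) that reproduces the three
# converter commands above. The tfrecord_* -> split mapping is taken
# directly from the commands themselves.
SPLITS = {"training": "train", "validation": "val", "testing": "test"}

def converter_cmd(src_split, dst_split, root="WAYMO_DATASET_ROOT"):
    return (
        "CUDA_VISIBLE_DEVICES=-1 python det3d/datasets/waymo/waymo_converter.py "
        f"--record_path '{root}/tfrecord_{src_split}/*.tfrecord' "
        f"--root_path '{root}/{dst_split}/'"
    )

for src, dst in SPLITS.items():
    print(converter_cmd(src, dst))
```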
Create a symlink to the dataset root:

```bash
mkdir data && cd data
ln -s WAYMO_DATASET_ROOT Waymo
```

Remember to change `WAYMO_DATASET_ROOT` to the actual path on your system.
```bash
# One Sweep Infos
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split train --nsweeps=1
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split val --nsweeps=1
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split test --nsweeps=1
# Two Sweep Infos
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split train --nsweeps=2
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split val --nsweeps=2
python tools/create_data.py waymo_data_prep --root_path=data/Waymo --split test --nsweeps=2
# More Sweep Infos etc.
```
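The `--split` and `--nsweeps` values map directly onto the info filenames listed in the directory tree below (e.g. `infos_train_01sweeps_filter_zero_gt.pkl`). A small sketch of that naming convention; `info_filename` is illustrative, not a function in the repo:

```python
# Illustrative only: reproduces the info-file naming visible in the
# directory tree, e.g. infos_train_01sweeps_filter_zero_gt.pkl.
def info_filename(split, nsweeps):
    return f"infos_{split}_{nsweeps:02d}sweeps_filter_zero_gt.pkl"

for split in ("train", "val", "test"):
    for nsweeps in (1, 2):
        print(info_filename(split, nsweeps))
```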
In the end, the data and info files should be organized as follows:

```
└── CenterFormer
       └── data
              └── Waymo
                     ├── tfrecord_training
                     ├── tfrecord_validation
                     ├── train   <-- all training frames and annotations
                     ├── val     <-- all validation frames and annotations
                     ├── test    <-- all testing frames and annotations
                     ├── infos_train_01sweeps_filter_zero_gt.pkl
                     ├── infos_train_02sweeps_filter_zero_gt.pkl
                     ├── infos_val_01sweeps_filter_zero_gt.pkl
                     ├── infos_val_02sweeps_filter_zero_gt.pkl
                     ├── infos_test_01sweeps_filter_zero_gt.pkl
                     ├── infos_test_02sweeps_filter_zero_gt.pkl
                     ├── ...
```
Use the following command to start distributed training with 4 GPUs. The models and logs will be saved to `work_dirs/CONFIG_NAME`.

```bash
python -m torch.distributed.launch --nproc_per_node=4 ./tools/train.py CONFIG_PATH
```
For distributed testing with 4 GPUs:

```bash
python -m torch.distributed.launch --nproc_per_node=4 ./tools/dist_test.py CONFIG_PATH --work_dir work_dirs/CONFIG_NAME --checkpoint work_dirs/CONFIG_NAME/latest.pth
```
For testing with one GPU and measuring the inference time:

```bash
python ./tools/dist_test.py CONFIG_PATH --work_dir work_dirs/CONFIG_NAME --checkpoint work_dirs/CONFIG_NAME/latest.pth --speed_test
```
This will generate a `my_preds.bin` file in the work_dir. You can create a submission to the Waymo server using the waymo-open-dataset code by following the instructions here.
If you want to do local evaluation (e.g. on a subset), generate the ground-truth bin files using the script below and follow the Waymo instructions here.
```bash
python det3d/datasets/waymo/waymo_common.py --info_path data/Waymo/infos_val_01sweeps_filter_zero_gt.pkl --result_path data/Waymo/ --gt
```
Add the `--testset` flag to the end.

```bash
python ./tools/dist_test.py CONFIG_PATH --work_dir work_dirs/CONFIG_NAME --checkpoint work_dirs/CONFIG_NAME/latest.pth --testset
```