Official code for our paper "Harnessing Uncertainty-aware Bounding Boxes for Unsupervised 3D Object Detection".

Harnessing Uncertainty-aware Bounding Boxes for Unsupervised 3D Object Detection

by Ruiyang Zhang, Hu Zhang, Hang Yu, Zhedong Zheng

Motivation

Abstract

Unsupervised 3D object detection aims to identify objects of interest from unlabeled raw data, such as LiDAR points. Recent approaches usually adopt pseudo 3D bounding boxes (3D bboxes) from a clustering algorithm to initialize model training. However, pseudo bboxes inevitably contain noise, and such inaccuracies accumulate in the final model, compromising its performance. Therefore, to mitigate the negative impact of inaccurate pseudo bboxes, we introduce a new uncertainty-aware framework for unsupervised 3D object detection, dubbed UA3D. In particular, our method consists of two phases: uncertainty estimation and uncertainty regularization. (1) In the uncertainty estimation phase, we incorporate an extra auxiliary detection branch alongside the original primary detector. The prediction disparity between the primary and auxiliary detectors could reflect fine-grained uncertainty at the box coordinate level. (2) Based on the assessed uncertainty, we adaptively adjust the weight of every 3D bbox coordinate via uncertainty regularization, refining the training process on pseudo bboxes. For pseudo bbox coordinates with high uncertainty, we assign a relatively low loss weight. Extensive experiments verify that the proposed method is robust against noisy pseudo bboxes, yielding substantial improvements over existing approaches on nuScenes and Lyft: +6.9% AP_BEV and +2.5% AP_3D on nuScenes, and +4.1% AP_BEV and +2.0% AP_3D on Lyft.
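To make the two phases concrete, the sketch below shows one way the per-coordinate uncertainty and the adaptive loss weight could fit together in PyTorch. It is a minimal illustration rather than the code in this repository: the function name, the (N, 7) box layout, and the exp(-uncertainty) weighting are assumptions made for the example, and the repository's actual formulation may differ.

```python
import torch
import torch.nn.functional as F

def uncertainty_weighted_box_loss(primary_pred, auxiliary_pred, pseudo_boxes):
    """Illustrative sketch of UA3D-style uncertainty regularization.

    All inputs are (N, 7) tensors of box parameters (x, y, z, l, w, h, yaw)
    for N proposals, supervised by clustered pseudo bboxes.
    """
    # Uncertainty estimation: the per-coordinate disagreement between the
    # primary and auxiliary detection branches.
    uncertainty = (primary_pred - auxiliary_pred).abs().detach()

    # Uncertainty regularization: down-weight coordinates whose pseudo-label
    # supervision is likely noisy. exp(-u) is one simple monotonically
    # decreasing choice; the paper's exact weighting is not reproduced here.
    weight = torch.exp(-uncertainty)

    # Per-coordinate regression against the pseudo bboxes, re-weighted so
    # that high-uncertainty coordinates contribute less to the loss.
    per_coord_loss = F.smooth_l1_loss(primary_pred, pseudo_boxes, reduction="none")
    return (weight * per_coord_loss).mean()
```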

Environment

Training

nuScenes

```bash
conda activate UA3D
bash scripts/seed_training_nuscenes.sh
bash scripts/self_training_nusc.sh -C "data_paths=nusc.yaml det_filtering.pp_score_threshold=0.7 det_filtering.pp_score_percentile=20 data_paths.bbox_info_save_dst=null calib_path=$(pwd)/downstream/OpenPCDet/data/nuscenes_boston/training/calib ptc_path=$(pwd)/downstream/OpenPCDet/data/nuscenes_boston/training/velodyne image_shape=[900,1600]"
```

Lyft

```bash
conda activate UA3D
bash scripts/seed_training_lyft.sh
bash scripts/self_training_lyft.sh -C "det_filtering.pp_score_threshold=0.7 det_filtering.pp_score_percentile=20 data_paths.bbox_info_save_dst=null data_root=$(pwd)/downstream/OpenPCDet/data/lyft/training"
```

Evaluation

nuScenes

```bash
conda activate UA3D
cd downstream/OpenPCDet/tools
bash scripts/dist_test.sh 4 --cfg_file ../../downstream/OpenPCDet/tools/cfgs/nuscenes_boston_models/pointrcnn_dynamic_obj.yaml --ckpt PATH_TO_CKPT
```

Lyft

```bash
conda activate UA3D
cd downstream/OpenPCDet/tools
bash scripts/dist_test.sh 4 --cfg_file ../../downstream/OpenPCDet/tools/cfgs/lyft_models/pointrcnn_dynamic_obj.yaml --ckpt PATH_TO_CKPT
```

Checkpoints

nuScenes experiments

| Model | ST rounds | Checkpoint | Config file |
|-----------|-----------|------------|-------------|
| PointRCNN | 0 | link | cfg |
| PointRCNN | 1 | link | cfg |
| PointRCNN | 10 | link | cfg |

Lyft experiments

| Model | ST rounds | Checkpoint | Config file |
|-----------|-----------|------------|-------------|
| PointRCNN | 0 | link | cfg |
| PointRCNN | 1 | link | cfg |
| PointRCNN | 10 | link | cfg |

Core Codes

License

This project is released under the MIT License.

Contact

Please open an issue if you have any questions about using this repo.

Acknowledgement

Our repo is based on MODEST (CVPR'22) and OpenPCDet. Thanks for their great work and open-source efforts!

Citation

```bibtex
@inproceedings{zhang2024harnessing,
  title={Harnessing Uncertainty-aware Bounding Boxes for Unsupervised 3D Object Detection},
  author={Zhang, Ruiyang and Zhang, Hu and Yu, Hang and Zheng, Zhedong},
  booktitle={arXiv},
  year={2024}
}
```
