- (20240926) Paper is accepted by IJRR.
- (20240713) Provide usage example of running SLAM and calibration algorithms with our dataset.
- (20240508) Ground-truth poses of all vehicle-related sequences have been post-processed to eliminate poses with high uncertainty.
- (20240422) Data can be downloaded from Baidu Wang Pan with the code byj8.
- (20240414) All sequences, ground-truth trajectories, and ground-truth maps have been publicly released. If you find issues with the GT trajectories or maps, please contact us or report them here.
- (20240413) A small simulated navigation environment is provided.
- (20240408) The development tool has been initially released.
- (20240407) Data can be downloaded from Google Drive.
- Please visit FusionPortable dataset and FusionPortableV2_dataset to check and download data.
- Download the compressed rosbag.
When finished, extract the data with a command like
7z x 20220216_garden_day.7z
(note that 7z l only lists an archive's contents; 7z x extracts it).
Clone the Repo only
git clone https://github.com/fusionportable/fusionportable_dataset_tools.git
Clone the Repo with submodules (including calibration_files and algorithms for experiments)
git clone https://github.com/fusionportable/fusionportable_dataset_tools.git --recursive
Set up the Python environment (tested on Python 3.9), or use Anaconda directly, and run
cd fusionportable_dataset_tools
conda create -n fp_dataset python=3.9.18
conda activate fp_dataset
pip install -r requirements.txt
Integrate with your project
import os
import sys
sys.path.append('/path/fusionportable_dataset_tools')
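A slightly more defensive version of the snippet above prepends the toolkit directory and avoids duplicate entries; the path is a placeholder that you should point at your own checkout:

```python
import os
import sys

# Placeholder path: point this at your own clone of the toolkit.
TOOLKIT_DIR = os.path.expanduser("~/fusionportable_dataset_tools")

# Prepend so the toolkit shadows any same-named modules elsewhere on the
# path, and guard against duplicate entries on repeated imports.
if TOOLKIT_DIR not in sys.path:
    sys.path.insert(0, TOOLKIT_DIR)
```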
Data-Loader-Related Functions
- Convert the raw rosbag into individual files: write_bag_to_data.ipynb
- Convert algorithm results (e.g., R3LIVE, FAST-LIO2) stored as rosbags into individual files: write_alg_bag_to_data.ipynb
- Convert the raw files into the KITTI-360 format (including synchronized sensor data, calibration files, and odometry): write_data_to_kitti360.ipynb
- Generate depth maps with respect to the frame_left camera for depth evaluation: write_depthmap_to_kitti360.ipynb
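The authoritative layout is defined by write_data_to_kitti360.ipynb; as a rough sketch only, a KITTI-360-style export groups index-aligned per-frame sensor files next to shared calibration and pose files. The directory and file names below follow the public KITTI-360 convention and are illustrative assumptions, not the toolkit's exact output:

```python
import tempfile
from pathlib import Path

def make_kitti360_skeleton(root: Path, n_frames: int = 3) -> None:
    """Create an empty KITTI-360-style sequence directory (illustrative)."""
    (root / "calibration").mkdir(parents=True)
    (root / "data_poses").mkdir()
    img_dir = root / "image_00" / "data_rect"
    lidar_dir = root / "velodyne_points" / "data"
    img_dir.mkdir(parents=True)
    lidar_dir.mkdir(parents=True)
    for i in range(n_frames):
        # KITTI-style zero-padded filenames keep sensors index-aligned.
        (img_dir / f"{i:010d}.png").touch()
        (lidar_dir / f"{i:010d}.bin").touch()
    # One 3x4 row-major pose matrix per synchronized frame.
    (root / "data_poses" / "poses.txt").write_text(
        "\n".join("1 0 0 0 0 1 0 0 0 0 1 0" for _ in range(n_frames))
    )

root = Path(tempfile.mkdtemp()) / "seq_00"
make_kitti360_skeleton(root)
```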
Tool Functions
- Project undistorted point cloud onto image to verify the error in extrinsics:
visualize_depthmap.ipynb
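The idea behind this check can be sketched with a minimal pinhole projection: transform each LiDAR point into the camera frame with the extrinsics, project it, and see whether it lands on the matching image feature. This is a hand-rolled illustration, not the notebook's API; all parameter names are placeholders:

```python
def project_point(p_lidar, R, t, fx, fy, cx, cy):
    """Transform a LiDAR-frame point with extrinsics (R, t), then
    project it with a pinhole model. Returns (u, v, depth), or None
    if the point is behind the camera."""
    # p_cam = R @ p_lidar + t, written as a plain-Python 3x3 multiply.
    p_cam = [sum(R[i][j] * p_lidar[j] for j in range(3)) + t[i]
             for i in range(3)]
    x, y, z = p_cam
    if z <= 0:
        return None
    return (fx * x / z + cx, fy * y / z + cy, z)

# With identity extrinsics, a point 2 m straight ahead lands exactly on
# the principal point (cx, cy); an extrinsic error shifts it off-target.
uvz = project_point([0.0, 0.0, 2.0],
                    [[1, 0, 0], [0, 1, 0], [0, 0, 1]], [0.0, 0.0, 0.0],
                    fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```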
Evaluation Tools
- Trajectory Evaluation
- Mapping Evaluation: click this link to try.
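Trajectory evaluation commonly reports the Absolute Trajectory Error (ATE). The sketch below computes the translational RMSE only, and assumes the two trajectories are already time-synchronized and aligned (dedicated tools such as evo additionally perform SE(3) alignment):

```python
import math

def ate_rmse(gt, est):
    """Root-mean-square translational error between two already-aligned,
    time-synchronized position lists [(x, y, z), ...]."""
    assert len(gt) == len(est) and gt
    sq = [sum((g - e) ** 2 for g, e in zip(p, q)) for p, q in zip(gt, est)]
    return math.sqrt(sum(sq) / len(sq))

gt  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
est = [(0.0, 0.1, 0.0), (1.0, -0.1, 0.0), (2.0, 0.1, 0.0)]
err = ate_rmse(gt, est)  # every pose is off by 0.1 m, so the RMSE is 0.1
```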
We provide configuration files for running experiments with our dataset:
SLAM
- Visual SLAM: DROID-SLAM
- Visual-Inertial SLAM: VINS-Fusion
- LiDAR-Inertial SLAM: FAST-LIO2
Others
- Face and License-Plate Privacy Protection: Anonymizer
Intrinsic Calibration
- IMU noise calibration: Allan Variance Analysis
- Wheel encoder calibration: encoer_calc
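Allan variance analysis characterizes IMU noise by averaging the raw rate signal over clusters of increasing length and measuring how successive cluster means differ. A minimal non-overlapping version (real tools use overlapping estimators and fit noise parameters to the resulting log-log curve) can be sketched as:

```python
import math

def allan_deviation(samples, m):
    """Non-overlapping Allan deviation of a rate signal for cluster
    size m (averaging time tau = m * sample_period)."""
    # Average the signal over consecutive clusters of m samples.
    n_clusters = len(samples) // m
    means = [sum(samples[k * m:(k + 1) * m]) / m for k in range(n_clusters)]
    # Allan variance: half the mean squared difference of successive means.
    diffs = [(means[k + 1] - means[k]) ** 2 for k in range(n_clusters - 1)]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))

# A constant signal has zero Allan deviation at every cluster size.
flat = allan_deviation([0.5] * 100, m=5)  # -> 0.0
```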
Extrinsic Calibration
File ~/anaconda3/envs/fp_dataset/lib/python3.9/site-packages/ros_numpy/point_cloud2.py:224
221 new_cloud_arr[field_name] = cloud_arr[field_name]
...
AttributeError: module 'numpy' has no attribute 'float'.
Solution: np.float was removed in NumPy 1.24. Open path_to_python/site-packages/ros_numpy/point_cloud2.py and replace the original line at 224 with
def get_xyz_points(cloud_array, remove_nans=True, dtype=np.float64):
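If you prefer not to edit files under site-packages, a common alternative workaround is to restore the removed alias in your own script before the first import of ros_numpy. This relies on the fact that np.float was always a plain alias for Python's built-in float:

```python
import numpy as np

# NumPy 1.24 removed the long-deprecated np.float alias; restore it so
# older libraries such as ros_numpy keep working. This must run before
# the first `import ros_numpy`.
if not hasattr(np, "float"):
    np.float = float  # np.float was an alias for the builtin float
```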
Please refer to Contribution Guidance to make contributions to this project.
Please post issues or contact Dr. Jianhao Jiao (jiaojh1994 at gmail.com) or Mr. Hexiang Wei (hweiak at connect.ust.hk) if you have any questions.
You are also encouraged to read our papers first: FusionPortable V1 and FusionPortable V2.
If you find this paper or the toolbox useful in your project, please consider citing one of our papers.
@article{wei2024fusionportablev2,
title={Fusionportablev2: A unified multi-sensor dataset for generalized slam across diverse platforms and scalable environments},
author={Wei, Hexiang and Jiao, Jianhao and Hu, Xiangcheng and Yu, Jingwen and Xie, Xupeng and Wu, Jin and Zhu, Yilong and Liu, Yuxuan and Wang, Lujia and Liu, Ming},
journal={The International Journal of Robotics Research},
pages={02783649241303525},
year={2024},
publisher={SAGE Publications Sage UK: London, England}
}
@inproceedings{jiao2022fusionportable,
title={Fusionportable: A multi-sensor campus-scene dataset for evaluation of localization and mapping accuracy on diverse platforms},
author={Jiao, Jianhao and Wei, Hexiang and Hu, Tianshuai and Hu, Xiangcheng and Zhu, Yilong and He, Zhijian and Wu, Jin and Yu, Jingwen and Xie, Xupeng and Huang, Huaiyang and others},
booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
pages={3851--3856},
year={2022},
organization={IEEE}
}