
FusionPortable_dataset_tools

News ⭐

  • (20240926) Paper is accepted by IJRR.
  • (20240713) Provided usage examples of running SLAM and calibration algorithms with our dataset.
  • (20240508) Ground-truth poses of all vehicle-related sequences have been post-processed to eliminate poses with high uncertainty.
  • (20240422) Data can be downloaded from Baidu Wang Pan with the code byj8.
  • (20240414) All sequences, ground-truth trajectories, and ground-truth maps have been publicly released. If you find issues with the ground-truth trajectories or maps, please contact us or report them here.
  • (20240413) A small simulated navigation environment is provided.
  • (20240408) The development tool has been initially released.
  • (20240407) Data can be downloaded from Google Drive.

Download Dataset 🔥

  1. Please visit FusionPortable dataset and FusionPortableV2_dataset to check and download data.
  2. Download the compressed rosbag.
  3. When finished, use a command like 7z x 20220216_garden_day.7z to extract the data (7z l only lists the archive contents without extracting).

Installation

Clone the Repo only

git clone https://github.com/fusionportable/fusionportable_dataset_tools.git

Clone the Repo with submodules (including calibration_files and algorithms for experiments)

git clone https://github.com/fusionportable/fusionportable_dataset_tools.git --recursive

Set up the Python environment (tested on Python 3.9), e.g., with Anaconda, and run

cd fusionportable_dataset_tools
conda create -n fp_dataset python=3.9.18
conda activate fp_dataset
pip install -r requirements.txt

Integrate with your project

import os
import sys
sys.path.append('/path/fusionportable_dataset_tools')
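A slightly more robust variant of the snippet above (the repository location below is a placeholder; point it at your actual clone):

```python
import os
import sys

# Placeholder path: replace with wherever you cloned the repository.
REPO_PATH = os.path.expanduser('~/fusionportable_dataset_tools')
if REPO_PATH not in sys.path:
    # Prepend so the local checkout shadows any installed copy.
    sys.path.insert(0, REPO_PATH)
```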

Usage

Parse ROS bag and Process Data 😍

Data-Loader-Related Functions

  1. Convert the raw rosbag into individual files: write_bag_to_data.ipynb

  2. Convert algorithm results (e.g., R3LIVE, FAST-LIO2) stored as rosbags into individual files: write_alg_bag_to_data.ipynb

  3. Convert the raw files into the KITTI-360 format (including synchronized sensor data, calibration files, and odometry): write_data_to_kitti360.ipynb

  4. Generate depth map with respect to the frame_left camera for the depth evaluation purpose: write_depthmap_to_kitti360.ipynb
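As a reference for the converted output, KITTI-style odometry files store one pose per line as 12 floats, a row-major 3x4 [R|t] matrix (KITTI-360 pose files additionally prepend a frame index). A minimal parser sketch, not part of the toolbox itself:

```python
import numpy as np

def load_kitti_poses(path):
    """Read KITTI-style poses: each line holds 12 floats forming a
    row-major 3x4 [R|t]; returns a list of 4x4 homogeneous matrices."""
    poses = []
    with open(path) as f:
        for line in f:
            vals = np.array(line.split(), dtype=float)
            T = np.eye(4)          # bottom row stays [0, 0, 0, 1]
            T[:3, :4] = vals.reshape(3, 4)
            poses.append(T)
    return poses
```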

Tool Functions

  1. Project undistorted point cloud onto image to verify the error in extrinsics: visualize_depthmap.ipynb
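The projection in the tool above boils down to the standard pinhole model; a minimal numpy sketch (the intrinsics K and the extrinsic T_cam_lidar below are made-up illustration values, not the dataset's calibration):

```python
import numpy as np

# Made-up intrinsics and extrinsics for illustration only.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
T_cam_lidar = np.eye(4)  # LiDAR-to-camera transform (identity here)

def project_points(points_lidar, K, T_cam_lidar):
    """Project Nx3 LiDAR points into pixel coordinates (pinhole model).
    Returns (Nx2 pixels, depths) for the points in front of the camera."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # transform into camera frame
    in_front = pts_cam[:, 2] > 0                 # keep positive-depth points
    uvw = (K @ pts_cam[in_front].T).T            # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3], pts_cam[in_front, 2]
```

Comparing these projected depths against the stereo or depth-map values is one way to spot extrinsic errors visually.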

Evaluation Tools

  1. Trajectory Evaluation

  2. Mapping Evaluation: try it via the linked tool.

Applications 😘

We provide configuration files for running experiments with our dataset:

SLAM

  1. Visual SLAM: DROID-SLAM
  2. Visual-Inertial SLAM: VINS-Fusion
  3. LiDAR-Inertial SLAM: FAST-LIO2

Others

  1. Face and License-Plate Privacy Protection: Anonymizer

Calibration Tools 😚

Intrinsic Calibration

  1. IMU noise calibration: Allan Variance Analysis
  2. Wheel encoder calibration: encoder_calc

Extrinsic Calibration

  1. Camera-LiDAR calibration: LCECalib
  2. Camera-IMU, Multi-IMU calibration: Kalibr

Issues with Dependencies

1. AttributeError raised inside ros_numpy
File ~/anaconda3/envs/fp_dataset/lib/python3.9/site-packages/ros_numpy/point_cloud2.py:224
    221             new_cloud_arr[field_name] = cloud_arr[field_name]
...
AttributeError: module 'numpy' has no attribute 'float'.

Solution: Go to path_to_python/site-packages/ros_numpy/point_cloud2.py:224 and replace the original line (which uses the np.float alias removed in NumPy 1.24) with

def get_xyz_points(cloud_array, remove_nans=True, dtype=np.float64):
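The root cause is that NumPy 1.24 removed the long-deprecated np.float alias, which older ros_numpy releases still reference. If you prefer not to edit site-packages, a common workaround (a sketch, applied before importing ros_numpy) is to restore the alias:

```python
import numpy as np

# NumPy >= 1.24 removed the deprecated `np.float` alias that older
# ros_numpy releases still use; restore it before importing ros_numpy.
if not hasattr(np, 'float'):
    np.float = np.float64

# ...then `import ros_numpy` succeeds without the AttributeError.
```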

Contribution

Please refer to Contribution Guidance to make contributions to this project.

Inquiry ❓

Please post issues or contact Dr. Jianhao Jiao (jiaojh1994 at gmail.com) or Mr. Hexiang Wei (hweiak at connect.ust.hk) if you have any questions.

You are also encouraged to read our papers first: FusionPortable V1 and FusionPortable V2.

Citation

If you find this dataset or the toolbox useful in your project, please consider citing one of our papers.

@article{wei2024fusionportablev2,
  title={Fusionportablev2: A unified multi-sensor dataset for generalized slam across diverse platforms and scalable environments},
  author={Wei, Hexiang and Jiao, Jianhao and Hu, Xiangcheng and Yu, Jingwen and Xie, Xupeng and Wu, Jin and Zhu, Yilong and Liu, Yuxuan and Wang, Lujia and Liu, Ming},
  journal={The International Journal of Robotics Research},
  pages={02783649241303525},
  year={2024},
  publisher={SAGE Publications Sage UK: London, England}
}
@inproceedings{jiao2022fusionportable,
  title={Fusionportable: A multi-sensor campus-scene dataset for evaluation of localization and mapping accuracy on diverse platforms},
  author={Jiao, Jianhao and Wei, Hexiang and Hu, Tianshuai and Hu, Xiangcheng and Zhu, Yilong and He, Zhijian and Wu, Jin and Yu, Jingwen and Xie, Xupeng and Huang, Huaiyang and others},
  booktitle={2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={3851--3856},
  year={2022},
  organization={IEEE}
}
