This repository contains code for range-aided localization. The code supports two modes of sensor fusion. More details on the algorithm will be made available shortly.
If you use this repository in your research, please use the following citation:
@misc{goudar2023rangevisualinertial,
title={Range-Visual-Inertial Sensor Fusion for Micro Aerial Vehicle Localization and Navigation},
author={Abhishek Goudar and Wenda Zhao and Angela P. Schoellig},
year={2023},
eprint={2311.09056},
archivePrefix={arXiv},
primaryClass={cs.RO}
}
Not all features of the algorithm have been merged. We are in the process of updating the code base.
For instructions on installing the necessary dependencies and building the code, please see installation.
Please ensure you have followed the necessary installation steps before proceeding.
The functionality of the code can be tested using a rosbag from the LSY Range-Aided Localization dataset.
- Download the UTIAS_vicon_12122022 dataset.
- Unzip the dataset.
- Run the launch file:
source ~/catkin_ws/devel/setup.bash
roslaunch ra_sam default.launch --screen
- Play the rosbag from the dataset:
source ~/catkin_ws/devel/setup.bash
rosbag play <path-to-dataset>/trial1/sensor_data.bag
- If everything works correctly, you should see the following visualization in Rviz:
To test with a different rosbag from the LSY Range-Aided Localization dataset, edit the default.launch file: insert the path of the robot_config.yaml for the corresponding dataset in the robot_config field (highlighted below), and repeat the steps from the previous section.
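As a rough sketch, the robot_config field in default.launch might look like the following. The argument name, substitution, and path below are illustrative placeholders based on common roslaunch conventions, not the repository's exact contents:

```xml
<launch>
  <!-- Illustrative placeholder: point robot_config at the
       robot_config.yaml shipped with the dataset you want to play. -->
  <arg name="robot_config" default="$(find ra_sam)/config/robot_config.yaml"/>
</launch>
```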
- The covariance of the odometry preintegration is currently approximated. Covariance prediction needs to be merged from the corresponding branch.
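For context on what covariance prediction for odometry involves, the following is a minimal sketch of first-order covariance propagation for a planar unicycle motion model. This is a generic textbook construction, not the repository's actual preintegration code; the state layout `[x, y, theta]` and all numeric values are assumptions for illustration.

```python
import math

# Small dense-matrix helpers (3x3), kept dependency-free for clarity.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def propagate_covariance(P, theta, v, dt, Q):
    """First-order covariance propagation P' = F P F^T + Q for one
    odometry step of a unicycle model with state [x, y, theta]."""
    # Jacobian F of the motion model with respect to the state.
    F = [
        [1.0, 0.0, -v * math.sin(theta) * dt],
        [0.0, 1.0,  v * math.cos(theta) * dt],
        [0.0, 0.0,  1.0],
    ]
    return mat_add(mat_mul(mat_mul(F, P), transpose(F)), Q)

# Example: propagate an isotropic prior covariance by one step.
P0 = [[0.01 if i == j else 0.0 for j in range(3)] for i in range(3)]
Q = [[1e-4 if i == j else 0.0 for j in range(3)] for i in range(3)]
P1 = propagate_covariance(P0, theta=0.1, v=0.5, dt=0.05, Q=Q)
```

The result stays symmetric, and the heading uncertainty grows by exactly the process noise since heading is not coupled back from position in this model.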