# Multiple camera calibration
This tool estimates the intrinsic and extrinsic parameters of a multi-camera system with non-shared overlapping fields of view. The image data is provided as a ROS bag containing the image streams of all cameras. The calibration routine goes through all images and selects a subset using information-theoretic measures in order to obtain a good estimate of the system parameters (see 1).
Arbitrary combinations of projection and distortion model can be mixed in one calibration run. Have a look at this page for a list of the supported camera and distortion models.
## How to use?

### 1) Collect images

Create a ROS bag containing the raw image streams, either by directly recording from a ROS sensor stream or by using the bagcreater script on a sequence of image files.
It is recommended to lower the frequency of the camera streams to around 4 Hz while capturing the calibration data. This reduces the number of nearly redundant images the calibrator has to process and thus lowers the runtime of the calibration.
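As a sketch of this step (topic and file names are examples, not prescribed by the tool), the streams can be throttled to about 4 Hz with standard ROS tools before recording:

```sh
# Throttle each raw camera stream to ~4 Hz (topic names are examples)
rosrun topic_tools throttle messages /cam0/image_raw 4.0 /cam0/image_raw_throttled
rosrun topic_tools throttle messages /cam1/image_raw 4.0 /cam1/image_raw_throttled

# Record the throttled streams into a bag for the calibrator
rosbag record -O calibration.bag /cam0/image_raw_throttled /cam1/image_raw_throttled
```

If you throttle, remember to pass the throttled topic names to `--topics` later.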
### 2) Running the calibration
The tool must be provided with the following input:

- `--bag filename.bag`: ROS bag containing the data
- `--topics TOPIC_0 ... TOPIC_N`: list of all camera topics in the bag; matches the ordering of `--models`
- `--models MODEL_0 ... MODEL_N`: list of camera/distortion models to be fitted; matches the ordering of `--topics` (see Supported models)
- `--target target.yaml`: the calibration target configuration (see Calibration targets)

Note that the ordering of the topics and the camera/distortion models must match; it also determines the internal camera numbering in the calibration.
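For reference, a target configuration for a 6x6 aprilgrid might look like the following sketch; the size and spacing values here are illustrative and must match your printed target:

```yaml
target_type: 'aprilgrid'  # calibration target type
tagCols: 6                # number of tags per row
tagRows: 6                # number of tag rows
tagSize: 0.088            # edge length of a tag in meters (illustrative value)
tagSpacing: 0.3           # ratio of inter-tag spacing to tagSize (illustrative value)
```

See the Calibration targets page for the full set of supported targets and their parameters.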
The calibration can then be run using:

```sh
kalibr_calibrate_cameras --bag [filename.bag] --topics [TOPIC_0 ... TOPIC_N] --models [MODEL_0 ... MODEL_N] --target [target.yaml]
```
It can happen that the optimization diverges right after processing the first few images due to a bad initial guess for the focal lengths. In this case, simply restart the calibration: the initial guesses are based on a random selection of images.
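Since the initial guesses are randomized, the restarts can be automated with a small retry wrapper; the `retry` helper below is hypothetical and not part of Kalibr:

```shell
# retry N CMD...: rerun CMD up to N times until it succeeds
retry() {
  local n=$1; shift
  local i
  for i in $(seq 1 "$n"); do
    "$@" && return 0
    echo "attempt $i failed, retrying..." >&2
  done
  return 1
}

# Usage sketch (arguments as in the command above):
# retry 5 kalibr_calibrate_cameras --bag [filename.bag] --topics [TOPIC_0 ... TOPIC_N] --models [MODEL_0 ... MODEL_N] --target [target.yaml]
```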
### 3) The output

The calibration will produce the following output files:
- report-%BAGNAME%.pdf: Report in PDF format. Contains all plots for documentation.
- results-%BAGNAME%.txt: Result summary as a text file.
- camchain-%BAGNAME%.yaml: Results in YAML format. This file can be used as input for the camera-IMU calibrator. Please see here for the format used.
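An illustrative sketch of the camera-chain YAML (all numeric values below are made up): each camera gets one entry, and every camera after the first additionally carries the transform `T_cn_cnm1` to the previous camera in the chain.

```yaml
cam0:
  camera_model: pinhole
  intrinsics: [461.6, 460.0, 363.0, 248.1]   # fu, fv, cu, cv (illustrative)
  distortion_model: equidistant
  distortion_coeffs: [-0.0029, 0.0217, -0.0264, 0.0099]
  resolution: [752, 480]
  rostopic: /cam0/image_raw
cam1:
  camera_model: pinhole
  intrinsics: [459.8, 458.3, 379.0, 255.2]
  distortion_model: equidistant
  distortion_coeffs: [-0.0028, 0.0203, -0.0249, 0.0094]
  resolution: [752, 480]
  rostopic: /cam1/image_raw
  T_cn_cnm1:                                 # transform from the previous camera
  - [0.9999, -0.0023, 0.0080, -0.1100]
  - [0.0023, 0.9999, 0.0017, 0.0004]
  - [-0.0080, -0.0017, 0.9999, -0.0007]
  - [0.0, 0.0, 0.0, 1.0]
```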
### 4) Optional live validation (ROS only)

If your sensor provides live data on ROS topics, the live validator can be used to verify the calibration on live streams. Please refer to this page on how to use this tool.
## An example run using a sample dataset

Download the sample dataset here and extract it. The archive contains the bag file and the calibration target configuration file.
*(Image: camera arrangement of the sensor system)*
The dataset was recorded with the sensor system shown in the picture above. It contains four cameras which should be calibrated using the following models:
- cam0, cam1: pinhole projection / equidistant distortion
- cam2, cam3: omni projection / radial-tangential distortion
```sh
kalibr_calibrate_cameras --models pinhole-equi pinhole-equi omni-radtan omni-radtan --topics /cam0/image_raw /cam1/image_raw /cam2/image_raw /cam3/image_raw --bag dataset_mcc.bag --target aprilgrid_6x6.yaml
```
Please cite the appropriate papers when using this toolbox or parts of it in an academic publication.