> [!IMPORTANT]
> Let the robots freely roam the countryside!
The ROUGH dataset contains several hours of driving with mid-sized robots in challenging terrain. The ultimate goal is to provide recordings of traversals on (or through) flexible natural obstacles such as tall grass, hanging tree branches, mud, dense undergrowth, etc.
Datasets like RUGD, BotanicGarden, or RELLIS-3D provide data mostly with the aim of training image/point-cloud segmentation networks or tuning SLAM algorithms. ROUGH aims directly at learning how to drive through challenging terrain.
Correct segmentation alone is not sufficient for successful traversal: the robot dynamics, terrain properties, and control algorithm also need to be taken into account. For example, 5 cm deep mud might be a stopping condition for a robot with 10 cm wheels or tracks, while it will be almost unnoticeable for a 5 m tank. Simply put, knowing the semantic class *mud* is just part of the story.
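The mud example above can be sketched as a crude heuristic. This is purely illustrative (the function name and the threshold ratio are made up, not part of ROUGH): an obstacle deeper than some fraction of the wheel or track radius is flagged as a potential stopping condition.

```python
# Hypothetical illustration: the same obstacle can be critical for one
# platform and negligible for another. The threshold is invented.

def is_likely_stuck(obstacle_depth_m: float, wheel_radius_m: float,
                    critical_ratio: float = 0.5) -> bool:
    """Crude heuristic: an obstacle deeper than a fraction of the
    wheel/track radius is treated as a potential stopping condition."""
    return obstacle_depth_m > critical_ratio * wheel_radius_m

# 5 cm of mud vs. 10 cm wheels (radius 5 cm): likely a problem.
print(is_likely_stuck(0.05, 0.05))   # True
# The same mud vs. road wheels with ~1 m radius: negligible.
print(is_likely_stuck(0.05, 1.0))    # False
```

In reality, of course, no single ratio captures terrain deformability or traction, which is exactly why learned models over real traversal data are needed.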
The majority of ROUGH sequences were collected with shape-changing tracked robots, which allows capturing a much wider range of dynamic responses: simply moving the auxiliary tracks changes the center of mass, the moments of inertia, and the contact surface. To make these dynamic effects usable, ROUGH provides not only the sensory data, but also robot models with dynamics properties.
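To illustrate why moving the auxiliary tracks matters dynamically, here is a toy center-of-mass computation. All masses and dimensions are hypothetical, not the actual Absolem/MARV parameters; each flipper is crudely modeled as a point mass at the middle of a rod.

```python
import math

# Illustrative sketch (hypothetical numbers): rotating the auxiliary
# tracks (flippers) shifts the combined center of mass of the robot.

def combined_com_x(body_mass: float, flipper_mass: float,
                   flipper_len: float, angle_rad: float) -> float:
    """x-coordinate of the combined center of mass. The body CoM sits at
    x = 0; the flipper mass is lumped at the middle of a rod of length
    flipper_len, rotated by angle_rad from the horizontal."""
    flipper_x = 0.5 * flipper_len * math.cos(angle_rad)
    return flipper_mass * flipper_x / (body_mass + flipper_mass)

# Flippers stretched forward: CoM moves ahead of the body center.
print(combined_com_x(30.0, 10.0, 0.4, 0.0))
# Flippers raised vertically: CoM returns to (almost) the body center.
print(combined_com_x(30.0, 10.0, 0.4, math.pi / 2))
```

The same reasoning extends to the moments of inertia (via the parallel axis theorem) and to the track-ground contact surface, which is why the dataset ships the robot models alongside the recordings.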
A great application for ROUGH is self-supervised training of PINNs (physics-informed neural networks) on trajectory data, as in MonoForce. Can you guess a robot's future trajectory given just a single onboard image, the robot model, and the control algorithm? MonoForce can!
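The self-supervised idea can be sketched in a few lines. This is a heavily simplified stand-in for the MonoForce pipeline (which predicts terrain forces through a differentiable physics model); the function below only shows the core trick that makes it self-supervised: the training signal is the trajectory the robot actually drove, so no manual labels are needed.

```python
# Hedged sketch of a self-supervised trajectory objective: a predicted
# trajectory is scored against the trajectory recorded by the robot's
# own odometry/SLAM, which comes "for free" with every drive.

def trajectory_loss(predicted_xy, recorded_xy):
    """Mean squared distance between predicted and recorded 2D waypoints."""
    assert len(predicted_xy) == len(recorded_xy)
    err = 0.0
    for (px, py), (rx, ry) in zip(predicted_xy, recorded_xy):
        err += (px - rx) ** 2 + (py - ry) ** 2
    return err / len(predicted_xy)

pred = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.3)]   # network output (made up)
real = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]   # recorded ground truth
print(trajectory_loss(pred, real))
```

Minimizing such a loss end-to-end through a physics model is what lets a single onboard image constrain the predicted dynamics.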
> [!CAUTION]
> The dataset is still in preparation and will be submitted to an IEEE journal, hopefully in Nov '24. Click 'Watch' at the top of this page to get project updates.
In the meantime, have a look at the sample data for MonoForce, which looks similar to what will be available in ROUGH (format documented here). The 3D kinematic and dynamic models of the robots are not yet available.
The robots used in ROUGH have almost identical sensory equipment. This is just a list of the platforms; the sensor description follows in the next section.
Absolem is a custom-built tracked robot with 2 main tracks and 4 independently controlled auxiliary tracks. The first revision of the robot was built in 2011 for the NIFTi project; it then received a hardware upgrade for the TRADR project in 2015, and another upgrade in 2019 for the DARPA SubT Challenge.
Stats
| Weight | Width | Length | Height | Main Motors | Aux Motors |
|---|---|---|---|---|---|
| 40 kg | 60 cm | 67 cm | 100 cm | 2x 210 W (30 Nm) | 4x 210 W (30 Nm) |
MARV is a commercially available (though still prototype) agile tracked robot with 4 independently controlled tracks. It was designed in 2020 for the needs of the VRAS research group and qualified for the DARPA SubT Challenge.
Stats
| Weight | Width | Length | Height | Main Motors | Aux Motors |
|---|---|---|---|---|---|
| 70 kg | 60 cm | 60 cm | 100 cm | 200 W (25 Nm) | 200 W (180 Nm) |
| Sensor | Type | Rate | Resolution | Sync | Timestamp accuracy | Topics |
|---|---|---|---|---|---|---|
| Ouster OS0-128 | 3D lidar | 10 Hz | 1024x10 | Trigger source | 20 µs | /points_filtered_* |
| Ouster IMU | IMU | 100 Hz | 0.008 °/s, 61 µg | HW from lidar | 20 µs | /os_cloud_node/imu |
| Basler a2A1920-51gcPRO + 4 mm lens | RGB camera | 10 Hz | 1920x1200 | HW from lidar | 200 µs | /camera_(front\|left\|rear\|right\|up)/image_color/compressed |
| Basler acA2040-35gc + 1.8 mm fisheye lens | RGB camera | 30 Hz | 2048x1536 | No sync | Unknown | /camera_fisheye_(front\|rear)/compressed |
| Luxonis OAK-D Pro | RGB-D camera | 30 Hz | 1920x1080 RGB, 800x480 depth | No sync | 200 µs | /oak/rgb/image_raw/compressed, /oak/stereo/image_raw/compressedDepth |
| Luxonis OAK-D Pro IMU | IMU | 100 Hz | ??? | No sync | 200 µs | /oak/imu/data, /oak/imu/mag |
| Movella MTI-30 | IMU | 100 Hz | ??? | No sync | Unknown | /imu/data, /imu/mag, /imu_unbiased/data (removed gyro bias), /imu/mag_unbiased (removed internal hard-iron and soft-iron bias) |
| Emlid Reach M+ with helical 3-band antenna | 1-band GNSS | 5 Hz | ??? | No sync | Unknown | /fix, /fix_status, /llh |
| Septentrio Mosaic-go Heading Evaluation Kit + 2x helical 3-band antenna | 2-band 2-antenna GNSS | 10 Hz | ??? | No sync | Unknown | /gnss/septentrio/* |
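Several topic names in the table use a regex-like alternation, e.g. `/camera_(front|left|rear|right|up)/image_color/compressed`. A small helper (hypothetical, not part of ROUGH tooling) can expand such a pattern into concrete topic names, which could then be passed to a bag reader such as rosbag's `read_messages(topics=...)`:

```python
import itertools
import re

def expand_topic(pattern: str) -> list:
    """Expand (a|b|c) alternations in a topic pattern into all concrete
    topic names; patterns without alternations are returned unchanged.
    (Wildcards like /points_filtered_* are NOT handled here.)"""
    groups = re.findall(r"\(([^)]*)\)", pattern)
    template = re.sub(r"\([^)]*\)", "{}", pattern)
    variants = [g.split("|") for g in groups]
    return [template.format(*combo) for combo in itertools.product(*variants)]

topics = expand_topic("/camera_(front|left|rear|right|up)/image_color/compressed")
print(topics[0])   # /camera_front/image_color/compressed
print(len(topics)) # 5
```

This keeps playback scripts readable while staying faithful to the naming scheme used in the recordings.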
The data in the ROUGH dataset are provided under the Open Data Commons Open Database License (ODbL) v1.0. It is similar to CC-BY-SA, but better suited to datasets: it explicitly states that works that merely use the data (and do not create another dataset) do not need to be released under the same license. In any case, you are obliged to attribute the dataset.
The dataset was created by Martin Pecka, Bedřich Himmel, Valentýn Číhala, Ruslan Agishev, Karel Zimmermann and Tomáš Svoboda, all members of the VRAS group @ Czech Technical University in Prague.
This work was co-funded by the European Union under the project Robotics and Advanced Industrial Production (reg. no. CZ.02.01.01/00/22 008/0004590) and by the Czech Science Foundation under Project 24-12360S.