The UZH-FPV Drone Racing Dataset:

High-speed, Aggressive 6DoF Trajectories for State Estimation and Drone Racing


We introduce the UZH-FPV Drone Racing dataset, the most aggressive visual-inertial odometry dataset to date. Aggressive trajectories are difficult for state estimation because of their large accelerations, fast rotations, and large apparent motion in the vision sensors; yet many compelling applications, such as autonomous drone racing, require high-speed state estimation, and existing datasets do not cover this regime. The sequences were recorded with a first-person-view (FPV) drone racing quadrotor fitted with sensors and flown aggressively by an expert pilot. The trajectories include fast laps around a racetrack with drone racing gates as well as free-form trajectories around obstacles, both indoors and outdoors. We provide the camera images and IMU data from a Qualcomm Snapdragon Flight board, ground truth from a Leica Nova MS60 laser tracker, event data from an mDAVIS 346 event camera, and high-resolution RGB images from the pilot's FPV camera. With this dataset, our goal is to help advance the state of the art in high-speed state estimation.


Citing

When using the data in an academic context, please cite the following paper.

J. Delmerico, T. Cieslewski, H. Rebecq, M. Faessler, D. Scaramuzza

Are We Ready for Autonomous Drone Racing? The UZH-FPV Drone Racing Dataset

IEEE International Conference on Robotics and Automation (ICRA), 2019.



Visualization of Dataset Sequences



Dataset Format

We provide all datasets in two formats: text files [Coming Soon] and binary files (rosbag). While their content is identical, each format is better suited to particular applications. The binary rosbag files are intended for users familiar with the Robot Operating System (ROS) and for applications that are meant to run on a real system.



Binary Files (rosbag)

The rosbag files contain images and IMU measurements using the standard sensor_msgs/Image and sensor_msgs/Imu message types, respectively. The events are provided as dvs_msgs/EventArray message types, and the ground truth is provided as geometry_msgs/PoseStamped messages. The Events/IMU/GT bag files also contain the image frames from the mDAVIS as sensor_msgs/Image messages.
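For readers less familiar with ROS, the following is a minimal sketch of how such a bag can be read with the rosbag Python API, assuming a ROS installation that provides the rosbag, cv_bridge, and dvs_msgs packages. The bag filename is a placeholder; dispatching on the message type rather than on topic names avoids hard-coding topic names, which can instead be listed with "rosbag info <file>.bag".

    # Sketch: iterate over the messages in a UZH-FPV sequence bag.
    # 'sequence.bag' is a placeholder filename.
    import rosbag
    from cv_bridge import CvBridge

    bridge = CvBridge()

    with rosbag.Bag('sequence.bag') as bag:
        for topic, msg, t in bag.read_messages():
            if msg._type == 'sensor_msgs/Imu':
                gyro = msg.angular_velocity        # [rad/s]
                accel = msg.linear_acceleration    # [m/s^2]
            elif msg._type == 'sensor_msgs/Image':
                # Convert a camera frame to an OpenCV image in its native encoding
                frame = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
            elif msg._type == 'dvs_msgs/EventArray':
                # Each event carries pixel coordinates, a timestamp, and a polarity
                for e in msg.events:
                    x, y, ts, polarity = e.x, e.y, e.ts, e.polarity
            elif msg._type == 'geometry_msgs/PoseStamped':
                # Ground-truth pose from the laser tracker
                position = msg.pose.position
                orientation = msg.pose.orientation

The body of each branch only shows how to access the message fields; a real application would accumulate or process these values instead.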


Datasets

Coming soon



Calibration

We provide the calibration parameters for the camera intrinsics and camera-IMU extrinsics in YAML format, as well as the raw calibration sequences from which they were computed using the Kalibr toolbox.
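As a rough sketch of how such a calibration file might be parsed, the snippet below assumes the usual Kalibr camchain layout (a cam0 block with intrinsics given as [fx, fy, cx, cy], distortion_coeffs, and a 4x4 T_cam_imu matrix); the filename and exact keys may differ from the released files.

    # Sketch: load a Kalibr-style camchain YAML and extract the intrinsics
    # and camera-IMU extrinsics of the first camera. Filename and key layout
    # are assumptions based on Kalibr's typical output.
    import numpy as np
    import yaml

    with open('camchain-imucam.yaml', 'r') as f:
        calib = yaml.safe_load(f)

    cam0 = calib['cam0']
    fx, fy, cx, cy = cam0['intrinsics']

    # 3x3 pinhole camera matrix
    K = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])

    dist_coeffs = np.array(cam0['distortion_coeffs'])

    # 4x4 rigid-body transform taking points from the IMU frame to the camera frame
    T_cam_imu = np.array(cam0['T_cam_imu'])

    print('Camera matrix:\n', K)
    print('Distortion model:', cam0['distortion_model'], dist_coeffs)
    print('T_cam_imu:\n', T_cam_imu)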


Coming soon




License

These datasets are released under the Creative Commons license (CC BY-NC-SA 3.0), which is free for non-commercial use (including research).

Acknowledgements

This work was supported by the National Centre of Competence in Research Robotics (NCCR) through the Swiss National Science Foundation, the SNSF-ERC Starting Grant, and the DARPA FLA Program.

This work would not have been possible without the assistance of Stefan Gächter, Zoltan Török, and Thomas Mörwald of Leica Geosystems and their support in gathering our data. Additional thanks go to Innovation Park Zürich and the Fässler family for providing experimental space, and to iniVation AG and Prof. Tobi Delbruck for their support and guidance with the mDAVIS sensors.