News
August 2, 2022
New ECCV Paper: ESS: Learning Event-based Semantic Segmentation from Still Images
We are excited to announce our ECCV paper, which overcomes the lack of semantic segmentation datasets
for event cameras by directly transferring the semantic segmentation task from existing labeled
image datasets to unlabeled events. Our approach neither requires video data nor per-pixel alignment between images and events.
For more details, check out the paper, video, code, and dataset.
August 1, 2022
New Research Assistant
We warmly welcome Vincenzo Polizzi as a new research assistant in our lab!
July 31, 2022
RPG on the main German TV Kids program "1, 2 oder 3" on ZDF!

Leonard Bauersfeld and Elia Kaufmann were invited to the famous German TV program "1, 2 oder 3" to talk about drones. Watch the full video in the ZDF Mediathek
here (available until 28.08.2022). The part featuring RPG starts at 14:45.
Photo: ZDF/Ralf Wilschewski.
July 13, 2022
RPG on the main Italian TV science program SuperQuark on RAI1!
Watch the full video report about our research on autonomous drones, from drone racing to search and rescue, from standard to event cameras. The
video is in Italian with English subtitles.
July 7, 2022
First AI vs Human Drone Race!
On June 10-11, we organized the first race between an AI-powered, vision-based drone and human pilots. We invited two world champions and the Swiss champion. Read this report by Evan Ackerman from IEEE Spectrum, who witnessed the historic event in person.
July 6, 2022
Code Release: UltimateSLAM
We are releasing UltimateSLAM, which combines events, frames, and IMU measurements to achieve the ultimate SLAM performance in high-speed and high-dynamic-range scenarios.
Paper
Code
Video
Project Webpage
July 5, 2022
IROS2022 Workshop: Agile Robotics: Perception, Learning, Planning, and Control
Do not miss our IROS2022 Workshop: Agile Robotics: Perception, Learning, Planning, and Control!
Check out the agenda and join the presentations at our
workshop website.
Organized by Giuseppe Loianno, Davide Scaramuzza, Shaojie Shen.
July 4, 2022
Congratulations to our former PhD student Antonio for winning the 2022 George Giralt Award!

Congratulations to our former PhD student Antonio Loquercio for winning the 2022 George Giralt PhD Award, the most prestigious award for PhD dissertations in robotics in Europe, for his work on learning vision-based high-speed drone flight! We are very proud of you!
PhD thesis PDF
Video of the PhD defense
Google Scholar profile
Personal page
July 1, 2022
New RA-L Paper: Learning Minimum-Time Flight in Cluttered Environments
We are excited to announce our RA-L paper which tackles minimum-time flight in cluttered environments using
a combination of deep reinforcement learning and classical topological path planning. We show that the approach
outperforms the state-of-the-art in both planning quality and the ability to fly without collisions at high speeds.
For more details, check out the
paper and the
YouTube video.
June 17, 2022
New T-RO Paper: "A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight"
We are excited to announce that our paper on A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight was accepted at T-RO 2022.
Our work empirically compares two state-of-the-art control frameworks: the nonlinear-model-predictive controller (NMPC) and the differential-flatness-based controller (DFBC), by tracking a wide variety of agile trajectories at speeds up to 72km/h.
Read the paper for further details.
June 16, 2022
New RA-L paper: The Hilti SLAM Challenge Dataset

We release the Hilti SLAM Challenge
Dataset!
The sensor platform used to collect this dataset contains a number of visual, lidar and
inertial sensors which have all been rigorously calibrated. All data is temporally aligned
to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow
direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor
calibration data from twelve datasets in various environments is provided. Each environment
represents common scenarios found in building construction sites in various stages of
completion.
For more details, check out the paper, video and talk.
June 13, 2022
"Time Lens++: Event-based Frame Interpolation with Parametric Flow and Multi-scale Fusion" Dataset Release
We are excited to announce that our paper on Time Lens++ was accepted at CVPR 2022. To learn more about the next
generation of event-based frame interpolation, visit our
project page.
There we release our new dataset BS-ERGB, recorded with a beam splitter, which features aligned and synchronized events and frames.
June 3, 2022
Meet us at Swiss Drone Days 2022

We are excited to announce that the 2022 edition of the
Swiss Drone Days will take place on 11-12 June in Dübendorf. The event will feature live demos including autonomous drone racing, inspection, and delivery drones in one of the largest drone flying arenas of the world; spectacular drone races by the Swiss drone league; presentations of distinguished speakers; an exhibition and trade fair. For more information, please visit
www.swissdronedays.com
June 1, 2022
Two New PhD Students
We welcome Drew Hanover and Chao Ni as new PhD students in our lab!
May 27, 2022
Our work won the IEEE RAL Best Paper Award
We are honored that our IEEE Robotics and Automation Letters paper "Autonomous Quadrotor Flight Despite Rotor Failure With Onboard Vision Sensors: Frames vs. Events" was selected for the Best Paper Award. Congratulations to all collaborators!
PDF
YouTube
Code
May 20, 2022
Meet us at ICRA 2022!
We are looking forward to presenting these 9 papers on perception, learning, planning, and control in person next week at IEEE RAS ICRA! Additionally, we will be presenting in many workshops. A full list with links, times, and rooms can be found here.
May 5, 2022
UZH lists AI racing-drones as a key finding of 2021
The University of Zurich celebrated its 189th birthday. During the celebrations, rector Prof. Michael Schaepman named drones that fly faster than humans, serving as a testbed for AI research and search-and-rescue operations, as one of three key findings of UZH in 2021. A video of the speech can be found here (he starts to talk about drones at 26:00).
May 4, 2022
New T-RO Paper: "Model Predictive Contouring Control for Time-Optimal Quadrotor Flight"
We are excited to announce that our paper on Model Predictive Contouring Control for Time-Optimal Quadrotor Flight was accepted at T-RO 2022.
Thanks to our Model Predictive Contouring Control, the problem of flying through multiple
waypoints in minimum time can now be solved in real-time.
Read our paper for further details.
May 2, 2022
New Postdoc
We welcome Dr. Marco Cannici as a new postdoc in our lab!
April 28, 2022
EDS: Event-aided Direct Sparse Odometry

We are excited to announce that our paper on Event-aided Direct Sparse Odometry was accepted at CVPR 2022 for an oral presentation. EDS is the first direct method combining events and frames.
This work opens the door to low-power motion-tracking applications where frames are sparingly triggered "on demand" and our method tracks the motion in between.
For code, video and paper, visit our
project page.
April 21, 2022
We are hiring

We have multiple openings for PhD students and Postdocs in machine learning for computer vision and vision-based robot navigation. Job descriptions and how to apply:
https://rpg.ifi.uzh.ch/positions.html
April 21, 2022
New CVPRW Paper: Multi-Bracket High Dynamic Range Imaging with Event Cameras

We are excited to announce that our paper on combining events and frames for HDR imaging was accepted at the NTIRE22 workshop at CVPR 2022. In this paper, we propose the first multi-bracket HDR pipeline combining a standard camera with an event camera. For more details, check out the
paper and
video.
March 31, 2022
Meet us at Swiss Drone Days 2022

We are excited to announce that the 2022 edition of the
Swiss Drone Days will take place on 11-12 June in Dübendorf. The event will feature live demos including autonomous drone racing, inspection, and delivery drones in one of the largest drone flying arenas of the world; spectacular drone races by the Swiss drone league; presentations of distinguished speakers; an exhibition and trade fair. For more information, please visit
www.swissdronedays.com
March 29, 2022
"AEGNN: Asynchronous Event-based Graph Neural Networks" Code Release
We are excited to announce that our paper on Asynchronous Event-based Graph Neural Networks was accepted at CVPR 2022.
Bring back the sparsity in event-based deep learning by adopting AEGNNs which reduce the computational complexity by up to
200 times. For code, video and paper, visit our
project page.
March 29, 2022
"Are High-Resolution Cameras Really Needed?
In our newest paper, we shed light on this question and find that, across a wide range of tasks, the answer
is non-trivial. For video and paper, please visit our
project page.
March 17, 2022
ICRA 2022 DodgeDrone Challenge
General-purpose autonomy requires robots to interact with a constantly dynamic and uncertain world. We are excited to announce the ICRA2022 DodgeDrone Challenge to push the limits of aerial navigation in dynamic environments. All we need is you! We provide an easy-to-use API and a Reinforcement Learning framework! Submit your work and take part in the challenge! The winner will get a keynote invitation at the ICRA workshop on aerial robotics and a cash prize. Find out how to participate on our
Website. The code is on
GitHub.
March 14, 2022
From our lab to Skydio

Today, Skydio announces that it will be hiring some of our former PhD students. RPG is very proud of them!
Link
March 10, 2022
Davide Scaramuzza interviewed by Robohub
In this interview for Robohub, Davide Scaramuzza talks about event cameras and their application to robotics, automotive, defense, safety and security, computer vision, and videography:
Video and
Article
March 1, 2022
New PLOS ONE Paper: Visual Attention Prediction Improves Performance of Autonomous Drone Racing Agents

We propose a novel method to improve performance in vision-based autonomous drone racing. By combining human eye-gaze-based attention prediction and imitation learning, we enable a quadrotor to complete a challenging race track in a drone racing simulator. Our method outperforms state-of-the-art methods using raw images and image-based abstractions (i.e., feature tracks). For more details, check out the
paper and
dataset.
February 28, 2022
New RAL Paper: Minimum-Time Quadrotor Waypoint Flight in Cluttered Environments
Planning minimum-time trajectories for quadrotors in the presence of obstacles had so far been unaddressed by the robotics community. We propose a novel method to plan such trajectories in cluttered environments using a hierarchical, sampling-based method with an incrementally more complex quadrotor model. The proposed method is shown to outperform all related baselines in cluttered environments and is further validated in real-world flights at over 60 km/h. Check out our paper, video, and code.
February 17, 2022
New RAL Paper: Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study

In this work, we systematically compare the advantages and limitations of the discrete and continuous vision-based SLAM formulations.
We perform an extensive experimental analysis, varying robot type, speed of motion, and sensor modalities. Our experimental analysis suggests that, independently of the trajectory type, continuous-time SLAM is superior to its discrete counterpart whenever the sensors are not time-synchronized. For more details, check out the paper and code.
February 15, 2022
Perception-Aware Perching on Powerlines with Multirotors
Multirotor aerial robots are becoming widely used for the inspection of powerlines. To enable continuous, robust inspection without human intervention, the robots must be able to perch on the powerlines to recharge their batteries. This paper presents a novel perching trajectory generation framework that computes perception-aware, collision-free, and dynamically-feasible maneuvers to guide the robot to the desired final state.
For more details, check out the paper and video.
The developed code is available online: code
February 9, 2022
New RAL Paper: Nonlinear MPC for Quadrotor Fault-Tolerant Control
The mechanical simplicity, hover capabilities, and high agility of quadrotors have led to their fast adoption in industry for inspection, exploration, and urban aerial mobility. On the other hand, the unstable and underactuated dynamics of quadrotors render them highly susceptible to system faults, especially rotor failures. In this work, we propose a fault-tolerant controller using nonlinear model predictive control (NMPC) to stabilize and control a quadrotor subjected to the complete failure of a single rotor. Check out our paper and video.
February 4, 2022
UZH-FPV Drone Racing Dataset Standing Leader Board
We are delighted to announce the standing leader board of the UZH-FPV drone racing dataset.
Participants submit the results of their VIO algorithms and receive the evaluation in a few minutes thanks to our automatic code evaluation.
For more details, check out the website!
We look forward to receiving your submissions to advance the state of the art in high-speed VIO state estimation.
February 2, 2022
New RAL Paper: Bridging the Gap between Events and Frames through Unsupervised Domain
Adaptation
To overcome the shortage of event-based datasets, we propose a task transfer method that
allows models to be trained directly with labeled images and unlabeled event data.
Our method transfers from single images to events and does not rely on paired sensor data.
Thus, our approach unlocks the vast amount of image datasets for the training of event-based
neural networks.
For more details, check out the paper, video, and code.
January 31, 2022
New RAL Paper: AutoTune: Controller Tuning for High-speed Flight
Tired of tuning your controllers by hand? Check out our RAL22 paper "AutoTune: Controller Tuning for High Speed Flight". We propose a gradient-free method based on Metropolis-Hastings sampling to automatically find parameters that maximize the performance of a controller during high-speed flight. We outperform both existing methods and human experts! Check out the paper, video, and code.
January 28, 2022
RPG research on event cameras featured in The Economist!
Excited to see our research on event cameras featured in The Economist! Check it out!
January 10, 2022
RPG research makes it to the top 10 UZH news of 2021!
Our press release on time-optimal trajectory planning from July 2021 made it to the top 10 most successful media releases of UZH in 2021, just behind the media release on the FDA-approved Alzheimer's drug! Check it out!
January 10, 2022
3DV Oral Paper: Dense Optical Flow from Event Cameras
We propose E-RAFT, a novel method to estimate dense optical flow from events only, alongside DSEC-Flow, an extension of DSEC for optical flow estimation.
Download the datasets and submit to the DSEC-Flow benchmark that automatically evaluates your submission.
For more details, check out the paper, video, and project webpage. Our code is available on GitHub.
December 15, 2021
Policy Search for Model Predictive Control
We propose a novel method to merge reinforcement learning and model predictive control.
Our approach enables a quadrotor to fly through dynamic gates.
The paper has been accepted for publication in the IEEE Transactions on Robotics (T-RO), 2022.
Check out our paper and the code.
December 8, 2021
3DV Paper: Event-based Structured Light
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
Our method is robust to event jitter and therefore performs better at higher scanning speeds.
Experiments demonstrate that our method can deal with high-speed motion and outperform state-of-the-art 3D reconstruction methods based on event cameras, reducing the RMSE by 83% on average, for the same acquisition time.
For more details, check out the
project page,
paper,
code, and
video.
November 1, 2021
Davide Scaramuzza invited speaker at Tartan SLAM Series
The goal of the Tartan SLAM Series is to expand the understanding of those both new and experienced with SLAM.
Sessions include research talks, as well as introductions to various themes of SLAM and thought-provoking open-ended discussions. The lineup of events aims to foster fun, provocative discussions on robotics.
In his talk, Davide Scaramuzza speaks about the main progresses of our lab in SLAM over the past years.
He also introduces event-cameras and speaks about their potential applications in visual SLAM.
Check out the slides and the video on YouTube!
October 21, 2021
Code Release: SVO Pro
We are excited to release fully open source SVO Pro! SVO Pro is the latest version of SVO developed over the past few years in our lab.
SVO Pro features the support of different camera models, active exposure control, a sliding window based backend, and global bundle adjustment with loop closure.
Check out the project page and the code on GitHub!
October 20, 2021
New 3DV paper: Event Guided Depth Sensing
We present an efficient bio-inspired event-camera-driven depth sensing algorithm.
Instead of uniformly sensing the depth of the scene, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and sparsely illuminate areas in the field of view with no motion.
We show that, in natural scenes like autonomous driving and indoor environments, moving edges correspond to less than 10% of the scene on average. Thus
our setup requires the sensor to scan only 10% of the scene, which could lead to almost 90% less power consumption by the illumination source.
For more details, check out the paper and video.
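As a rough illustration of this event-guided sampling (a sketch, not the implementation from the paper), the following Python snippet builds an illumination mask from a window of events; the tile size and activity threshold are assumptions chosen for the example:

import numpy as np

def illumination_mask(event_x, event_y, resolution=(480, 640),
                      tile=16, min_events_per_tile=20):
    # Count events per tile; tiles with enough recent activity are scanned densely.
    h, w = resolution
    counts = np.zeros((h // tile, w // tile), dtype=np.int32)
    np.add.at(counts, (event_y // tile, event_x // tile), 1)
    active = counts >= min_events_per_tile
    # Upsample the per-tile decision back to pixel resolution.
    return np.kron(active, np.ones((tile, tile), dtype=bool))

# Toy usage: events concentrated in one corner, so only a small fraction is scanned.
rng = np.random.default_rng(0)
ex = rng.integers(0, 64, 5000)
ey = rng.integers(0, 64, 5000)
mask = illumination_mask(ex, ey)
print("fraction of the scene scanned:", mask.mean())

When only about 10% of the pixels show activity, roughly 90% of the illumination (and the corresponding power) can be skipped.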
October 20, 2021
We are hiring!
Come build the future of robotics with us!
We have three fully-funded openings for PhD students and
Postdocs in computer vision and
machine learning
to contribute to the areas of:
- Vision-based agile flight,
- Autonomous inspection of power lines,
- SLAM, Scene Understanding, and Computational Photography
with Event Cameras.
Job descriptions and how to apply.
October 10, 2021
Drone Documentary from the Swiss Italian TV (LA1)
Check out the interview from the Swiss Italian TV LA1 on our research on drone racing and high-speed navigation.
We explain why high-speed drones could make a difference in the future of search and rescue operations.
In Italian with English subtitles!
October 6, 2021
Article Published in Science Robotics!
We are excited to share our latest Science
Robotics paper, done in collaboration with Intel!
An end-to-end policy trained in simulation flies vision-based drones in the wild at up to 40
kph!
In contrast to classic methods, our approach uses a CNN to directly map images to
collision-free trajectories.
This approach radically reduces latency and sensitivity to sensor noise, enabling high-speed
flight.
The end-to-end policy has taken our drones on many adventures in Switzerland!
Check out the video on YouTube! We also release
the code and datasets on GitHub!
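As a rough sketch of the end-to-end idea (one network that maps a camera image directly to a short collision-free trajectory), here is a toy PyTorch model; the depth-image input, layer sizes, and waypoint output parameterization are placeholder assumptions, not the architecture from the paper:

import torch
import torch.nn as nn

class Image2Trajectory(nn.Module):
    # Toy CNN mapping one depth image to N future waypoints (x, y, z).
    def __init__(self, num_waypoints=10):
        super().__init__()
        self.num_waypoints = num_waypoints
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_waypoints * 3)

    def forward(self, depth):
        features = self.encoder(depth).flatten(1)
        return self.head(features).view(-1, self.num_waypoints, 3)

net = Image2Trajectory()
waypoints = net(torch.rand(1, 1, 96, 160))  # dummy depth image
print(waypoints.shape)  # torch.Size([1, 10, 3])

The actual policy described above is trained in simulation and then deployed directly on the real drone.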
October 1, 2021
Code Release: Time-Optimal Quadrotor Planning
We are excited to release the code
accompanying our latest Science Robotics
paper on time-optimal quadrotor trajectories!
This provides an example implementation of our novel
progress-based formulation to generate time-optimal trajectories
through multiple waypoints while exploiting, but not violating
the quadrotor's actuation constraints.
Check out our real-world
agile flight footage with explanations and find the
details in the paper
on Science Robotics, and find the code on GitHub.
October 1, 2021
IROS2021 Workshop: Integrated Perception, Learning, and Control for Agile Super Vehicles
Do not miss our IROS2021 Workshop: Integrated Perception, Learning, and Control for Agile
Super Vehicles!
Check out the agenda and join the presentations at our
workshop website.
Organized by Giuseppe Loianno, Davide Scaramuzza, Sertac Karaman.
The workshop is today, October the 1st, and starts at 3pm Zurich time
(GMT+2).
October 1, 2021
New Arxiv Preprint: The Hilti SLAM Challenge Dataset

We release the Hilti SLAM Challenge
Dataset!
The sensor platform used to collect this dataset contains a number of visual, lidar and
inertial sensors which have all been rigorously calibrated. All data is temporally aligned
to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow
direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor
calibration data from twelve datasets in various environments is provided. Each environment
represents common scenarios found in building construction sites in various stages of
completion.
For more details, check out the paper and video.
September 26, 2021
RPG wins the Tech Briefs "Create the Future" contest in the Aerospace and Defense category

Our work on controlling a quadrotor after motor failure using only onboard vision sensors (paper) won the Aerospace and Defense category in the 2021 Tech Briefs "Create the Future" contest out of over 700 participants worldwide! Watch the announcement of all the winners and finalists here.
September 15, 2021
New Arxiv Preprint: Expertise Affects Drone Racing Performance

We present an analysis of drone racing performance of professional and beginner pilots. Our
results show that professional pilots consistently outperform beginner pilots and choose more
optimal racing lines. Our results provide strong evidence for a contribution of expertise to
performance in real-world human-piloted drone racing. We discuss the implications of these
results for future work on autonomous fast and agile flight. For more details, check out the
paper.
September 13, 2021
Our work was selected as IEEE Transactions on Robotics 2020 Best Paper Award finalist
Honored that our IEEE Transactions on Robotics 2020 paper "Deep Drone Racing: From
Simulation to Reality with Domain Randomization" was selected as a Best Paper Award finalist!
Congratulations to all collaborators for this great achievement!
PDF YouTube 1 YouTube 2 Code
September 13, 2021
Range, Endurance, and Optimal Speed Estimates for Multicopters (Accepted at RAL)

We present an approach to accurately estimate the range, endurance, and optimal flight speed
for general multicopters. This is made possible by combining a state-of-the-art first-principles
aerodynamic multicopter model with an electric-motor model and a precise graybox battery model.
Additionally, we present an accurate pen-and-paper algorithm, derived from the complex model, to estimate the range, endurance, and optimal speed of multicopters.
For more details, check out the
paper.
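To make "range, endurance, and optimal speed" concrete, here is a toy Python example; the power curve and battery energy below are invented numbers for illustration, whereas the paper derives the real power model from first-principles aerodynamics, an electric-motor model, and a graybox battery model:

import numpy as np

# Hypothetical electrical power draw P(v) in watts as a function of airspeed v in m/s.
def power(v):
    return 180.0 + 0.9 * v**2 + 400.0 / (v + 1.0)

battery_energy_wh = 80.0                      # assumed usable battery energy [Wh]
v = np.linspace(0.5, 25.0, 500)               # candidate cruise speeds [m/s]
endurance_h = battery_energy_wh / power(v)    # flight time at constant speed [h]
range_km = v * endurance_h * 3.6              # distance covered in that time [km]

print("best endurance: %.1f min at %.1f m/s" % (60 * endurance_h.max(), v[endurance_h.argmax()]))
print("best range: %.1f km at %.1f m/s" % (range_km.max(), v[range_km.argmax()]))

The speed that maximizes endurance minimizes the power draw P(v), while the speed that maximizes range minimizes the energy spent per distance, P(v)/v.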
September 10, 2021
New Arxiv Preprint: Performance, Precision, and Payloads: Adaptive Nonlinear MPC for
Quadrotors
We propose L1-NMPC, a novel hybrid adaptive NMPC to learn model uncertainties online and
immediately compensate for them, drastically
improving performance over non-adaptive baselines with minimal computational overhead.
Our proposed architecture generalizes across many different environments; we evaluate it under wind, unknown payloads, and highly agile flight conditions.
For more details, check out the
paper and
video.
September 9, 2021
New Arxiv Preprint: A Comparative Study of Nonlinear MPC and Differential-Flatness-Based
Control for Quadrotor Agile Flight
We perform a comparative study of two state-of-the-art control methods for agile quadrotor flight in terms of trajectory-tracking accuracy, robustness, and computational efficiency.
A wide variety of agile trajectories are tracked in this research at speeds up to 72 km/h. We
show the superiority of NMPC in tracking dynamically infeasible trajectories at the cost of
higher
computation time and risk of numerical convergence issues. An inner-loop controller using the
incremental nonlinear dynamic inversion (INDI) is proposed to hybridize with both methods,
demonstrating more than 78% tracking error reduction. Non-expert readers can regard this work as
a tutorial on agile quadrotor flight.
For more details, check out the
paper and
video.
September 8, 2021
New Arxiv Preprint: Model Predictive Contouring Control for Time-Optimal Quadrotor
Flight
We propose a Model Predictive Contouring Control (MPCC) method to fly time-optimal trajectories through multiple waypoints with quadrotors.
Our MPCC optimally selects the future states of the platform at runtime, while maximizing the
progress along the reference path and minimizing the distance to it.
We show that, even when tracking simplified trajectories, the proposed MPCC results in a path
that approaches the true time-optimal one, and which can be generated in real-time.
We validate our approach in the real world, where we show that our method outperforms both the current state of the art and a world-class human pilot in terms of lap time, achieving speeds of up to 60 km/h.
For more details, check out the
paper and
video.
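Schematically, and in our own notation rather than the paper's, the contouring objective trades progress along the reference path against the distance to it:

\min_{\mathbf{u}_{0:N-1}} \; \sum_{k=0}^{N-1} \Big( q_c \,\|\mathbf{e}^{\perp}_k\|^2 \;-\; \mu \, v_{\theta,k} \;+\; \mathbf{u}_k^{\top} R \,\mathbf{u}_k \Big)
\quad \text{s.t.} \quad \mathbf{x}_{k+1} = f(\mathbf{x}_k, \mathbf{u}_k),

where \mathbf{e}^{\perp}_k is the contour error (the distance of the predicted position to the reference path), v_{\theta,k} is the progress rate along the path, and q_c, \mu, R are weights. Tuning this trade-off is what lets the solver decide at runtime how closely to follow the path versus how fast to advance along it.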
September 2, 2021
HILTI-SLAM Challenge: win up to $10,000 prize money and keynote invitation
RPG and HILTI are organizing the IROS2021 HILTI SLAM Challenge! Participants can win up to
$10,000 prize money and a keynote IROS workshop invitation! Instructions here.
The HILTI SLAM Challenge dataset is a real-life, multi-sensor dataset with accurate ground
truth to advance the state of the art in highly accurate state estimation in challenging
environments. Participants will be ranked by the completeness of their trajectories and by
the achieved accuracy.
HILTI is a multinational company that offers premium
products and services for professionals on construction sites around the globe. Behind this
vast catalog is a global team comprising 30,000 team members from 133 different
nationalities located in more than 120 countries.
August 29, 2021
New Arxiv Preprint: Dense Optical Flow from Event Cameras
We propose a novel method to estimate dense optical flow from events only, alongside an
extension of DSEC for optical flow estimation.
Our approach takes inspiration from frame-based methods and outperforms previous event-based
approaches with up to 66% EPE reduction.
For more details, check out the paper and video.
August 20, 2021
New IROS Paper & Code Release: Powerline Tracking with Event Cameras
We propose a method that uses event cameras to robustly track lines and show an application
for powerline tracking.
Our method identifies lines in the stream of events by detecting planes in the
spatio-temporal signal, and tracks them through time.
For more details, check out the paper and video.
We release the code fully open
source.
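To illustrate the spatio-temporal plane idea (a line moving at constant image velocity sweeps out a plane in (x, y, t)), here is a toy least-squares fit in Python; the clustering and robust tracking of the actual method are omitted:

import numpy as np

def fit_event_plane(x, y, t):
    # Fit a plane through the events (x, y, t); its normal encodes the
    # line's image orientation and apparent velocity.
    pts = np.column_stack([x, y, t])
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)   # smallest right singular vector = plane normal
    normal = vt[-1]
    return normal, -normal @ centroid

# Toy data: a vertical line at x = 2 + 5*t sweeping across the sensor, plus noise.
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 0.1, 2000)
y = rng.uniform(0.0, 1.0, 2000)
x = 2.0 + 5.0 * t + rng.normal(0.0, 0.01, 2000)
normal, offset = fit_event_plane(x, y, t)
print(normal / normal[0])   # ~[1, 0, -5]: the t-component recovers the line's velocity of 5 px per time unit

Tracking then amounts to updating such plane parameters as new events arrive.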
August 17, 2021
Davide Scaramuzza invited speaker at Real Roboticist
The series Real Roboticist, produced by the 2020 IEEE/RSJ International Conference on
Intelligent Robots and Systems (IROS),
shows the people at the forefront of robotics research from a more personal perspective.
In his talk, Davide Scaramuzza explains his journey from Electronics Engineering to leading
a top robotics vision research group developing a promising technology: event cameras.
He also speaks about the challenges he faced along the way, and even how he combines robotics research with another of his passions: magic.
Read
the article and
watch the talk. Enjoy!
August 6, 2021
RPG Contributes to CARLA Optical Flow Camera
CARLA is the world's leading simulator for autonomous driving, developed by Intel.
Our lab contributed to the implementation of the optical flow camera,
requested by the community since the inception of the simulator.
Check out the release video for a short
teaser and the documentation
for more information on how to use it.
July 21, 2021
Time-Optimal Quadrotor Planning faster than Humans
We are excited to announce our latest work on agile flight
allowing us to generate "time-optimal quadrotor trajectories",
which are faster than human drone racing pilots!
Our novel algorithm published in Science Robotics uses a
progress-based formulation to generate time-optimal trajectories
through multiple waypoints while exploiting, but not violating
the quadrotor's actuator constraints.
Check out our real-world
agile flight footage with explanations and find the
details in the paper
on Science Robotics.
June 30, 2021
The World's Largest Indoor Drone-Testing Arena
We are excited to announce our new, indoor, drone-testing arena!
Equipped with a real-time motion-capture system consisting of 36
Vicon cameras, and with a flight space of over 30x30x8 meters
(7,000 cubic meters), this large research infrastructure allows
us to deploy our most advanced perception, learning, planning,
and control algorithms to push vision-based agile drones to
speeds over 60 km/h and accelerations over 5g.
It also allows us to fly in an unlimited number of virtual
environments using hardware-in-the-loop simulation.
Among the many projects we are currently working on, we aim to
beat the best professional human pilot in a drone race.
Turn up the volume and enjoy the video!
And stay tuned... the best is about to come... very soon!
June 25, 2021
New RSS Paper & Dataset Release: NeuroBEM
We are happy to announce the release of the full dataset
associated with our upcoming RSS paper NeuroBEM:
Hybrid
Aerodynamic Quadrotor Model.
The dataset features over 1h15min of highly aggressive maneuvers
recorded at high accuracy in one of the world's largest optical
tracking volumes.
We provide time-aligned quadrotor states and motor commands
recorded at 400 Hz in a curated dataset.
For more details, check out our paper, dataset and video.
June 25, 2021
Fast Feature Tracking with ROS
Our work on GPU-optimized feature detection and tracking is now
available as a simple ROS node.
It implements GPU-optimized FAST, Harris, and Shi-Tomasi
detectors and KLT tracking, running at hundreds of FPS on a
Jetson TX2.
For more details, check out our paper Faster than FAST
and
code.
June 11, 2021
TimeLens: Event-based Video Frame Interpolation
TimeLens is a new
event-based video frame interpolation method that generates high-speed video from
low-framerate RGB frames and asynchronous events. Learn more
about TimeLens over at our project page
where you can find code, datasets and more!
We also release a High-Speed Event and RGB dataset which
features complex scenarios like bursting balloons and spinning
objects!
June 10, 2021
Video recordings of the ICRA 2021 Workshop on Perception and Action in Dynamic Environments are now available!
On June 4, 2021, Antonio Loquercio (RPG), Davide Scaramuzza
(RPG), Luca Carlone (MIT), and Markus Ryll (TUM)
organized the 1st International Workshop on Perception and
Action in Dynamic Environments at ICRA.
May 18, 2021
Workshop on Perception and Action in Dynamic Environments
Do not miss our #ICRA2021 workshop on Perception and Action in
Dynamic Environments!
Check out the agenda and join the presentations at our
workshop website.
Organized by Antonio Loquercio, Davide Scaramuzza, Markus Ryll,
Luca Carlone.
The workshop is on June the 4th and starts at 4pm Zurich time
(GMT+2).
May 18, 2021
CVPR competition on stereo matching
We are delighted to announce our CVPR event-based vision
workshop competition on disparity/depth prediction on the new DSEC dataset. Visit
our website
for more details about the competition.
Submission deadline is the 11th of June.
May 18, 2021
Davide Scaramuzza listed among the most influential scholars in
robotics
Congratulations to our lab director, Davide Scaramuzza, for being
listed among the 100 most influential robotics scholars by AMiner
[ Link ].
May 11, 2021
Antonio Loquercio successfully passed his PhD defense
Congratulations to Antonio Loquercio, who has successfully defended
his PhD dissertation titled
"Agile Autonomy: Learning Tightly-Coupled Perception-Action for
High-Speed Quadrotor Flight in the Wild", on May 10, 2021.
We thank the reviewers: Prof. Pieter Abbeel, Prof. Angela
Schoellig and Prof. Roland Siegwart!
The full video of the PhD defense
presentation is on YouTube.
May 10, 2021
IEEE Transactions on Robotics Best Paper Award Honorable Mention
Our paper Deep Drone Racing: from Simulation to Reality with
Domain Randomization wins the prestigious IEEE Transactions on
Robotics Best Paper Award Honorable Mention: PDF YouTube 1 YouTube 2 Code
May 7, 2021
How to Calibrate Your Event Camera
We propose a generic event camera calibration framework using
image reconstruction.
Check out our Code and
PDF
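The idea is that once intensity frames have been reconstructed from the event stream, any standard calibration pipeline applies. A minimal sketch with OpenCV, assuming reconstructed grayscale checkerboard frames have already been saved to a hypothetical folder:

import glob
import cv2
import numpy as np

pattern = (9, 6)   # inner-corner count of the checkerboard; adjust to your target
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in sorted(glob.glob("reconstructed_frames/*.png")):   # hypothetical path
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(img, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, img.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsics:\n", K)

The framework in the paper obtains such frames via image reconstruction from events; this sketch only shows the conventional calibration that can follow.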
April 30, 2021
DodgeDrone Challenge
We have organized a challenge to push the current state of the art in agile navigation in dynamic environments.
In this challenge, drones will have to avoid moving boulders
while flying in a forest!
Deadline for submission is June the 1st! The winner will
be awarded a Skydio 2!
Participate now at https://uzh-rpg.github.io/PADE-ICRA2021/ddc/!
April 26, 2021
Read how our research inspired Ingenuity's flight on Mars
Our research inspired the design of the vision-based navigation
technology behind the Ingenuity helicopter that flew on Mars.
Read the full article on SwissInfo [ English ],
[ Italian ].
April 23, 2021
NASA collaborates with RPG
Our lab is collaborating with NASA/JPL to investigate event
cameras for the next Mars helicopter missions! Read the full
interview on SwissInfo with Davide Scaramuzza [ Link ].
April 23, 2021
Davide Scaramuzza invited speaker at GRASP on Robotics
Davide Scaramuzza talks about "Autonomous, Agile Micro Drones:
Perception, Learning, and Control" at GRASP
on Robotics seminar series organized by the GRASP laboratory at
University of Pennsylvania.
In this talk, he shows how the combination of both model-based
and machine learning methods united with
the power of new, low-latency sensors, such as event cameras,
can allow drones to achieve unprecedented
speed and robustness by relying solely on onboard computing.
Watch the
presentation! Enjoy!
April 14, 2021
DSEC: Event Camera Dataset is Out!
DSEC is a new driving dataset with stereo VGA event cameras, RGB
global shutter cameras and disparity
groundtruth from Lidar.
Download DSEC now to reap
the benefits of this multi-modal
dataset with high-quality calibration.
We also accompany the dataset with code and
documentation.
Check out our video,
and
paper
too! Stay tuned for more!
March 18, 2021
Autonomous Drone Racing with Deep Reinforcement Learning
We present Autonomous Drone Racing with Deep RL, the first
learning-based method that can
achieve near-time-optimal performance in drone racing. Check out
the Preprint
and the Video.
March 15, 2021
1st Workshop on Perception and Action in Dynamic Environments at
ICRA 2021
We organized a #ICRA2021 workshop on perception and action in
dynamic environments!
We brought together amazing keynote speakers and also organized
a competition on drone navigation in a
forest (Prize is a Skydio2)! All we need is you!
Check out our website here
for more info and the
current list of invited speakers.
March 8, 2021
Check out our work on Visual Processing and Control in Human Drone
Pilots!
Our work on Visual Processing and Control in Human Drone Pilots
has been accepted in the IEEE Robotics and
Automation Letters. Check out our Video, the Paper, and
Open-Source Dataset
too!
February 19, 2021
Check out our Event Camera Simulator, ESIM, now with Python bindings
and GPU support!
Our event camera simulator ESIM now features Python
bindings and GPU support for fully parallel event
generation! Check out our
project page, code
and
paper.
February 12, 2021
Check out our work on Combining Events and Frames using Recurrent
Asynchronous Multimodal Networks!
Our work on combining events and frames using recurrent
asynchronous multimodal networks has been accepted
in the IEEE Robotics and Automation Letters. Check out the paper, the
project page,
and the source
code.
February 12, 2021
Check out our work on data-driven MPC for quadrotors!
Our work on data-driven MPC for quadrotors has been accepted in
the IEEE Robotics and Automation Letters.
Check out the paper, the
video, and the
source code.
February 09, 2021
Our work on autonomous flight despite motor failure is featured on
IEEE Spectrum
Our latest work on autonomous quadrotor flight despite rotor
failure with onboard vision sensors (frames
or event cameras) was featured on IEEE
Spectrum. For more details, read the paper here
and watch the video here.
Source code here.
January 25, 2021
3rd Workshop on Event-based Vision at CVPR 2021
We are organizing the "3rd Workshop on Event-based Vision",
which will take place in June at CVPR2021. The
paper submission deadline is March 27. Check out our website here
for more info and the current list of invited
speakers.
January 14, 2021
Check out our work in the new Flying Arena!
Davide Scaramuzza and some of the lab's members talk about our
work on drone racing in the new Flying
Arena. Watch Davide Scaramuzza interview here.
Watch Elia Kaufmann
interview here.
Watch Christian Pfeiffer interview here.
January 13, 2021
Check out our work on how to keep drones flying when a motor fails!
Our work on controlling a quadrotor after motor failure with
only onboard vision sensors has been
accepted in the IEEE Robotics and Automation Letters. Check out
the paper, the video, and the
source code.
January 12, 2021
Paper accepted in IJCV!
Our work on generating accurate reference poses for visual
localization datasets has been accepted in the
International Journal of Computer Vision. Check out the paper here,
and the Aachen Day-Night v1.1 dataset in the paper can be
accessed via the online visual
localization benchmark service.
January 11, 2021
Check out our new startup SUIND!
We are super excited to announce SUIND, our latest spin-off!
Leveraging years of research in our lab,
SUIND is building a groundbreaking safety suite for drones.
Proud to see our former members Kunal
Shrivastava and Kevin Kleber making a true impact in the
industry! Read more
here.