Robotics and Perception Group

Past News

December 29, 2022

IEEE Top 10 Robotics Stories of 2022


It's an honor to be featured in the top 10 robotics stories of 2022 by IEEE Spectrum! Kudos and congratulations to our team that made this possible!

December 27, 2022

NCCR Robotics Most Impactful Paper Award


We won the NCCR Robotics Most Impactful Paper Award with the paper "A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots". Congrats to Alessandro Giusti and his co-authors!

December 24, 2022

12 Years of NCCR Robotics


After 12 amazing years, NCCR Robotics, the Swiss National Centre of Competence in Research in Robotics, has come to an end. I’m very proud to have been part of this! This RoboHub article summarizes all the key achievements, from assistive technologies that allowed patients with completely paralyzed legs to walk again, to winning the DARPA SubT Challenge, to legged and flying robots with self-learning capabilities for use in disaster mitigation as well as in civil and industrial inspection, to robotic startups that have become world leaders, to creating Cybathlon, the world's first Olympic-style competition for athletes with disabilities supported by assistive devices, to educational robots, such as Thymio, that have been used by thousands of children around the world. Congrats to all NCCR Robotics members who have made this possible! NCCR Robotics will continue to operate in four different projects. Check out this article to learn more: link.

December 16, 2022

Survey on visual SLAM for visually impaired people


We present the first survey on visual SLAM for visually impaired people. This technology has tremendous potential to assist people, and it will be used, for the first time, in the next Cybathlon competition, in which we will participate. For more information, have a look at our paper and the Cybathlon website.

December 1, 2022

10-Year Lab Anniversary

This week, we celebrate the 10th anniversary of RPG! This video celebrates our anniversary, the over 300 people who worked in our lab as BSc/MSc/PhD students, postdocs, and visiting researchers, all our collaborators, our research sponsors, and the administrative staff at our university. We thank all of them for contributing to our research. And thank you as well for following our research. The lab made important contributions to autonomous, agile, vision-based navigation of micro aerial vehicles and to event cameras for mobile robotics and computer vision. Three startups and entrepreneurial projects came out of the lab: the first one, Zurich Eye, became Facebook-Meta Zurich, which contributed to the development of the VR headset Oculus Quest; the second one, Fotokite, makes tethered drones for first responders; the third one, SUIND, makes vision-based drones for precision agriculture. Our researchers have won over 50 awards, including many paper awards, have published more than 100 scientific articles, which have been cited more than 35 thousand times, and have been featured in many media, including The New York Times, Forbes, and The Economist (media page). We have also released more than 85 open-source software packages, datasets, and toolboxes to further accelerate the advancement of science and the reproducibility of our research (software page). Our algorithms have inspired and have been transferred to many products and companies, including NASA, DJI, Bosch, Nikon, Magic Leap, Meta-Facebook, Huawei, Sony, and Hilti. Thank you for making all this possible! Video.

November 30, 2022

Authorship Attribution through Deep Learning


Can you guess who wrote a paper, just by reading it? We present a transformer-based AI that achieves over 70% accuracy on the newly created, largest-to-date authorship-attribution dataset, with over 2,000 authors. For more information, check out our paper and open-source code.

November 23, 2022

Pushing the Limits of Asynchronous Graph-based Object Detection with Event Cameras


We introduce various design principles that push the limits of asynchronous graph-based object detection from events by allowing us to design deeper, more powerful models without sacrificing efficiency. While our smallest such model outperforms the best asynchronous methods by 7.4 mAP with 3.7 times higher efficiency, our largest model even outperforms dense, feedforward methods, a feat previously unattained by asynchronous methods. For more information, check out our paper.

November 7, 2022

RPG featured in NZZ documentary on Military Drones


In the recent NZZ Format documentary on military drones, our lab is featured in its role as a civil research institution working on possible dual-use technology. Our search-and-rescue technology is shown to underline the huge potential of drones for use in critical missions, possibly saving many lives. Link

November 7, 2022

RPG Drones at the Swiss Robotics Day featured in SRF Tagesschau!

Our autonomous vision-based drones are featured in the SRF Tagesschau (05.11.2022) report on the NCCR Swiss Robotics Day in Lausanne. We demonstrate how the technology we develop can be used in GPS-denied environments that are commonly encountered in, for example, search-and-rescue scenarios. YouTube [DE], YouTube [IT], SRF [DE], RSI [IT]

October 28, 2022

The Robotics and Perception Group participated in the parabolic flight campaign of the UZH Space Hub to study how gravity affects the decision-making of human drone pilots.

October 27, 2022

Learned Inertial Odometry for Autonomous Drone Racing


We propose a learning-based odometry algorithm that uses an inertial measurement unit (IMU) as the only sensor modality for autonomous drone racing tasks. The core idea of our system is to couple a model-based filter, driven by the inertial measurements, with a learning-based module that has access to the control commands. For more information, check out our paper and video.
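The filter-plus-learned-module coupling described above can be sketched as follows. This is a toy illustration under stated assumptions, not the paper's implementation: gravity and rotation are omitted from the propagation, the learned network is replaced by a linear stand-in, and the fusion gain is an arbitrary placeholder.

```python
import numpy as np

def imu_propagate(state, accel, gyro, dt):
    """Model-based propagation: integrate velocity and position from raw
    IMU measurements (gravity and rotation omitted for brevity)."""
    pos, vel = state
    vel = vel + accel * dt  # velocity update from measured acceleration
    pos = pos + vel * dt    # position update from the new velocity
    return pos, vel

class LearnedCorrection:
    """Stand-in for the learned module: maps recent control commands to a
    velocity correction (a real system would use a trained neural network)."""
    def __init__(self, weights):
        self.weights = np.asarray(weights)

    def __call__(self, commands):
        return self.weights @ np.asarray(commands)

def fuse(state, correction, gain=0.1):
    """Blend the filter state with the learned drift correction."""
    pos, vel = state
    return pos, vel + gain * correction

# Toy usage: one propagation step followed by a (zero) learned correction.
state = (np.zeros(3), np.zeros(3))
state = imu_propagate(state, np.array([0.0, 0.0, 1.0]), np.zeros(3), 0.1)
state = fuse(state, LearnedCorrection(np.zeros((3, 4)))(np.ones(4)))
```

The design choice worth noting is that the filter remains purely model-based, while the learned part only supplies corrections, so the estimator degrades gracefully if the network output is poor.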

October 14, 2022

Code release: Data-Efficient Collaborative Decentralized Thermal-Inertial Odometry


We released the code and datasets for our work "Data-Efficient Collaborative Decentralized Thermal-Inertial Odometry" with NASA JPL, extending the already-public JPL xVIO library. With this work, we unleash collaborative drone swarms in the dark, opening new challenging scenarios for the robotics community. For more details, visit the project page.

October 4, 2022

Zero Gravity - RPG participates in Parabolic Flight Campaign


Today, we performed our first experiment in reduced, hyper, and zero gravity! Our goal: to study how different gravity levels affect self-motion estimation in drone pilots, in view of future human space missions. This unique opportunity was made possible by the UZH Space Hub and the Netherlands Aerospace Centre! With Christian Pfeiffer and Leyla Loued-Khenissi. For more information, check out our article or this video.

October 4, 2022

Code release: Event-based Vision meets Deep Learning on Steering Prediction for Self-driving Cars


We are releasing the code for our work which uses event-based vision and deep learning methods to predict the steering angle of self-driving cars. For more details, see our paper.

September 16, 2022

NCCR Robotics Master Thesis Award

Congratulations to our former Master student Michelle Ruegg for winning the NCCR Robotics Master Thesis Award for her thesis on combining frames and events for asynchronous multi-modal monocular depth prediction! The thesis was supervised by Daniel Gehrig and Mathias Gehrig.

September 6, 2022

We are hiring


We have multiple openings for PhD students and Postdocs in Reinforcement Learning for Agile Vision-based Navigation and Computer Vision with Standard Cameras and Event Cameras. Job descriptions and how to apply: https://rpg.ifi.uzh.ch/positions.html

September 1, 2022

New Research Assistant

We warmly welcome Nikola Zubić as a new research assistant in our lab!

August 26, 2022

The HILTI SLAM Challenge 2022 paper and dataset are out!


Check out the paper describing the HILTI SLAM Challenge 2022 and the new dataset collected in collaboration with Oxford University. For more details, see our paper and dataset.

August 26, 2022

E-NeRF: Neural Radiance Fields from a Moving Event Camera


Check out our joint paper with Simon Klenk and Daniel Cremers from TU Munich on how to estimate a neural radiance field (NeRF) either from a single moving event camera or from an event camera in combination with a standard camera. We show that we can estimate NeRFs with higher accuracy than with standard cameras in scenes affected by motion blur or when only a few sparse frames are available. For more details, see our paper.

August 2, 2022

New ECCV Paper: ESS: Learning Event-based Semantic Segmentation from Still Images


We are excited to announce our ECCV paper, which overcomes the lack of semantic segmentation datasets for event cameras by directly transferring the semantic segmentation task from existing labeled image datasets to unlabeled events. Our approach neither requires video data nor per-pixel alignment between images and events. For more details, check out the paper, video, code, and dataset.

August 1, 2022

New Research Assistant

We warmly welcome Vincenzo Polizzi as a new research assistant in our lab!

July 31, 2022

RPG on the main German TV Kids program "1, 2 oder 3" on ZDF!


Leonard Bauersfeld and Elia Kaufmann were invited to the famous German TV program "1, 2 oder 3" to talk about drones. Watch the full video in the ZDF Mediathek here (available until 28.08.2022). The part featuring RPG starts at 14:45.
Photo: ZDF/Ralf Wilschewski.

July 29, 2022

Dataset and Code release for EKLT-VIO


We are excited to announce that the code and datasets for our RA-L paper Exploring Event Camera-based Odometry for Planetary Robots have been released. Both code and datasets can be found here.

July 21, 2022

Time-optimal Online Replanning for Agile Quadrotor Flight

For the first time, a time-optimal trajectory can be generated and tracked in real-time, even with moving waypoints and strong unknown disturbances! Read our Time-optimal Online Replanning for Agile Quadrotor Flight paper and watch our IROS talk for further details.

July 13, 2022

RPG on the main Italian TV science program SuperQuark on RAI1!

Watch the full video report about our research on autonomous drones, from drone racing to search and rescue, from standard to event cameras. The video is in Italian with English subtitles.

July 7, 2022

First AI vs Human Drone Race!


On June 10-11, we organized the first race between an AI-powered vision-based drone and human pilots. We invited two world champions and the Swiss champion. Read this report by Evan Ackerman from IEEE Spectrum, who witnessed the historic event in person.

July 6, 2022

Code Release: UltimateSLAM


We are releasing UltimateSLAM, which combines events, frames, and IMU measurements to achieve the ultimate SLAM performance in high-speed and high-dynamic-range scenarios. Paper Code Video Project Webpage

July 5, 2022

IROS2022 Workshop: Agile Robotics: Perception, Learning, Planning, and Control


Do not miss our IROS2022 Workshop: Agile Robotics: Perception, Learning, Planning, and Control! Check out the agenda and join the presentations at our workshop website. Organized by Giuseppe Loianno, Davide Scaramuzza, Shaojie Shen.

July 4, 2022

Congratulations to our former PhD Antonio for winning the 2022 George Giralt Award!


Congratulations to our former PhD student Antonio Loquercio for winning the 2022 George Giralt PhD Award, the most prestigious award for PhD dissertations in robotics in Europe, for his work on learning vision-based high-speed drone flight! We are very proud of you!
PhD thesis PDF
Video of the PhD defense
Google Scholar profile
Personal page

July 1, 2022

New RA-L Paper: Learning Minimum-Time Flight in Cluttered Environments

We are excited to announce our RA-L paper, which tackles minimum-time flight in cluttered environments using a combination of deep reinforcement learning and classical topological path planning. We show that the approach outperforms the state of the art in both planning quality and the ability to fly without collisions at high speeds. For more details, check out the paper and the YouTube video.

June 17, 2022

New T-RO Paper: "A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight"

We are excited to announce that our paper "A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight" was accepted at T-RO 2022. Our work empirically compares two state-of-the-art control frameworks, the nonlinear model predictive controller (NMPC) and the differential-flatness-based controller (DFBC), by tracking a wide variety of agile trajectories at speeds up to 72 km/h. Read the paper for further details.

June 16, 2022

New RA-L paper: The Hilti SLAM Challenge Dataset


We release the Hilti SLAM Challenge Dataset! The sensor platform used to collect this dataset contains a number of visual, lidar and inertial sensors which have all been rigorously calibrated. All data is temporally aligned to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor calibration data from twelve datasets in various environments is provided. Each environment represents common scenarios found in building construction sites in various stages of completion. For more details, check out the paper, video and talk.

June 13, 2022

"Time Lens++: Event-based Frame Interpolation with Parametric Flow and Multi-scale Fusion" Dataset Release

We are excited to announce that our paper on Time Lens++ was accepted at CVPR 2022. To learn more about the next generation of event-based frame interpolation, visit our project page. There, we release our new dataset BS-ERGB, recorded with a beam splitter, which features aligned and synchronized events and frames.

June 3, 2022

Meet us at Swiss Drone Days 2022


We are excited to announce that the 2022 edition of the Swiss Drone Days will take place on 11-12 June in Dübendorf. The event will feature live demos, including autonomous drone racing, inspection, and delivery drones, in one of the largest drone flying arenas in the world; spectacular drone races by the Swiss Drone League; presentations by distinguished speakers; and an exhibition and trade fair. For more information, please visit www.swissdronedays.com

June 1, 2022

Two New PhD Students

We welcome Drew Hanover and Chao Ni as new PhD students in our lab!

May 27, 2022

Our work won the IEEE RAL Best Paper Award



We are honored that our IEEE Robotics and Automation Letters paper "Autonomous Quadrotor Flight Despite Rotor Failure With Onboard Vision Sensors: Frames vs. Events" was selected for the Best Paper Award. Congratulations to all collaborators!

PDF YouTube Code

May 20, 2022

Meet us at ICRA 2022!



We are looking forward to presenting these 9 papers on perception, learning, planning, and control in person next week at IEEE RAS ICRA! Additionally, we will be presenting in many workshops. A full list with links, times, and rooms can be found here.

May 5, 2022

UZH lists AI racing-drones as a key finding of 2021

The University of Zurich celebrated its 189th birthday. During the celebrations, Rector Prof. Michael Schaepman named drones that fly faster than humans, a testbed for AI research and search-and-rescue operations, as one of three key findings of UZH in 2021. A video of the speech can be found here (at 26:00 he starts to talk about drones).

May 4, 2022

New T-RO Paper: "Model Predictive Contouring Control for Time-Optimal Quadrotor Flight"

We are excited to announce that our paper on Model Predictive Contouring Control for Time-Optimal Quadrotor Flight was accepted at T-RO 2022. Thanks to our Model Predictive Contouring Control, the problem of flying through multiple waypoints in minimum time can now be solved in real-time. Read the paper for further details.

May 2, 2022

New Postdoc

We welcome Dr. Marco Cannici as a new postdoc in our lab!

April 28, 2022

EDS: Event-aided Direct Sparse Odometry


We are excited to announce that our paper on Event-aided Direct Sparse Odometry was accepted at CVPR 2022 for an oral presentation. EDS is the first direct method combining events and frames. This work opens the door to low-power motion-tracking applications where frames are sparingly triggered "on demand" and our method tracks the motion in between. For code, video and paper, visit our project page.

April 21, 2022

We are hiring


We have multiple openings for PhD students and Postdocs in machine learning for computer vision and vision-based robot navigation. Job descriptions and how to apply: https://rpg.ifi.uzh.ch/positions.html

April 21, 2022

New CVPRW Paper: Multi-Bracket High Dynamic Range Imaging with Event Cameras


We are excited to announce that our paper on combining events and frames for HDR imaging was accepted at the NTIRE22 workshop at CVPR 2022. In this paper, we propose the first multi-bracket HDR pipeline combining a standard camera with an event camera. For more details, check out the paper and video.

March 31, 2022

Meet us at Swiss Drone Days 2022


We are excited to announce that the 2022 edition of the Swiss Drone Days will take place on 11-12 June in Dübendorf. The event will feature live demos, including autonomous drone racing, inspection, and delivery drones, in one of the largest drone flying arenas in the world; spectacular drone races by the Swiss Drone League; presentations by distinguished speakers; and an exhibition and trade fair. For more information, please visit www.swissdronedays.com

March 29, 2022

"AEGNN: Asynchronous Event-based Graph Neural Networks" Code Release

We are excited to announce that our paper on Asynchronous Event-based Graph Neural Networks was accepted at CVPR 2022. Bring back the sparsity in event-based deep learning by adopting AEGNNs, which reduce the computational complexity by up to 200 times. For code, video and paper, visit our project page.

March 29, 2022

Are High-Resolution Cameras Really Needed?

In our newest paper, we shed light on this question and find that, across a wide range of tasks, it has a non-trivial answer. For video and paper, please visit our project page.

March 17, 2022

ICRA 2022 DodgeDrone Challenge

General-purpose autonomy requires robots to interact with a constantly dynamic and uncertain world. We are excited to announce the ICRA2022 DodgeDrone Challenge to push the limits of aerial navigation in dynamic environments. All we need is you! We provide an easy-to-use API and a Reinforcement Learning framework! Submit your work and take part in the challenge! The winner will get a keynote invitation to the ICRA workshop on aerial robotics and a cash prize. Find out how to participate on our Website. The code is on GitHub.

March 14, 2022

From our lab to Skydio


Today, Skydio announces that it will be hiring some of our former PhD students. RPG is very proud of them! Link

March 10, 2022

Davide Scaramuzza interviewed by Robohub

In this interview for Robohub, Davide Scaramuzza talks about event cameras and their application to robotics, automotive, defense, safety and security, computer vision, and videography: Video and Article

March 1, 2022

New PLOS ONE Paper: Visual Attention Prediction Improves Performance of Autonomous Drone Racing Agents


We propose a novel method to improve performance in vision-based autonomous drone racing. By combining human eye-gaze-based attention prediction and imitation learning, we enable a quadrotor to complete a challenging race track in a drone racing simulator. Our method outperforms state-of-the-art methods using raw images and image-based abstractions (i.e., feature tracks). For more details, check out the paper and dataset.

February 28, 2022

New RAL Paper: Minimum-Time Quadrotor Waypoint Flight in Cluttered Environments


Planning minimum-time trajectories for quadrotors in the presence of obstacles had so far been unaddressed by the robotics community. We propose a novel method to plan such trajectories in cluttered environments using a hierarchical, sampling-based method with an incrementally more complex quadrotor model. The proposed method is shown to outperform all related baselines in cluttered environments and is further validated in real-world flights at over 60 km/h. Check out our paper, video and code.


February 17, 2022

New RAL Paper: Continuous-Time vs. Discrete-Time Vision-based SLAM: A Comparative Study


In this work, we systematically compare the advantages and limitations of the discrete and continuous vision-based SLAM formulations. We perform an extensive experimental analysis, varying robot type, speed of motion, and sensor modalities. Our experimental analysis suggests that, independently of the trajectory type, continuous-time SLAM is superior to its discrete counterpart whenever the sensors are not time-synchronized. For more details, check out paper and code.

February 15, 2022

Perception-Aware Perching on Powerlines with Multirotors


Multirotor aerial robots are becoming widely used for the inspection of powerlines. To enable continuous, robust inspection without human intervention, the robots must be able to perch on the powerlines to recharge their batteries. This paper presents a novel perching trajectory generation framework that computes perception-aware, collision-free, and dynamically feasible maneuvers to guide the robot to the desired final state. For more details, check out the paper and video. The developed code is available online: code.

February 9, 2022

New RAL Paper: Nonlinear MPC for Quadrotor Fault-Tolerant Control


The mechanical simplicity, hover capabilities, and high agility of quadrotors have led to fast adoption in industry for inspection, exploration, and urban aerial mobility. On the other hand, the unstable and underactuated dynamics of quadrotors render them highly susceptible to system faults, especially rotor failures. In this work, we propose a fault-tolerant controller using nonlinear model predictive control (NMPC) to stabilize and control a quadrotor subjected to the complete failure of a single rotor. Check out our paper and video.

February 4, 2022

UZH-FPV Drone Racing Dataset Standing Leader Board


We are delighted to announce the standing leader board of the UZH-FPV drone racing dataset. Participants submit the results of their VIO algorithms and receive the evaluation in a few minutes thanks to our automatic code evaluation. For more details, check out the website! We look forward to receiving your submissions to advance the state of the art of VIO in high-speed state estimation.

February 2, 2022

New RAL Paper: Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation


To overcome the shortage of event-based datasets, we propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data. Our method transfers from single images to events and does not rely on paired sensor data. Thus, our approach unlocks the vast number of existing image datasets for the training of event-based neural networks. For more details, check out the paper, video, and code.

January 31, 2022

New RAL Paper: AutoTune: Controller Tuning for High-speed Flight


Tired of tuning your controllers by hand? Check out our RAL22 paper "AutoTune: Controller Tuning for High Speed Flight". We propose a gradient-free method based on Metropolis-Hastings sampling to automatically find parameters that maximize the performance of a controller during high-speed flight. We outperform both existing methods and human experts! Check out the paper, video, and code.
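The gradient-free Metropolis-Hastings idea behind this kind of tuning can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the quadratic stand-in cost, the step size, and the temperature are all made-up values.

```python
import numpy as np

def autotune_mh(cost, theta0, n_iters=500, step=0.1, temperature=0.5, seed=0):
    """Gradient-free parameter search via Metropolis-Hastings sampling:
    propose a random perturbation of the parameters, always accept
    improvements, and accept worse parameters with a probability that
    decays with the cost increase (so the search can escape local minima)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    c = cost(theta)
    best_theta, best_c = theta.copy(), c
    for _ in range(n_iters):
        proposal = theta + rng.normal(scale=step, size=theta.shape)
        c_new = cost(proposal)
        if c_new < c or rng.random() < np.exp((c - c_new) / temperature):
            theta, c = proposal, c_new
            if c < best_c:
                best_theta, best_c = theta.copy(), c
    return best_theta, best_c

# Toy objective standing in for a closed-loop tracking cost.
quadratic = lambda th: float(np.sum((th - 3.0) ** 2))
theta, c = autotune_mh(quadratic, np.zeros(2))
```

In a real controller-tuning setting, evaluating `cost` would mean flying (or simulating) a trajectory with the candidate gains and measuring the tracking error.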

January 28, 2022

RPG research on event cameras featured in The Economist!


Excited to see our research on event cameras featured in The Economist! Check it out!

January 10, 2022

RPG research makes it to the top 10 UZH news of 2021!


Our press release on time-optimal trajectory planning from July 2021 made it to the top 10 most successful media releases of UZH in 2021, second only to the media release on the FDA-approved Alzheimer's drug! Check it out!

January 10, 2022

3DV Oral Paper: Dense Optical Flow from Event Cameras


We propose E-RAFT, a novel method to estimate dense optical flow from events only, alongside DSEC-Flow, an extension of DSEC for optical flow estimation. Download the datasets and submit to the DSEC-Flow benchmark that automatically evaluates your submission. For more details, check out the paper, video, and project webpage. Our code is available on GitHub.

December 20, 2021

Philipp Foehn successfully passed his PhD defense


Congratulations to Philipp Foehn, who has successfully defended his PhD dissertation titled "Agile Aerial Autonomy: Planning and control", on December 14, 2021. We thank the reviewers: Prof. Moritz Diehl, Prof. Luca Carlone, and Prof. Roland Siegwart!

The full video of the PhD defense presentation is on YouTube.

December 15, 2021

Policy Search for Model Predictive Control

We propose a novel method to merge reinforcement learning and model predictive control. Our approach enables a quadrotor to fly through dynamic gates. The paper has been accepted for publication in the IEEE Transactions on Robotics (T-RO), 2022. Check out our paper and the code.

December 9, 2021

Code Release: Event-based, Direct Camera Tracking

We release the code of our ICRA 2019 paper Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization. The code is implemented in C++ and runs in real-time on a laptop. Try it out for yourself on GitHub!

December 8, 2021

3DV Paper: Event-based Structured Light

We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing. Our method is robust to event jitter and therefore performs better at higher scanning speeds. Experiments demonstrate that our method can deal with high-speed motion and outperform state-of-the-art 3D reconstruction methods based on event cameras, reducing the RMSE by 83% on average, for the same acquisition time. For more details, check out the project page, paper, code, and video.

November 1, 2021

Davide Scaramuzza invited speaker at Tartan SLAM Series

The goal of the Tartan SLAM Series is to expand the understanding of those both new and experienced with SLAM. Sessions include research talks, as well as introductions to various themes of SLAM and thought-provoking open-ended discussions. The lineup of events aims to foster fun, provocative discussions on robotics. In his talk, Davide Scaramuzza speaks about our lab's main advances in SLAM over the past years. He also introduces event cameras and speaks about their potential applications in visual SLAM. Check out the slides and the video on YouTube!

October 21, 2021

Code Release: SVO Pro


We are excited to release the fully open-source SVO Pro! SVO Pro is the latest version of SVO, developed over the past few years in our lab. SVO Pro features support for different camera models, active exposure control, a sliding-window-based backend, and global bundle adjustment with loop closure. Check out the project page and the code on GitHub!

October 20, 2021

New 3DV paper: Event Guided Depth Sensing


We present an efficient bio-inspired event-camera-driven depth sensing algorithm. Instead of uniformly sensing the depth of the scene, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and sparsely illuminate areas in the field of view with no motion. We show that, in natural scenes like autonomous driving and indoor environments, moving edges correspond to less than 10% of the scene on average. Thus, our setup requires the sensor to scan only 10% of the scene, which could lead to almost 90% less power consumption by the illumination source. For more details, check out the paper and video.
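The "scan only where there is motion" idea can be sketched with a simple per-pixel event counter. This is an illustrative toy, not the paper's pipeline: the grid size, the activity threshold, and the hand-written event list are all assumptions for demonstration.

```python
import numpy as np

def activity_mask(event_coords, shape, threshold=2):
    """Count events per pixel and mark pixels with enough recent activity;
    only these pixels would be illuminated and scanned densely."""
    counts = np.zeros(shape, dtype=int)
    for x, y in event_coords:
        counts[y, x] += 1
    return counts >= threshold

# Toy scene: a small moving edge fires repeatedly at two pixels,
# plus one isolated noise event that falls below the threshold.
events = [(5, 10)] * 3 + [(6, 10)] * 3 + [(20, 20)]
mask = activity_mask(events, shape=(32, 32))
scan_fraction = mask.mean()  # fraction of the field of view scanned densely
```

In this toy example only 2 of 1024 pixels are marked active, mirroring the observation that moving edges typically cover a small fraction of the scene.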

October 20, 2021

We are hiring!
Come build the future of robotics with us!


We have three fully-funded openings for PhD students and Postdocs in computer vision and machine learning to contribute to the areas of:

  • Vision-based agile flight,
  • Autonomous inspection of power lines,
  • SLAM, Scene Understanding, and Computational Photography with Event Cameras.
Job descriptions and how to apply.

October 10, 2021

Drone Documentary from the Swiss Italian TV (LA1)

Check out the interview from the Swiss Italian TV LA1 on our research on drone racing and high-speed navigation. We explain why high-speed drones could make a difference in the future of search and rescue operations. In Italian with English subtitles!

October 6, 2021

Article Published in Science Robotics!


We are excited to share our latest Science Robotics paper, done in collaboration with Intel! An end-to-end policy trained in simulation flies vision-based drones in the wild at up to 40 kph! In contrast to classic methods, our approach uses a CNN to directly map images to collision-free trajectories. This approach radically reduces latency and sensitivity to sensor noise, enabling high-speed flight. The end-to-end policy has taken our drones on many adventures in Switzerland!
Check out the video on YouTube! We also release the code and datasets on GitHub!

October 1, 2021

Code Release: Time-Optimal Quadrotor Planning

We are excited to release the code accompanying our latest Science Robotics paper on time-optimal quadrotor trajectories! This provides an example implementation of our novel progress-based formulation to generate time-optimal trajectories through multiple waypoints while exploiting, but not violating, the quadrotor's actuation constraints.
Check out our real-world agile flight footage with explanations, find the details in the paper on Science Robotics, and find the code on GitHub.

October 1, 2021

IROS2021 Workshop: Integrated Perception, Learning, and Control for Agile Super Vehicles


Do not miss our IROS2021 Workshop: Integrated Perception, Learning, and Control for Agile Super Vehicles! Check out the agenda and join the presentations at our workshop website. Organized by Giuseppe Loianno, Davide Scaramuzza, Sertac Karaman.

The workshop is today, October the 1st, and starts at 3pm Zurich time (GMT+2).

October 1, 2021

New Arxiv Preprint: The Hilti SLAM Challenge Dataset


We release the Hilti SLAM Challenge Dataset! The sensor platform used to collect this dataset contains a number of visual, lidar and inertial sensors which have all been rigorously calibrated. All data is temporally aligned to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor calibration data from twelve datasets in various environments is provided. Each environment represents common scenarios found in building construction sites in various stages of completion. For more details, check out the paper and video.

September 26, 2021

RPG wins the Tech Briefs "Create the Future" contest for the category Aerospace and Defense


Our work on controlling a quadrotor after motor failure with only onboard vision sensors, paper, is the winner of the Aerospace and Defense category in the 2021 Tech Briefs "Create the Future" contest out of over 700 participants worldwide! Watch the announcement of all the winners and finalists here.

September 15, 2021

New Arxiv Preprint: Expertise Affects Drone Racing Performance


We present an analysis of the drone racing performance of professional and beginner pilots. Our results show that professional pilots consistently outperform beginner pilots and choose more optimal racing lines. Our results provide strong evidence for a contribution of expertise to performance in real-world human-piloted drone racing. We discuss the implications of these results for future work on autonomous fast and agile flight. For more details, check out the paper.

September 13, 2021

Our work was selected as IEEE Transactions on Robotics 2020 Best Paper Award finalist


Honored that our IEEE Transactions on Robotics 2020 paper "Deep Drone Racing: From Simulation to Reality with Domain Randomization" was selected Best Paper Award finalist! Congratulations to all collaborators for this great achievement! PDF YouTube 1 YouTube 2 Code

September 13, 2021

Range, Endurance, and Optimal Speed Estimates for Multicopters (Accepted at RAL)


We present an approach to accurately estimate the range, endurance, and optimal flight speed for general multicopters. This is made possible by combining a state-of-the-art first-principles aerodynamic multicopter model with an electric-motor model and a precise gray-box battery model. Additionally, we present an accurate pen-and-paper algorithm, derived from the complex model, to estimate the range, endurance, and optimal speed of multicopters. For more details, check out the paper.
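To give a flavor of what such a pen-and-paper estimate looks like, here is a deliberately over-simplified hover-endurance calculation. This is our own toy illustration, not the paper's model, which additionally accounts for aerodynamics, motor losses, and nonlinear battery behavior; the efficiency figure used below is an assumed round number.

```python
# Toy hover-endurance estimate: battery energy divided by hover power.
# NOT the paper's model -- a simplified illustration only.

def hover_endurance_min(mass_kg, capacity_mah, voltage_v, efficiency_g_per_w):
    """Estimate hover time in minutes.

    efficiency_g_per_w: propulsion efficiency in grams of thrust per watt,
    a figure of merit commonly quoted for multicopter power systems.
    """
    hover_power_w = (mass_kg * 1000.0) / efficiency_g_per_w  # watts to hold weight
    energy_wh = capacity_mah / 1000.0 * voltage_v            # usable battery energy
    return 60.0 * energy_wh / hover_power_w

# Example: 0.8 kg quadrotor, 1500 mAh 4S (14.8 V) pack, assumed 4 g/W efficiency
print(round(hover_endurance_min(0.8, 1500, 14.8, 4.0), 1))  # -> 6.7
```

Even this crude estimate shows why optimal-speed analysis matters: forward flight changes both the power draw and the usable battery energy, which is exactly what the paper's full model captures.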

September 10, 2021

New Arxiv Preprint: Performance, Precision, and Payloads: Adaptive Nonlinear MPC for Quadrotors

We propose L1-NMPC, a novel hybrid adaptive NMPC that learns model uncertainties online and immediately compensates for them, drastically improving performance over non-adaptive baselines with minimal computational overhead. Our proposed architecture generalizes to many different environments; we evaluate it under wind, unknown payloads, and highly agile flight conditions. For more details, check out the paper and video.

September 9, 2021

New Arxiv Preprint: A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight

We perform a comparative study of two state-of-the-art control methods for agile quadrotor flight in terms of trajectory tracking accuracy, robustness, and computational efficiency. A wide variety of agile trajectories are tracked in this research at speeds up to 72 km/h. We show the superiority of NMPC in tracking dynamically infeasible trajectories, at the cost of higher computation time and risk of numerical convergence issues. An inner-loop controller using incremental nonlinear dynamic inversion (INDI) is proposed to hybridize with both methods, demonstrating more than 78% tracking error reduction. Non-expert readers can regard this work as a tutorial on agile quadrotor flight. For more details, check out the paper and video.

September 8, 2021

New Arxiv Preprint: Model Predictive Contouring Control for Time-Optimal Quadrotor Flight

We propose a Model Predictive Contouring Control (MPCC) method to fly time-optimal trajectories through multiple waypoints with quadrotors. Our MPCC optimally selects the future states of the platform at runtime, maximizing the progress along the reference path while minimizing the distance to it. We show that, even when tracking simplified trajectories, the proposed MPCC results in a path that approaches the true time-optimal one, and which can be generated in real time. We validate our approach in the real world, where we show that our method outperforms both the current state of the art and a world-class human pilot in terms of lap time, achieving speeds of up to 60 km/h. For more details, check out the paper and video.
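For readers unfamiliar with contouring control, the error decomposition it relies on splits the position error relative to the reference path into a lag component along the path tangent and a contour component orthogonal to it. A minimal sketch of that split, using an assumed straight-line reference (our own illustration, not the paper's full receding-horizon formulation):

```python
import numpy as np

# Minimal sketch of the lag/contour error split used in contouring control.
# The reference here is a straight line through p_ref with unit tangent t_hat;
# the full MPCC instead samples a 3D path and optimizes progress at runtime.

def lag_contour_error(pos, p_ref, t_hat):
    """Split the position error into lag (along tangent) and contour (orthogonal)."""
    err = pos - p_ref
    lag = np.dot(err, t_hat)        # signed error along the path direction
    contour = err - lag * t_hat     # error component orthogonal to the path
    return lag, np.linalg.norm(contour)

pos = np.array([1.0, 0.5, 0.0])
p_ref = np.array([0.0, 0.0, 0.0])
t_hat = np.array([1.0, 0.0, 0.0])
print(lag_contour_error(pos, p_ref, t_hat))  # lag = 1.0, contour = 0.5
```

Penalizing the contour error while rewarding progress (large lag growth per step) is what lets the controller trade path deviation for speed.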

September 2, 2021

HILTI-SLAM Challenge: win up to $10,000 prize money and keynote invitation


RPG and HILTI are organizing the IROS2021 HILTI SLAM Challenge! Participants can win up to $10,000 prize money and a keynote IROS workshop invitation! Instructions here. The HILTI SLAM Challenge dataset is a real-life, multi-sensor dataset with accurate ground truth to advance the state of the art in highly accurate state estimation in challenging environments. Participants will be ranked by the completeness of their trajectories and by the achieved accuracy. HILTI is a multinational company that offers premium products and services for professionals on construction sites around the globe. Behind this vast catalog is a global team comprising 30,000 team members of 133 different nationalities located in more than 120 countries.

August 29, 2021

New Arxiv Preprint: Dense Optical Flow from Event Cameras


We propose a novel method to estimate dense optical flow from events only, alongside an extension of DSEC for optical flow estimation. Our approach takes inspiration from frame-based methods and outperforms previous event-based approaches with up to 66% EPE reduction. For more details, check out the paper and video.

August 20, 2021

New IROS Paper & Code Release: Powerline Tracking with Event Cameras


We propose a method that uses event cameras to robustly track lines and show an application for powerline tracking. Our method identifies lines in the stream of events by detecting planes in the spatio-temporal signal, and tracks them through time. For more details, check out the paper and video. We release the code fully open source.

August 17, 2021

Davide Scaramuzza invited speaker at Real Roboticist


The series Real Roboticist, produced by the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), shows the people at the forefront of robotics research from a more personal perspective. In his talk, Davide Scaramuzza explains his journey from electronics engineering to leading a top robotics vision research group developing a promising technology: event cameras. He also speaks about the challenges he faced along the way, and even how he combines robotics research with another of his passions: magic. Read the article and watch the talk. Enjoy!

August 6, 2021

RPG Contributes to CARLA Optical Flow Camera


CARLA is the world-leading simulator for autonomous driving, developed by Intel. Our lab contributed to the implementation of the optical flow camera, requested by the community since the inception of the simulator.
Check out the release video for a short teaser and the documentation for more information on how to use it.

July 21, 2021

Time-Optimal Quadrotor Planning faster than Humans

We are excited to announce our latest work on agile flight, allowing us to generate time-optimal quadrotor trajectories that are faster than human drone racing pilots! Our novel algorithm, published in Science Robotics, uses a progress-based formulation to generate time-optimal trajectories through multiple waypoints while exploiting, but not violating, the quadrotor's actuator constraints.
Check out our real-world agile flight footage with explanations and find the details in the paper in Science Robotics.

June 30, 2021

The World's Largest Indoor Drone-Testing Arena

We are excited to announce our new, indoor, drone-testing arena! Equipped with a real-time motion-capture system consisting of 36 Vicon cameras, and with a flight space of over 30x30x8 meters (7,000 cubic meters), this large research infrastructure allows us to deploy our most advanced perception, learning, planning, and control algorithms to push vision-based agile drones to speeds over 60 km/h and accelerations over 5g. It also allows us to fly in an unlimited number of virtual environments using hardware-in-the-loop simulation. Among the many projects we are currently working on, we aim to beat the best professional human pilot in a drone race. Turn up the volume and enjoy the video! And stay tuned... the best is yet to come... very soon!

June 30, 2021

Code Release: EVO: Event-based, 6-DOF Parallel Tracking and Mapping in Real-Time

We release EVO, an Event-based Visual Odometry algorithm from our RA-L paper EVO: Event-based, 6-DOF Parallel Tracking and Mapping in Real-Time. The code is implemented in C++ and runs in real-time on a laptop. Try it out for yourself on GitHub!

June 25, 2021

New RSS Paper & Dataset Release: NeuroBEM


We are happy to announce the release of the full dataset associated with our upcoming RSS paper NeuroBEM: Hybrid Aerodynamic Quadrotor Model. The dataset features over 1 h 15 min of highly aggressive maneuvers recorded at high accuracy in one of the world's largest optical tracking volumes. We provide time-aligned quadrotor states and motor commands recorded at 400 Hz in a curated dataset. For more details, check out our paper, dataset, and video.

June 25, 2021

Fast Feature Tracking with ROS


Our work on GPU-optimized feature detection and tracking is now available as a simple ROS node. It implements GPU-optimized FAST, Harris, and Shi-Tomasi detectors and KLT tracking, running at hundreds of FPS on a Jetson TX2. For more details, check out our paper Faster than FAST and code.

June 11, 2021

TimeLens: Event-based Video Frame Interpolation

TimeLens is a new event-based video frame interpolation method that generates high speed video from low framerate RGB frames and asynchronous events. Learn more about TimeLens over at our project page where you can find code, datasets and more! We also release a High-Speed Event and RGB dataset which features complex scenarios like bursting balloons and spinning objects!

June 10, 2021

Video recordings of the ICRA 2021 Workshop on Perception and Action in Dynamic Environments are now available!

On June 4, 2021, Antonio Loquercio (RPG), Davide Scaramuzza (RPG), Luca Carlone (MIT), and Markus Ryll (TUM) organized the 1st International Workshop on Perception and Action in Dynamic Environments at ICRA.


May 18, 2021

Workshop on Perception and Action in Dynamic Environments


Do not miss our #ICRA2021 workshop on Perception and Action in Dynamic Environments! Check out the agenda and join the presentations at our workshop website. Organized by Antonio Loquercio, Davide Scaramuzza, Markus Ryll, Luca Carlone.

The workshop is on June the 4th and starts at 4pm Zurich time (GMT+2).

May 18, 2021

CVPR competition on stereo matching


We are delighted to announce our CVPR event-based vision workshop competition on disparity/depth prediction on the new DSEC dataset. Visit our website for more details about the competition. Submission deadline is the 11th of June.

May 18, 2021

Davide Scaramuzza listed among the most influential scholars in robotics


Congratulations to our lab director, Davide Scaramuzza, for being listed among the 100 most influential robotics scholars by AMiner [ Link ].


May 11, 2021

Antonio Loquercio successfully passed his PhD defense


Congratulations to Antonio Loquercio, who successfully defended his PhD dissertation, titled "Agile Autonomy: Learning Tightly-Coupled Perception-Action for High-Speed Quadrotor Flight in the Wild", on May 10, 2021. We thank the reviewers: Prof. Pieter Abbeel, Prof. Angela Schoellig, and Prof. Roland Siegwart!

The full video of the PhD defense presentation is on YouTube.

May 10, 2021

IEEE Transactions on Robotics Best Paper Award Honorable Mention


Our paper Deep Drone Racing: from Simulation to Reality with Domain Randomization wins the prestigious IEEE Transactions on Robotics Best Paper Award Honorable Mention: PDF YouTube 1 YouTube 2 Code

May 7, 2021

How to Calibrate Your Event Camera


We propose a generic event camera calibration framework using image reconstruction. Check out our Code and PDF.

April 30, 2021

DodgeDrone Challenge


We have organized a challenge to push the current state of the art in agile navigation in dynamic environments. In this challenge, drones will have to avoid moving boulders while flying in a forest! The deadline for submission is June the 1st! The winner will be awarded a Skydio 2! Participate now at https://uzh-rpg.github.io/PADE-ICRA2021/ddc/!

April 26, 2021

Read how our research inspired Ingenuity's flight on Mars


Our research inspired the design of the vision-based navigation technology behind the Ingenuity helicopter that flew on Mars. Read the full article on SwissInfo [ English ], [ Italian ].

April 23, 2021

NASA collaborates with RPG


Our lab is collaborating with NASA/JPL to investigate event cameras for the next Mars helicopter missions! Read full interview on SwissInfo with Davide Scaramuzza [ Link ].

April 23, 2021

Davide Scaramuzza invited speaker at GRASP on Robotics


Davide Scaramuzza talks about "Autonomous, Agile Micro Drones: Perception, Learning, and Control" at GRASP on Robotics seminar series organized by the GRASP laboratory at University of Pennsylvania. In this talk, he shows how the combination of both model-based and machine learning methods united with the power of new, low-latency sensors, such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing. Watch the presentation! Enjoy!

April 19, 2021

Autonomous Racing and Overtaking in GTS using Reinforcement Learning


We present Super-Human Performance in GTS Using Deep RL and Autonomous Overtaking in GTS Using Curriculum RL. Check out the Website.

April 14, 2021

DSEC: Event Camera Dataset is Out!


DSEC is a new driving dataset with stereo VGA event cameras, RGB global-shutter cameras, and disparity ground truth from lidar. Download DSEC now to reap the benefits of this multi-modal dataset with high-quality calibration.
We also accompany the dataset with code and documentation. Check out our video, and paper too! Stay tuned for more!

March 18, 2021

Autonomous Drone Racing with Deep Reinforcement Learning


We present Autonomous Drone Racing with Deep RL, the first learning-based method that can achieve near-time-optimal performance in drone racing. Check out the Preprint and the Video.

March 15, 2021

1st Workshop on Perception and Action in Dynamic Environments at ICRA 2021


We organized an #ICRA2021 workshop on perception and action in dynamic environments! We brought together amazing keynote speakers and also organized a competition on drone navigation in a forest (the prize is a Skydio 2)! All we need is you! Check out our website here for more info and the current list of invited speakers.

March 8, 2021

Check out our work on Visual Processing and Control in Human Drone Pilots!


Our work on Visual Processing and Control in Human Drone Pilots has been accepted in the IEEE Robotics and Automation Letters. Check out our Video, the Paper, and Open-Source Dataset too!

February 19, 2021

Check out our Event Camera Simulator, ESIM, now with python bindings and GPU support!


Our event camera simulator ESIM now features python bindings and GPU support for fully parallel event generation! Check out our project page, code and paper.

February 12, 2021

Check out our work on Combining Events and Frames using Recurrent Asynchronous Multimodal Networks!


Our work on combining events and frames using recurrent asynchronous multimodal networks has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the project page, and the source code.

February 12, 2021

Check out our work on data-driven MPC for quadrotors!


Our work on data-driven MPC for quadrotors has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the video, and the source code.

February 09, 2021

Our work on autonomous flight despite motor failure is featured on IEEE Spectrum


Our latest work on autonomous quadrotor flight despite rotor failure with onboard vision sensors (frames or event cameras) was featured on IEEE Spectrum. For more details, read the paper here and watch the video here. Source code here.

January 25, 2021

3rd Workshop on Event-based Vision at CVPR 2021


We are organizing the "3rd Workshop on Event-based Vision", which will take place in June at CVPR2021. The paper submission deadline is March 27. Check out our website here for more info and the current list of invited speakers.

January 14, 2021

Check out our work in the new Flying Arena!


Davide Scaramuzza and some of the lab's members talk about our work on drone racing in the new Flying Arena. Watch Davide Scaramuzza interview here. Watch Elia Kaufmann interview here. Watch Christian Pfeiffer interview here.

January 13, 2021

Check out our work on how to keep drones flying when a motor fails!


Our work on controlling a quadrotor after motor failure with only onboard vision sensors has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the video, and the source code.

January 12, 2021

Paper accepted in IJCV!


Our work on generating accurate reference poses for visual localization datasets has been accepted in the International Journal of Computer Vision. Check out the paper here, and the Aachen Day-Night v1.1 dataset in the paper can be accessed via the online visual localization benchmark service.

January 11, 2021

Check our new startup SUIND!


We are super excited to announce SUIND, our latest spin-off! Leveraging years of research in our lab, SUIND is building a groundbreaking safety suite for drones. Proud to see our former members Kunal Shrivastava and Kevin Kleber making a true impact in the industry! Read more here.

December 4, 2020

Titus Cieslewski successfully passed his Ph.D. defense


Congratulations to Titus Cieslewski, who has successfully defended his PhD dissertation titled "Decentralized Multi-Agent Visual SLAM", on Nov. 30, 2020. We thank the reviewers: Prof. Marc Pollefeys and Prof. Torsten Sattler!

Titus' major contributions have been:

  1. The first O(n²) → O(n) decentralized place recognition algorithm for multi-agent SLAM.
  2. A "succinct" feature detector, named SIPs, that extracts a minimal set of feature points to enable accurate camera localization.
  3. A completely new approach to feature detection and matching, named IMPs, where features that are implicitly matched between images are detected, thus rendering feature descriptors obsolete in the considered application case.
  4. A data representation for exploration which enables exploration using a globally inconsistent state estimate, thus showing that optimization is not needed for exploration.
The full video of the PhD defense presentation is on YouTube.

November 30, 2020

Event Cameras meet the CARLA simulator


Autonomous cars equipped with event cameras are now possible with our new CARLA simulator plugin! Based on ESIM and available since CARLA 0.9.10, the sensor generates synthetic events in photorealistic self-driving scenarios at any temporal resolution. You can use our script here to generate reliable data in CARLA and create your own dataset. A step forward in bringing event cameras into autonomous driving research!
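The event-generation principle shared by ESIM-style simulators is simple: a pixel emits an event each time its log intensity changes by more than a contrast threshold. A much-simplified two-frame sketch of that principle (our own illustration, not the plugin's actual implementation, which interpolates brightness at high temporal resolution and timestamps each event individually):

```python
import numpy as np

# Simplified event generation between two frames: each pixel fires one event
# per contrast-threshold crossing of its log intensity. Real simulators
# interpolate between frames and emit timestamped events; here we only
# count signed crossings per pixel.

def events_between_frames(frame0, frame1, C=0.2, eps=1e-3):
    log_i0 = np.log(frame0.astype(np.float64) + eps)
    log_i1 = np.log(frame1.astype(np.float64) + eps)
    diff = log_i1 - log_i0
    n_crossings = np.floor(np.abs(diff) / C).astype(int)  # events per pixel
    polarity = np.sign(diff).astype(int)                  # +1 ON, -1 OFF
    return n_crossings * polarity

f0 = np.array([[100.0, 100.0]])
f1 = np.array([[150.0, 60.0]])  # brighter -> ON events, darker -> OFF events
print(events_between_frames(f0, f1))
```

The log-intensity formulation is what gives event cameras (and their simulated counterparts) high dynamic range: the threshold is relative, not absolute.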

November 28, 2020

Code Release of E2DEPTH


We have released the code of our 3DV 2020 paper Learning Monocular Dense Depth from Events, where we propose a supervised learning approach using a recurrent network to leverage the temporal consistency of the events and estimate single-camera dense depth! The code is available on GitHub!

November 18, 2020

Davide Scaramuzza invited speaker at Robotics Today


Davide Scaramuzza talks about Learning to Fly at the Robotics Today's seminar series organized by MIT and Stanford. He covers topics ranging from perception to planning and control, from model-based to model-free autonomy. He shows how to learn sensorimotor policies end-to-end directly in simulation, which transfer to real drones without any fine-tuning, thanks to the use of appropriate sensory abstractions. He talks about the role of simulation. Finally, he shows the latest and greatest on event cameras to enable low-latency agile flight. Watch the presentation! Enjoy!

November 18, 2020

Robohub interviews Davide Scaramuzza


Are you curious about the people behind the robots? The 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) features a new Original Series called Real Roboticist hosted by Sabine Hauert, President of Robohub and faculty at University of Bristol, where she interviews different roboticists individually. Watch the one with Davide Scaramuzza!

November 2-3, 2020

Join our IROS2020 Workshop on Perception, Learning, and Control for Autonomous Agile Vehicles on Nov. 2-3, 2020!


Join our IROS 2020 workshop "Perception, Learning, and Control for Autonomous Agile Vehicles" on November 2 and 3 via Zoom, organized by Giuseppe Loianno, Davide Scaramuzza, and Sertac Karaman. It will cover both ground and flying super-agile vehicles. We have an incredible line of speakers from both academia and industry! We will award $500 USD to the best workshop paper. A Field Robotics special issue will be organized after the workshop (submissions open to everyone). WHEN: November 2nd and November 3rd, 2020, from 16:00 to 20:00 Zurich time (10am to 2pm New York time).
WHERE: Zoom.
Workshop webpage.

October 26, 2020

Code Release of PD-MeshNet


We have just released the code of our NeurIPS 2020 paper Primal-Dual Mesh Convolutional Neural Networks. Our network architecture obtains state-of-the-art results in the tasks of shape classification and segmentation! If you are interested in 3D data processing and geometric deep learning, you should try our code out! The code is available at this page!

October 05, 2020

Code Release for High-MPC


We release the open-source code for High-MPC: Learning High-level Policies for Model Predictive Control. Check out the video and paper for more details.

September 26, 2020

We are hiring!
Come build the future of robotics with us!


We have three fully-funded openings for PhD students and Postdocs in computer vision and machine learning to contribute to the areas of:

  • Autonomous drone racing,
  • Autonomous inspection of power lines,
  • SLAM, Scene Understanding, and Computational Photography with Event Cameras.
Job descriptions and how to apply.

September 21, 2020

New Website, Automatically Evaluated Benchmark with Leader Board for UZH FPV Dataset


The UZH FPV dataset has a new home at fpv.ifi.uzh.ch! This new website features an automatically evaluated benchmark. Submit your VIO output by September 27th to participate in the IROS 2020 FPV VIO competition!

September 08, 2020

Code Release for Fisher Information Field


We release the open-source code for Fisher Information Field, an efficient and differentiable map for perception-aware planning. Check out the video and paper for more details.

September 03, 2020

Code Release for Flightmare: A Flexible Quadrotor Simulator


We release our open-source quadrotor simulator Flightmare: A Flexible Quadrotor Simulator. Check out the Website for more details.

July 27, 2020

Vision Systems article about our dynamic obstacle avoidance drone


Event cameras enable fast image processing and nimble movement. Here is the Link to the article.

July 26, 2020

Blog Post About Deep Drone Acrobatics


Do you want to know how to make autonomous drones fly acrobatic maneuvers? Check out this blog post! Teaser: All you need is a drone simulator!

July 23, 2020

Code Release for VIMO: Simultaneous Visual Inertial Model-based Odometry and Force Estimation


We released our open-source example implementation of VIMO, which extends VINS-Mono to estimate external forces along with the robot state. Instructions to run the code are in the GitHub repository.

July 16, 2020

Best Systems Paper Award at RSS and Best Paper Award Finalist!


Our paper AlphaPilot: Autonomous Drone Racing (Paper, Presentation) won the Best Systems Paper Award at RSS! Additionally, our paper Deep Drone Acrobatics (Paper, Presentation) was a finalist for the Best Paper Award.

June 24, 2020

Davide Scaramuzza participates in the ICRA20 Debate on the Future of Robotics Research!


Is "robotics research over-reliant on benchmark datasets and simulation"? Check out here what RPG director Davide Scaramuzza has to say: Video. The debate had over 1100 online viewers!

June 11, 2020

Deep Drone Acrobatics paper at RSS 2020!


Drones with on-board sensing and computation can now fly agile acrobatic maneuvers! Check out our RSS 2020 paper Deep Drone Acrobatics to understand what made this possible! A video of the experiments is available at this link. If you also want to do acrobatics with drones, please check out the project's code!

June 2, 2020

We are hiring!
Come build the future of robotics with us!


We have several fully-funded openings for PhD students and Postdocs in control, path planning, aerodynamic modelling, numerical optimization, computer vision, and machine learning to contribute to the areas of:

  • Autonomous drone racing,
  • Autonomous inspection of power lines,
  • Computational photography.
Job descriptions and how to apply.

June 2, 2020

Results of ICRA 2020 UZH FPV Competition Available

The winner is OKVIS 2.0 by the Smart Robotics Lab, Imperial College London, closely followed by OpenVINS from the University of Delaware, with OSU-ETHZ, a joint team from the Ohio State University and ETH Zurich, in a close third place.
See detailed results on the dataset page.

May 28, 2020

RSS20 Workshop on Agile Super Vehicles, Call for Papers


Philipp Foehn and Davide Scaramuzza, from RPG, along with Varun Murali and Sertac Karaman, from MIT, organize the 2nd RSS Workshop on Perception and Control for Fast and Agile Super-Vehicles. We will have a great line of speakers from academia and industry, but we also accept manuscripts.

Abstract deadline: June 14.

Notification of acceptance: June 21.

Workshop: July 12, 2020, virtual event.

Workshop webpage:
https://mit-fast.github.io/WorkshopRSS20SuperVehicles/

Please email all submissions to
super-vehicles-rss20-submit@mit.edu
with RSS20 Super Vehicles in the subject line.

May 28, 2020

RPG organizes 3rd FPV Drone Racing VIO competition


The competition will be held jointly with the 6th edition of the IROS 2020 Workshop on "Perception, Learning, and Control for Autonomous Agile Vehicles".

The participants will run their VIO algorithms on datasets (including images, IMU measurements, and event data) recorded with an FPV drone racing quadrotor flown by an expert pilot at speeds up to and over 20 m/s. More information at https://fpv.ifi.uzh.ch.

May 27, 2020

Our Master student Tim Taubner wins ETH Medal for Best Master thesis!

Tim Taubner, who did his Master thesis Competitive Drone Racing via Pass-Block Games at both Stanford University and RPG, has received the ETH Medal 2020 and the Willi Studer Prize for the best student in the ETH Master's program in Robotics, Systems and Control in the period March 2019-2020.

May 25, 2020

AlphaPilot Paper at RSS 2020


Check out our performance at the 2019 AlphaPilot Challenge now in video. Additionally, we present our paper at RSS 2020, describing our approach combining learning and model-based techniques to rank second in the 2019 AlphaPilot Challenge.

May 24, 2020

Code Release


We released the code of our paper "Event-Based Angular Velocity Regression with Spiking Networks". The spiking neural network, implemented in PyTorch and CUDA, regresses the angular velocity of an event camera. Along with the code, there are instructions to download the datasets. Check out the github repository.

April 1, 2020

Code Release


We have just released the code of our paper "Video to Events: Recycling Video Datasets for Event Cameras". Use our code, implemented in C++ and Python, to generate artificial events from standard video. The code can be found here.

March 30, 2020

Code Release


Our new GPU-optimized FAST detector is available on GitHub. It implements a novel GPU-specific non-maximum suppression and an enhanced FAST detector, achieving over 1000 fps on a Jetson TX2. Check out our paper Faster than FAST.
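For readers new to the idea, non-maximum suppression keeps a corner only if its score is the unique maximum of its local neighborhood. A naive CPU sketch of the concept (our own illustration, not the GPU-specific formulation of the release):

```python
import numpy as np

# Plain non-maximum suppression on a corner-score grid: a pixel survives only
# if it is the strict maximum of its (2r+1)x(2r+1) neighborhood. Concept only;
# the released detector uses a GPU-specific scheme to reach >1000 fps.

def nms(scores, radius=1):
    h, w = scores.shape
    keep = np.zeros_like(scores, dtype=bool)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            patch = scores[y0:y1, x0:x1]
            # strict maximum: positive score, equal to the patch max, and unique
            keep[y, x] = (scores[y, x] > 0
                          and scores[y, x] == patch.max()
                          and (patch == patch.max()).sum() == 1)
    return keep

scores = np.array([[0, 1, 0],
                   [2, 9, 3],
                   [0, 4, 0]], dtype=float)
print(nms(scores).astype(int))  # only the center pixel (score 9) survives
```

The nested loops here are exactly what a GPU formulation avoids: each pixel's check is independent, so the suppression parallelizes naturally across threads.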

March 20, 2020

RPG research aired on SRF and 3Sat TV!

This documentary (German only) demonstrates our research on how autonomous drones can be used for search and rescue. (video starts at time 15:50)


March 18, 2020

On the coverpage of Science Robotics!

Our Science Robotics paper "Dynamic obstacle avoidance for quadrotors with event cameras" is on the cover of the March issue of Science Robotics! PDF, Video.


March 8, 2020

Henri Rebecq finalist in the Georges Giralt PhD Award!

Henri Rebecq is a finalist in the 2019 edition of the Georges Giralt European PhD Thesis Award, out of 60 applications.


February 18, 2020

Open positions in Vision-based Control for Agile Flight


We have openings at both PhD and Postdoctoral levels in Vision-based Control for Agile Flight, such as autonomous drone racing. More info here.


February 13, 2020

Code Release


We have just released the code of our RA-L and ICRA paper A General Framework for Uncertainty Estimation in Deep Learning. Our framework can compute uncertainties for every network architecture, does not require changes in the optimization process, and can be applied to already trained architectures. Our framework's code is available at this page!

February 7, 2020

Release of Driving Event Camera Datasets


We are excited to release many driving datasets recorded in the context of our T-PAMI paper High Speed and High Dynamic Range Video with an Event Camera. The datasets consist of a number of sequences that were recorded with a VGA (640x480) event camera (Samsung DVS Gen3) and a conventional RGB camera (Huawei P20 Pro) placed on the windshield of a car driving through Zurich. The driving datasets are available at this page.

December 10, 2019

RPG awarded 2 million Euros from the European Research Council!


The European Research Council awards Davide Scaramuzza a Consolidator Grant (2 million Euros) for a research project that will use event cameras to improve the performance of flying robots in rescue operations. Press release by UZH. Press release by the EU Commission. List of ERC grantees. Description of the ERC program.


December 6, 2019

RPG ranks 2nd at Autonomous Drone Racing World Championship!


The team composed of Dario Brescianini, Philipp Foehn, Elia Kaufmann, and Mathias Gehrig ranks 2nd in the AlphaPilot Autonomous Drone Racing World Championship! Congratulations! (UZH News, HeroX)


December 5, 2019

Davide Falanga successfully passed his PhD defense


Congratulations to Davide Falanga, who has successfully defended his PhD dissertation titled "Agile, Vision-Based Quadrotor Flight: from Active, Low-Latency Perception to Adaptive Morphology", on Dec. 2, 2019. We thank the reviewers: Prof. Nathan Michael, Prof. Sami Haddadin and Prof. Roland Siegwart!


December 1, 2019

SNSF Bridge and Venture Kick awarded


Kunal Shrivastava and Kevin Kleber receive 140k CHF from the SNSF BRIDGE and Venture Kick funds to translate their research into a product! Congratulations!


November 21, 2019

Henri Rebecq successfully passed his PhD defense


Congratulations to Henri Rebecq, who has successfully defended his PhD dissertation titled "Event Cameras, from SLAM to High Speed Video", on Nov. 18, 2019. We thank the reviewers: Prof. Andrew Davison, Prof. Tobi Delbruck and Prof. Bernt Schiele!


October 28, 2019

Davide Scaramuzza keynote speaker at IROS 2019


We are proud to announce that on November 5, 2019, Davide Scaramuzza will deliver a keynote talk at IROS 2019 in Macau.


October 28, 2019

EKLT open-sourced


EKLT, our event-based feature tracking method, is now available open source. By leveraging the complementarity of event cameras and standard cameras, EKLT achieves unprecedented tracking accuracy with high temporal resolution. https://github.com/uzh-rpg/rpg_eklt

October 15, 2019

IROS19 Workshop on Vision-Based Drones


Davide Scaramuzza and Giuseppe Loianno organize the 5th IROS workshop on vision-based drones! We have a great line of speakers from academia and industry! Workshop webpage.

October 12, 2019

Oculus Zurich expanding to 200 employees


Oculus Zurich (formerly Zurich Eye) is soon expanding to 200 employees (currently 80)! Very proud of you guys! Article in Handelszeitung.

September 4, 2019

RPG wins the NASA Tech Briefs "Create the Future" contest for the category Aerospace and Defense


Our foldable drone, the first quadrotor that can change its shape and size in flight, is the winner of the Aerospace and Defense category in the 2019 NASA Tech Briefs "Create the Future" contest. Check out the winners list.

September 4, 2019

Davide Scaramuzza elected IEEE Senior Member

We are proud to announce that Prof. Davide Scaramuzza has been elected to the grade of Senior Member of the IEEE. Congratulations!

September 2, 2019

Guillermo Gallego starts as Professor at TU Berlin!

We congratulate our former postdoc Guillermo Gallego, who today starts as Associate Professor at TU Berlin.

Guillermo worked on event-based algorithms. His major contributions are the release of the first event-camera dataset, which has become a standard tool in the computer vision community, and a method, called "focus maximization", which solves multiple computer vision and machine learning problems with event cameras. He was also the main author of the survey paper on event-based vision. Guillermo's Personal homepage.

August 26, 2019

From Zurich-Eye to Oculus Quest

Learn how Zurich-Eye, co-founded by former RPG members Christian Forster, Matia Pizzoli, and Manuel Werlberger, contributed to the newly announced Oculus Insight and Oculus Quest (link)!


August 20, 2019

Davide Scaramuzza keynote speaker at FSR 2019!

Davide Scaramuzza will deliver a keynote speech at the International Conference on Field and Service Robotics (FSR) in Tokyo. More information here.

August 20, 2019

Interest point work open-sourced

Our recent work on interest points, which focuses on minimal representations for relative pose estimation, is now available open source.

SIPs achieves high matching score at low point counts, but uses existing descriptors for matching:
https://github.com/uzh-rpg/sips2_open

IMIPs instead provides a set of points that implicitly match between views, without the need for descriptors:
https://github.com/uzh-rpg/imips_open

August 13, 2019

RPG featured on BBC News

Our foldable drone was featured in a documentary by BBC News Arabic. Check out the video here.

July 29, 2019

RPG organizes the first FPV Drone Racing VIO competition

The competition will be held jointly with the 5th edition of the IROS 2019 Workshop on "Challenges in Vision-based Drone Navigation", which will take place on November 8, 2019, in Macau.

The participants will run their VIO algorithms on datasets (including images, IMU measurements, and event data) recorded with an FPV racing quadrotor flown by an expert pilot at speeds of up to and over 20 m/s. More information here.

July 29, 2019

Survey paper on Rescue Robotics in JFR

The Swiss National Centre of Competence in Research (NCCR) Robotics published a joint paper on the current state and future outlook of rescue robotics in the Journal of Field Robotics. Paper: PDF.

July 29, 2019

RPG featured on NZZ

Our lab's work towards the AlphaPilot autonomous drone racing competition was featured in the Neue Zürcher Zeitung (NZZ). Check out the article here for more details (in German only).

July 9, 2019

Video recordings and Slides of the CVPR 2019 Workshop on Event-based Vision and Smart Cameras are now available!

On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) organized the 2nd International Workshop on Event-based Vision and Smart Cameras at CVPR, Long Beach.


July 8, 2019

Our PhD student Manasi Muglikar wins the ETH Robotics Summer School Challenge

Manasi Muglikar, PhD student in our lab, won the ETH Robotics Summer School Robot Competition. Congratulations!


July 8, 2019

Davide Scaramuzza, Stanford University, and Microsoft organize Drone Racing Competition at NeurIPS 2019!

Game of Drones is a NeurIPS 2019 competition with the goal of pushing the boundaries of building competitive autonomous systems through head-to-head drone races. Check out the official webpage for further details.

June 10, 2019

Join us on June 17th for the CVPR 2019 Workshop on Event-based Vision and Smart Cameras!

On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) are organizing the 2nd International Workshop on Event-based Vision and Smart Cameras at CVPR in Long Beach.


Check out the schedule, accepted papers and live demos for this full-day workshop. We will have top speakers from both academia and industry (Samsung, Intel, Prophesee, iniVation, Insightness, CelePixel).


June 10, 2019

RPG selected to participate in the AlphaPilot Autonomous Drone Racing Competition

We are proud to announce that our lab is one of the nine teams that were accepted into the 2019 AlphaPilot Innovation Challenge, where we will compete to design an AI framework capable of piloting racing drones through high-speed aerial courses without any GPS, data relay or human intervention. The competition has a $1 million cash prize, sponsored by Lockheed Martin. Check out the official press release for further information.


June 5, 2019

Davide Scaramuzza talks about drone racing on IEEE Spectrum

Check out Prof. Davide Scaramuzza's interview with IEEE Spectrum about the AlphaPilot competition, why drone racing matters for robotics research, and our recently released UZH FPV Drone Racing Dataset.


May 24, 2019

Best Paper Award for Zichao Zhang!

Zichao Zhang, PhD student in our lab, received the Best Paper Award at the ICRA 2019 Workshop on SLAM Benchmarking in Montreal on May 24 with his paper titled "Rethinking Trajectory Evaluation for SLAM: a Probabilistic, Continuous-Time Approach"! Congratulations!


May 22, 2019

UltimateSLAM receives the IEEE RAL'18 Best Paper Award Honourable Mention

Our paper UltimateSLAM received the IEEE Robotics and Automation Letters 2018 Best Paper Award Honourable Mention during the award session at the ICRA 2019 conference in Montreal. It ranked in the top 3 out of 520 papers published by RAL in 2018. Read the paper here and watch the video here for more details.


May 21, 2019

RPG releases the UZH-FPV Drone Racing Dataset

We are happy to announce the release of the UZH-FPV Drone Racing Dataset. It contains over 30 sequences of data from event cameras, standard cameras, and an IMU, with ground truth, recorded by an FPV drone flown at speeds over 20 m/s by professional drone pilots in real-world scenarios! Check out the official web page for more details.


May 13, 2019

RPG research on dynamic obstacle avoidance with event cameras featured on IEEE Spectrum

Our latest work on quadrotor flight with event cameras was featured on IEEE Spectrum. For more details, read the paper here and watch the video here.


May 13, 2019

PULP-DroNet: The First Fully Autonomous Nano-Size UAV

In collaboration with researchers at the Digital Circuits and Systems lab at ETH, we have devised a nano-drone (a few centimeters in diameter) that can navigate in indoor environments with only onboard sensing and computing. Video, Paper, Code.


May 10, 2019

New Postdoc and Drone Engineer

We welcome Dr. Dimche Kostadinov as new Postdoc and Thomas Längle as new Drone Engineer in our lab!


May 8, 2019

Our PhD student Daniel Gehrig wins ETH Medal for Best Master thesis!

Daniel Gehrig, former Master student and current PhD student in our lab, won the ETH Medal for his outstanding Master thesis! Congratulations! Check out his ECCV'18 paper here, which is based on his Master thesis.


May 8, 2019

"UltimateSLAM" nominated IEEE RAL Best Paper Award finalist (in the top 3 out of 520 papers!)

Our paper "Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios" was nominated finalist for the 2018 IEEE Robotics and Automation Letters Best Paper Award. Our paper is in the top 3 out of 520 papers published by RAL in 2018. Read the paper here and watch the video here.


May 7, 2019

Rapid Dynamic Obstacle Avoidance with Event Camera

Our paper "How Fast is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid" has been accepted for publication in the Robotics and Automation Letters (RA-L) 2019. We analyze the role of perception latency, sensing range and actuation limitatations on the maximum speed a robot can reach to safely navigate in an unknown environment. Our analysis is supported by experimental evaluation, where a quadrotor equipped with an event camera is able to avoid an obstacle moving towards it at 10 m/s. Read the paper here and watch the video here.


April 23, 2019

Survey paper on Event-based Vision!

Guillermo Gallego, Davide Scaramuzza, and 10 other international experts wrote a joint, 25-page survey paper on event cameras, covering their working principle, algorithms, and applications. Read the paper here.


April 09, 2019

RPG wins the Drone Hero Award 2019

Our foldable drone, the first quadrotor able to change its morphology in flight to adapt its shape and size to different tasks, won the 2019 Drone Hero Award in the Innovative Drone category. Read the paper here and watch the video here.


March 26, 2019

RPG Featured on The New York Times

Our research on autonomous drone racing was featured in The New York Times: "A drone from the University of Zurich is an engineering and technical marvel...". Check out the article!


March 15, 2019

CVPR 2019 Workshop on Event-based Vision and Smart Cameras - Call for Papers and Demos!

Call for papers and demos! On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) will organize the 2nd Workshop on Event-based Vision and Smart Cameras at CVPR in Long Beach.


Check out our speakers lineup.


March 11, 2019

RPG news reached over 200 million readers in 2018!

Our lab received impressive media coverage during 2018, reaching more than 200 million readers across the world.


February 25, 2019

Code Release - Feature Tracking Analysis for Event Cameras


We release a framework to evaluate feature tracking for event cameras. The code is implemented in Python and produces paper-ready plots and videos for event-based feature tracks.

Paper, YouTube, Code on Github.


January 29, 2019

RPG received huge media coverage

Our paper titled "The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly" received great attention from the media. It was covered by several newspaper and magazines, among which: TechCrunch, The Verge, CNET, La Repubblica, Tages Anzeiger, Popular Mechanics and IEEE Spectrum. Read the paper here and watch the video here.


January 18, 2019

CVPR 2019 Workshop on Event-based Vision

On June 16 and 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG) and Kostas Daniilidis (UPenn) will organize a workshop at CVPR in Long Beach about Event-based Vision.


Check out the speakers lineup on the workshop website.


December 18, 2018

Code Release - EMVS: Event-based Multi-View Stereo


We release the code for Event-based Multi-View Stereo (EMVS): 3D reconstruction with an event camera. The code provided is implemented in C++ and produces accurate, semi-dense depth maps without requiring any explicit data association or intensity estimation. The code runs in real-time on a CPU.

Paper, YouTube, Code on Github.


December 13, 2018

Paper accepted in RA-L 2018!


Our paper The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly has been accepted for publication in the Robotics and Automation Letters. Read the paper here and watch the video here.


December 6, 2018

Code Release - PAMPC: Perception-Aware Model Predictive Control for Quadrotors

We are excited to announce a new addition to our RPG control framework: our Perception-Aware Model Predictive Control (PAMPC) is now open source, and the code is available here. PAMPC combines control and planning in one solution, allowing the quadrotor not only to pursue an action objective but also to trade it off against a perception objective. Further details are available in our paper presented at IROS 2018. Video.


November 15, 2018

Open positions in Robotics, Machine Learning, Reinforcement Learning and Event-Based vision

We have several openings at both PhD and Postdoctoral levels in Robotics, Machine learning, Reinforcement Learning, Control, Computer Vision, Event Cameras, and beyond. Info and how to apply here.


November 3, 2018

We released ESIM: our open source event camera simulator


We release ESIM, our new event camera simulator. ESIM can simulate events accurately and efficiently, as well as other sensors such as a conventional camera (including motion blur!), and an inertial measurement unit (IMU). ESIM readily provides ground truth depth and optic flow maps. Multiple rendering engines are available, including a photorealistic rendering engine based on Unreal Engine, and a fast 3D engine based on OpenGL that can simulate events in real-time.

Paper, YouTube, Project Page, Code on Github.


November 1, 2018

CoRL Best Systems Paper Award


Our paper Deep Drone Racing: Learning Agile Flight in Dynamic Environments won the Best Systems Paper Award at the Conference on Robotic Learning (CoRL) 2018!


October 16, 2018

We released the paper with which we won IROS 2018 Autonomous Drone Race


By popular demand, we have released on arXiv the PDF of the paper describing the approach with which we won the IROS 2018 Autonomous Drone Race. Our approach fuses deep learning and optimal control to achieve the ultimate flight performance. For this reason, we titled the paper: Beauty and the Beast. Who is the Beauty and who is the Beast? ;-) Paper, YouTube.


October 3, 2018

RPG won the IROS 2018 Autonomous Drone Race


We are proud to announce that our team won the IROS Autonomous Drone Race Competition, passing all 8 gates in just 30 seconds! In order to succeed, we combined deep networks, local VIO, Kalman filtering, and optimal control. Watch our performance here.


September 26, 2018

Zuckerberg announced Zurich-Eye built Oculus Quest!


Mark Zuckerberg just announced the new Oculus VR headset, called Oculus Quest. This is what our former lab startup, Zurich Eye, now Oculus Zurich, has been working on for the past two years. Watch the video.


September 21, 2018

IROS 2018 Workshop: "Vision-based Drones: What's Next?"


On October 5, 2018, Giuseppe Loianno (New York University), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize a workshop at IROS in Madrid about "Vision-based Drones: What's Next?". Check out the speakers lineup on the workshop website.


September 21, 2018

RPG will take part in the IROS 2018 Autonomous Drone Race


Our lab will participate in the IROS 2018 Autonomous Drone Race in Madrid. Further details are available here.


September 2, 2018

RPG live demo at the Lange Nacht der Zürcher Museen


We performed a live quadrotor demo at the Kunsthalle Zürich during the Lange Nacht der Zürcher Museen, as part of the 100 Ways of Thinking show, in front of more than 200 people. Check out the media coverage here.


August 30, 2018

RPG featured on NZZ

Our research was featured in the Neue Zürcher Zeitung. Check out the article here.


August 28, 2018

Huge media coverage for search and rescue demonstration

Our lab received great Swiss media attention (NZZ, SwissInfo, SRF) for our live flight demonstration of a quadrotor entering a collapsed building to simulate a search and rescue operation. Check out the video here.


August 7, 2018

RPG demo featured on the Swiss Federal Office for Defence Procurement website

Our live demo of a quadrotor entering a collapsed building through a narrow gap was featured on the website of the Swiss Federal Office for Defence Procurement (armasuisse). More details are available here.


July 25, 2018

Two papers accepted at ECCV 2018!

Our papers on Asynchronous feature tracking using events and frames and on Stereo 3D reconstruction for SLAM have been accepted at ECCV 2018 in Munich! Check out our research page on event-based vision.


July 10, 2018

RPG research featured on NewScientist

Our research on autonomous drone racing was featured on NewScientist. Check out the article here.


July 5, 2018

Facebook keeps expanding in Zurich

Facebook-Oculus Zurich, formerly Zurich-Eye, keeps expanding in Zurich: 35 employees and growing at a rate of 3 new people per month. RPG is very proud of them! More info here.


June 13, 2018

Paper accepted in RA-L 2018!

Our paper about safe quadrotor navigation computing forward reachable sets was accepted for publication in the Robotics and Automation Letters (RA-L) 2018. Check out the PDF.


June 11, 2018

Paper accepted at RSS 2018!

Our paper about drone racing was accepted to RSS 2018 in Pittsburgh! Check out the long version, short version and the video!


June 10, 2018

Paper accepted at IEEE TRO

Our paper on Continuous-Time Visual-Inertial Odometry for Event Cameras has been accepted for publication in IEEE Transactions on Robotics. Check out the paper.


June 1, 2018

New Postdoc

We welcome Dr. Dario Brescianini as new Postdoc in our lab!


May 28, 2018

RPG receives 2017 IEEE Transactions on Robotics (TRO) best paper award



Our paper on IMU pre-integration received the 2017 IEEE Transactions on Robotics (TRO) best paper award at ICRA 2018 in Brisbane, Australia. Check out the paper here. Press coverage!


May 14, 2018

IEEE TRO Best Paper Award

We are proud to announce that our paper on IMU pre-integration will receive the 2017 IEEE Transactions on Robotics (TRO) best paper award. On this occasion, IEEE made the article open access for the next ten years! Press coverage

C. Forster, L. Carlone, F. Dellaert, D. Scaramuzza

On-Manifold Preintegration for Real-Time Visual-Inertial Odometry

IEEE Transactions on Robotics, vol 33, no. 1, pp. 1-21, Feb. 2017.

PDF DOI YouTube


May 11, 2018

Two papers accepted at CVPR 2018!

Our papers on A unifying contrast maximization framework for event cameras and on Steering angle prediction for self-driving cars with event cameras have been accepted at CVPR 2018 in Salt Lake City! Check out our research page on event-based vision.


May 10, 2018

Qualcomm Innovation Fellowship


Henri Rebecq, a PhD student in our lab, won a Qualcomm Innovation Fellowship with his proposal "Learning Representations for Low-Latency Perception with Frame and Event-based Cameras"!


April 26, 2018

Release of NetVLAD in Python/Tensorflow

We are happy to announce a Python/TensorFlow port of the full NetVLAD network, approved by the original authors and available here (see also our software/datasets page). The repository allows plug-and-play Python deployment of the best off-the-shelf model made available by the authors. We have thoroughly verified that the ported model produces output similar to the original Matlab implementation, as well as excellent place recognition performance on KITTI 00.


April 24, 2018

Release of Data-Efficient Decentralized Visual SLAM

We provide the code accompanying our recent Decentralized Visual SLAM paper. It contains a C++/Matlab simulation with all the building blocks for a state-of-the-art decentralized visual SLAM system. Check out the paper, the Video Pitch, the presentation, and the code.


April 23, 2018

Release of the Fast Event-based Corner Detector

We provide the code of our FAST event-based corner detector. Our implementation can process millions of events per second on a single core (less than a microsecond per event) and reduces the event rate by a factor of 10 to 20. Check out our Paper, video, and code.


April 23, 2018

Release of the RPG Quadrotor Control Framework

We provide a complete framework for flying quadrotors based on control algorithms developed by the Robotics and Perception Group. We also provide an interface to the RotorS Gazebo plugins to use our algorithms in simulation. Check out our software page and the Github repository for more details.


April 12, 2018

Davide Scaramuzza gives an invited seminar at Princeton University


March 14, 2018

Henri Rebecq finalist for the Qualcomm Innovation Fellowship

Henri Rebecq, PhD student in our lab, was a finalist for the Qualcomm Innovation Fellowship.


March 14, 2018

Christian Forster finalist in the Georges Giralt PhD Award!

Christian Forster, the first PhD student to graduate from RPG, ranked second out of 41 applicants in the 2017 edition of the Georges Giralt European PhD Thesis Award.


March 14, 2018

Huge media coverage for DroNet

Our recent work on teaching a drone to fly autonomously and safely in the streets of a city (PDF) received huge media coverage. Check out our media page.


March 07, 2018

RPG publishes event camera SLAM patent

Check out our publications page. Link from the PATENTSCOPE database.


March 02, 2018

RPG Impresses Swiss Government Representatives

RPG impresses Swiss minister Schneider-Ammann and other federal and cantonal representatives at the inauguration of the Innovation Park Switzerland. Check out here.


January 23, 2018

Davide Scaramuzza gives an invited talk at Microsoft Research Seattle

Check out the video here.


January 23, 2018

DroNet: Learning to Fly by Driving

We have devised a deep neural network, called DroNet, that teaches a drone to fly autonomously and safely in the streets of a city, among other vehicles, by imitating the behavior of cars and bicycles! Video, Paper, Datasets.


January 1, 2018

New Postdoc

We welcome Dr. Peng Lu as new Postdoc in our lab!


December 14, 2017

Paper accepted in RA-L

Our work on differential flatness of quadrotor dynamics subject to rotor drag has been accepted for publication in the Robotics and Automation Letters. Read the paper here and watch the video here.


December 14, 2017

RPG in the NCCR integrative demo of aerial and terrestrial robots

Our group took part in the NCCR Robotics integrative demo of aerial and terrestrial robots for rescue missions. Check out the video here.


December 4, 2017

New PhD Student

We welcome Mathias Gehrig as new PhD student in our lab!


November 24, 2017

RPG celebrates 5-year anniversary!

In this clip, we summarize our main achievements, projects, awards, exhibitions, and upcoming videos! Watch our YouTube video!


November 7, 2017

Paper accepted in IJCV!

Our work on 3D reconstruction with an event camera in real-time has been accepted for publication in the International Journal of Computer Vision. Read the paper here.


October 31, 2017

Paper accepted in T-PAMI!

Our work on 6-DOF ego-motion estimation with an event camera has been accepted for publication in IEEE Transactions on Pattern Analysis and Machine Intelligence. Check out the paper here.


October 11, 2017

Our paper on efficient decentralized visual place recognition was accepted to MRS 2017!

Our recent work on decentralized visual place recognition from full-image descriptors was accepted to MRS 2017, the first international symposium on multi-robot and multi-agent systems!

Check out the paper here.


October 9, 2017

RPG ranks 2nd at the IROS 2017 Autonomous Drone Race

We ranked 2nd at the IROS 2017 Autonomous Drone Race in Vancouver. Check out the video of our performance here and the official website of the competition here.


October 9, 2017

RPG nominated as finalist for the Best Paper Award on Safety, Security and Rescue Robotics at IROS 2017

Our paper "Rapid Exploration with Multi-Rotors: A Frontier Selection Method for High Speed Flight" was nominated as a finalist for the Best Paper Award on Safety, Security and Rescue Robotics at IROS 2017 in Vancouver. Check out the paper here.


October 6, 2017

Davide Scaramuzza talks about Swiss vision startups in the Swiss ICT journal

Check out the article here.


October 5, 2017

RPG research on DVS featured on IEEE Spectrum

Our latest work on quadrotor flight with event cameras was featured on IEEE Spectrum. For more details, check our research page.


September 22, 2017

Davide Scaramuzza gives invited talk at GeorgiaTech Robotics Seminar Series

Davide Scaramuzza talks about autonomous, agile, vision-controlled drones and event cameras at the GeorgiaTech robotics seminar series: check out the video here.


September 20, 2017

Press release: RPG drones use event cameras to fly faster and even in the dark!

This is the first ever autonomous flight with an event camera, which demonstrates agile maneuvers and flying in low-light environments. Possible applications could include supporting rescue teams with search missions at dusk or dawn. Check out the press release, our video and our paper.


September 18, 2017

RPG organizes IROS'17 workshop on fast, vision-controlled MAVs

Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the fourth international workshop on Vision-based High Speed Autonomous Navigation of UAVs.


August 30, 2017

Davide Scaramuzza appointed Tenured Associate Professor

Davide Scaramuzza was appointed Tenured Associate Professor with a double affiliation with the Institute of Neuroinformatics of the University of Zurich and ETH Zurich. Check out the news here.


August 29, 2017

Open position as Drone Research Engineer

We have a new opening in our team for a Drone Research Engineer. See our open positions for more details.


August 29, 2017

RPG affiliated with the Institute of Neuroinformatics

We are happy to announce that our lab is now affiliated with the Institute of Neuroinformatics (INI), a joint institute belonging to the University of Zurich and ETH Zurich.


August 11, 2017

RPG receives Best Student Paper Award Finalist prize at RSS

Philipp Foehn, PhD student in our lab, and Naveen Kuppuswamy, former visiting researcher, received the Best Student Paper Award Finalist prize at RSS 2017 in Boston for our work on trajectory optimization for agile quadrotor maneuvers with cable-suspended payloads. Check out the paper here and the video here.


July 14, 2017

Binaries for SVO 2.0 released

We are happy to release the binaries of our Semi-Direct Visual Odometry, SVO 2.0. It can run up to 400 frames per second on a modern laptop and execute in real-time on a smartphone processor. The binaries can be found here.


July 12, 2017

List of Event-based Vision Resources available

We are happy to start The List of Event-based Vision Resources, which contains links to event camera devices as well as papers, videos, code, etc. describing the algorithms and systems developed using this exciting technology.
We hope the list will help newcomers to the field to get started with this technology by directing them to the appropriate references. Help us improve the list by adding more entries! (Please follow the "Contributing" guidelines).


June 21, 2017

Code for image reconstruction from an event camera released

We are happy to announce the release of the code for recovering the brightness map that caused the events to be triggered. The code can be found here.


June 20, 2017

RSS'17 Best Student Paper Award Finalist

Our paper Fast Trajectory Optimization for Agile Quadrotor Maneuvers with a Cable-Suspended Payload, accepted for oral presentation at RSS'17, was nominated as a finalist for the Best Student Paper Award! Check out the paper here and the video here.


June 15, 2017

Slides and Videos of the ICRA17 International Workshop on Event-based Vision are out!

The slides and videos of the talks of the International Workshop on Event-based Vision are now available at the workshop website and the RPG Workshops Youtube channel.


June 15, 2017

Code for Event Lifetime released

We are happy to announce the release of the code for event lifetime estimation. The lifetime of an event is the time it takes for the moving brightness gradient causing the event to travel a distance of 1 pixel. More details in the following ICRA publication. The code can be found here.
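The definition above maps directly to a one-line computation: given the apparent motion of the brightness gradient in pixels per second, the lifetime is the inverse of its speed. A minimal illustrative sketch (our own, not the released implementation, which also estimates the flow from the event stream):

```python
def event_lifetime(flow_x, flow_y):
    """Event lifetime in seconds: the time for the brightness gradient
    that triggered the event to travel one pixel, i.e. the inverse of
    the optical-flow magnitude (given here in pixels per second)."""
    speed = (flow_x ** 2 + flow_y ** 2) ** 0.5  # pixels per second
    if speed == 0.0:
        raise ValueError("no apparent motion: lifetime is undefined")
    return 1.0 / speed

# A gradient moving at 100 px/s gives each event a 10 ms lifetime:
print(event_lifetime(100.0, 0.0))  # -> 0.01
```

Augmenting each event with such a lifetime turns the sparse event stream into short tracks whose duration adapts to the scene motion, which is what the released code exploits for sharp, motion-compensated visualizations.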


May 12, 2017

Davide Scaramuzza talks at Maker Festival of his home town!

Prof. Davide Scaramuzza gave a talk at the Maker Festival in Terni about the role of computer vision in autonomous cars and drones. Check the news here (Italian).


May 5, 2017

RPG wins the 2017 Misha Mahowald Prize for Neuromorphic Engineering!

The Robotics and Perception Group wins the 2017 Misha Mahowald Prize, which recognizes outstanding achievement in the field of neuromorphic engineering. Check out the press release here and watch our summary video on Event-based Vision for Autonomous High-Speed Robotics.


April 20, 2017

Our PhD student Antonio Loquercio wins ETH Medal for Best Master thesis!

Antonio Loquercio, PhD student in our lab, won the ETH Medal for his outstanding Master thesis! Congratulations!


April 4, 2017

The Zurich Urban Micro Aerial Vehicle Dataset released!

We are happy to announce the release of the first public, large-scale dataset recorded with a drone in an urban environment at low altitudes (5-15 m). The 2 km dataset consists of time-synchronized aerial high-resolution images, GPS and IMU sensor data, ground-level street-view images, and ground truth data. The dataset is ideal to evaluate and benchmark appearance-based localization, monocular visual odometry, simultaneous localization and mapping (SLAM), and online 3D reconstruction algorithms for MAVs in urban environments. Go to the dataset webpage.


April 3, 2017

Our paper on Volumetric Information Gain Metrics for Active 3D Reconstruction has been published in Autonomous Robots!

Our recent work comparing volumetric information gain metrics for object reconstruction is part of the Autonomous Robots special issue on Active Vision.

Check out the paper here.


April 3, 2017

Our paper on Collaborative Transport with Multiple MAVs appeared on Discovery Channel

Our paper on Collaborative Transport with Multiple MAVs appeared on Discovery Channel Canada. Check out the paper here.


March 31, 2017

New Postdoc and PhD Student

We welcome Dr. Suseong Kim, as new Postdoc, and Philipp Foehn, as new PhD student, in our lab!


March 24, 2017

Fotokite won EUrobotics transfer award

Fotokite, the Swiss startup developing tethered drones, which was incubated within RPG in 2014 through the NCCR Spin Fund, won the 2017 EUrobotics Tech Transfer Award! Congratulations! We are very proud of you!


February 24, 2017

Davide Scaramuzza's CMU seminar on IEEE Spectrum

Davide Scaramuzza's seminar on visual-inertial state estimation, active vision, and event-based vision at CMU was featured on IEEE Spectrum. Watch the video on YouTube.


February 17, 2017

Davide Scaramuzza gives invited talk at CMU Robotics Seminar Series

Davide Scaramuzza talks about visual-inertial state estimation, active vision, and event-based vision at CMU Robotics Institute Seminar Series: YouTube, Abstract.


February 16, 2017

RPG research among the 3 most popular news of the University of Zurich of 2016

Our research on autonomous drones was featured in the University of Zurich Journal, ranking 3rd on the list of the most popular news releases of 2016. Check out the UZH Journal, page 5.


February 15, 2017

Magic Leap buys Dacuda's 3D Division

Dacuda's 3D Division, a long-standing collaborator of RPG, was acquired by Magic Leap, the unicorn of Augmented Reality (news here). One amazing result of our collaboration was software that runs on a smartphone and delivers an immersive virtual-reality experience comparable to that of high-end VR headsets. This software was the result of a great project with RPG, which was demoed at CES 2017 (LINK). Congratulations, guys!


February 15, 2017

RPG organizes ICRA'17 Workshop on Event-based Vision

Davide Scaramuzza, Andrea Censi (MIT), and Guillermo Gallego (RPG-UZH) are organizing the first International Workshop on Event-based Vision.


February 6, 2017

Open Postdoc positions in Deep Learning and Computer Vision for Robotics

For info and applications, please see here.


February 1, 2017

New PhD Student

Welcome to Antonio Loquercio as a new PhD student in our lab!


January 9, 2017

Our paper on efficient decentralized visual place recognition was accepted to RA-L!

Our recent work on decentralized visual place recognition using a distributed inverted index was accepted to RA-L!

Check out the paper here.


December 20, 2016

Our paper EVO (Event-based, 6-DOF Parallel Tracking and Mapping in Real-Time) was accepted to RA-L!

Our recent work on real-time parallel tracking and mapping with an event camera was accepted to RA-L!

Check out the video here, and the paper here.


December 20, 2016

Our paper on Accurate Angular Velocity Estimation with an Event Camera was accepted to RA-L!

Our recent work on motion estimation with an event camera by contrast maximization was accepted to RA-L!

Check out the video here, and the paper here.


December 14, 2016

Our research on aggressive flight through narrow gaps with active monocular vision is on MIT Technology Review

Our recent work on quadrotor flight through narrow gaps using only onboard sensing and computing is featured on MIT Technology Review. Click here to read the article.


December 3, 2016

Our paper on Agile Quadrotor Flight through Narrow Gaps available on arXiv

Our recent work "Aggressive Quadrotor Flight through Narrow Gaps with Onboard Sensing and Computing" is available on arXiv for download. [Link]


November 19, 2016

RPG featured in the World Robotics report of the International Federation of Robotics

Our lab was featured in the 2016 World Robotics report of the International Federation of Robotics as an outstanding research lab in service robotics. Check out the report here.


November 14, 2016

Open PhD and Postdoc positions in Deep Learning, Control, and Robot Vision

We have several open PhD student and Postdoc positions in Deep Learning, Control, and Robot Vision for Agile, Vision-based Quadrotor Flight. For more info and applications, please see here.


November 11, 2016

Facebook-Oculus VR Zurich (aka Zurich-Eye) is in the main Swiss news!

Zurich-Eye, the Wyss-Zurich project co-founded in Sep. 2015 by former RPG members Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger, was featured in the Swiss news.


October 27, 2016

Event-Camera Dataset and Simulator released!

We are happy to announce the release of the first public datasets recorded with an event camera (DAVIS) for pose estimation, visual odometry, and SLAM applications! The data also include intensity images, inertial measurements, ground truth from a motion-capture system, synthetic data, as well as an event camera simulator! We believe that event cameras will allow future robots to move faster and more agilely. Find out more on the dataset website!


October 24, 2016

Fritz-Kutter Award for Industry Related Thesis in Computer Science

Our former student and current Research Assistant Timo Horstschäfer won the Fritz Kutter Award for Industry Related Thesis in Computer Science with his Master Thesis "Parallel Tracking, Depth Estimation, and Image Reconstruction with an Event Camera". This is the second time in two years that an RPG master student wins this prestigious award! Congratulations!


October 14, 2016

IROS 16 Best Application Paper Award Finalist

Our paper Low-Latency Visual Odometry using Event-based Feature Tracks was nominated as Finalist for the Best Application Paper Award at IROS 2016. Also, it was selected as highlight oral talk, with an acceptance rate of 0.4%. Congratulations to Beat Kueng, Elias Mueggler and Guillermo Gallego!


October 14, 2016

IROS'16 workshop organized by Davide Scaramuzza attracts 200 people worldwide!

Our third international workshop on Vision-based High Speed Autonomous Navigation of UAVs, co-organized by Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn), featured an impressive lineup of renowned speakers, live demos, and industry exhibitors, and attracted more than 200 attendees from around the world!


October 5, 2016

RPG research featured on IEEE Spectrum and Robohub

Our latest work on quadrotor flight through narrow gaps was featured on IEEE Spectrum and Robohub. For more details, check our research page.


October 3, 2016

New Visiting PhD Student

We welcome Rubén Gómez Ojeda from University of Málaga as new visiting PhD student in our lab!


September 27, 2016

Agile Flight through Narrow Gaps!


Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing: LINK


September 22, 2016

Best BMVC'16 Industry Paper Award!


Our paper "EMVS: Event-based Multi-View Stereo" received the BMVC'16 Best Industry Paper Award! Congratulations to Henri Rebecq and Guillermo Gallego!


September 14, 2016

Zurich-Eye now part of Facebook-Oculus VR Zurich!

Zurich-Eye, the Wyss-Zurich project co-founded in Sep. 2015 by former RPG members Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger, is now part of Oculus VR Zurich. RPG is very proud of them! This highlights the importance and impact of the great work they have done!


September 12, 2016

New paper accepted at BMVC16!


Our paper "EMVS: Event-based Multi-View Stereo" about monocular 3D reconstruction using an event camera has been accepted for oral presentation at BMVC'16!

Check out our publication list.


September 2, 2016

RPG organizes IROS'16 workshop on fast, vision-controlled MAVs

Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the third international workshop on Vision-based High Speed Autonomous Navigation of UAVs.


September 1, 2016

New drone engineer!

Welcome to Alessandro Simovic as a new drone engineer in our lab!


September 1, 2016

New visiting Assistant Professor

Welcome to Stefano Ghidoni from the University of Padua as a visiting assistant professor in our lab!


August 19, 2016

Trained in 60 seconds...

RPG and IDSIA have collaborated on a new paper, which will be presented at ISER 2016. We show that we can train a terrain classifier for search and rescue scenarios while our quadrotor is in flight, in only one minute! Results can be seen in this YouTube video, and details can be found in our publication list.

August 10, 2016

Check out our SLAM position paper!

Cesar Cadena, Luca Carlone, Henry Carrillo, Yasir Latif, Davide Scaramuzza, Jose Neira, Ian Reid, and John Leonard have co-authored a paper on Simultaneous Localization And Mapping: Present, Future, and the Robust-Perception Age. Check it out!


July 25, 2016

RPG members win reading group competition at ICVSS


Zichao Zhang and Titus Cieslewski were in the reading group that won the reading group competition at the International Computer Vision Summer School. One of the main ingredients for this success was an experimental literature visualization tool developed at the RPG group retreat.


July 20, 2016

Open position as Drone Research Engineer

We have an open position in our team for a Drone Research Engineer. Check out our open positions.


July 14, 2016

New papers accepted for EBCCSP'16 and IROS'16

Check out our publication list.


May 25, 2016

Software Release: Information Gain Based Active Reconstruction Framework

The software package corresponding to the paper An Information Gain Formulation for Active Volumetric 3D Reconstruction is now available from our Github page. This is a general, open-source framework for volumetric reconstruction that is object-, sensor-, and robot-agnostic. Some results can be seen in this YouTube video.


May 23, 2016

New visiting researcher from Toyota Research Institute Boston

Welcome to Dr. Naveen Kuppuswamy, our new visiting researcher from Toyota Research Institute Boston (MA)!


May 10, 2016

Qualcomm Innovation Fellowship


Elias Mueggler, a PhD student in our lab, won a Qualcomm Innovation Fellowship with his proposal "Event-based Vision for High-Speed Robotics"!


March 7, 2016

Four new papers accepted for RA-L/ICRA'16!

Check out our publication list.


February 12, 2016

RPG receives huge media coverage worldwide!

We received huge media coverage for our research on autonomous drone navigation in forests using Deep Neural Networks; among these, Discovery Channel Canada and NBC News.


February 11, 2016

RPG on IEEE Spectrum and Robohub

Our recent work on autonomous navigation in the forests using Deep Neural Networks makes it to IEEE Spectrum and Robohub.


February 10, 2016

RPG on SRF Tagesschau

Our recent work on autonomous navigation in the forests using Deep Neural Networks makes it to the Swiss National TV News channel: SRF Tagesschau.


February 10, 2016

Our drones use DNNs to learn to recognize and follow forest trails in search for missing people

This research appeared in IEEE Robotics and Automation Letters, will be presented at the IEEE International Conference on Robotics and Automation (ICRA'16), and was nominated for the AAAI Best Video Award. Journal paper. More info. YouTube video.


January 15, 2016

Davide Scaramuzza Associate Faculty of the Wyss Zurich

Davide Scaramuzza was appointed Associate Faculty at Wyss Zurich, the new translational center of UZH and ETH Zurich dedicated to regenerative and robotics technologies.


January 14, 2016

One of our students receives a prize from Homegate at HackZurich!

Titus Cieslewski, a PhD student in our lab, has received a prize from Homegate during the HackZurich hackathon, for the project Wonsch. He and his team were among the top 3 of the 20 teams that participated in the Homegate challenge during the hackathon.


January 1, 2016

New PhD student

Welcome to Titus Cieslewski as a new PhD student in our lab!

January 1, 2016

Davide Scaramuzza from Magic to Robotics!

NCCR Robotics interviews Davide Scaramuzza about his academic path, from working as a magician in theaters and public squares to pay for his undergraduate studies to becoming a robotics professor (LINK).


November 4, 2015

RPG at the Swiss Robotics Industry day

NCCR Robotics organized the Swiss Robotics Industry day at EPFL Lausanne. We showed the collaboration of a flying robot with a legged robot from ETH Zurich. Some highlights are shown in a video by Le Matin.


October 31, 2015

Interview with Davide Scaramuzza on Robohub

An interview with Davide Scaramuzza was published on Robots Podcast and Robohub. Check out our media page!


October 2, 2015

RPG organizes IROS'15 workshop on visual navigation of MAVs

Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the second international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs at IROS'15.


October 2, 2015

IROS'15 workshop organized by Davide Scaramuzza attracts 200 people worldwide!

Our second international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs, co-organized by Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn), featured an impressive lineup of renowned speakers, live demos, and industry exhibitors, and attracted more than 200 attendees from around the world!


October 2, 2015

Davide Scaramuzza gives tutorial on event-based vision at IROS'15 workshop

Davide Scaramuzza gives a tutorial on event-based vision at the IROS'15 workshop on Alternative Sensing for Robot Perception: Beyond Laser and Vision. The slides can be downloaded from here.


September 15, 2015

New research assistant

Welcome to Michael Gassner as a new research assistant in our lab!


September 5 and 6, 2015

RPG at Scientifica 2015

The Robotics and Perception Group showed its research at Scientifica, the science fair of ETH and the University of Zurich. 25,000 visitors attended the event. Have a look at the gallery!


September 2, 2015

RPG organizes IROS'15 workshop on visual navigation of MAVs

Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the second international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs at IROS'15.


September 1, 2015

RPG members create spinoff Zurich-Eye

Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger create Zurich-Eye, a spinoff project of Wyss-Zurich dedicated to the commercialization of visual-inertial navigation solutions.


July 13, 2015

RSS'15 Best Paper Award Finalist

Christian Forster's RSS'15 paper is Best Paper Award Finalist at RSS'15!


June 29, 2015

Two new papers accepted for RSS'15, one for ECMR'15!

Check out our publication list.


June 15, 2015

New PhD student

Welcome to Henri Rebecq as a new PhD student in our lab!

May 30, 2015

ICRA'15 workshop on Innovative Sensing for Robotics

Andrea Censi (MIT) and Davide Scaramuzza organized a workshop on Innovative Sensing for Robotics at ICRA'15!


April 14, 2015

RPG research featured on IEEE Spectrum

Our latest work on failure recovery from aggressive flight was featured on IEEE Spectrum. For more details, see the ICRA'15 paper and the accompanying video.


April 14, 2015

New PhD student

Welcome to Davide Falanga as a new PhD student in our lab!

March 30, 2015

Three new papers accepted for ICRA'15!

Check out our publication list.


March 20, 2015

RPG at CeBIT

RPG showcased its autonomous quadrotors and live 3D reconstruction at this year's CeBIT, the world's largest computer expo! See some pictures in our gallery.


March 15, 2015

3 new journal papers accepted!

Check out our publication list.


February 1, 2015

RPG celebrates 3-year anniversary!

In this clip, we summarize our main achievements, projects, awards, exhibitions, and upcoming videos! Watch our YouTube video!


November 21, 2014

Davide Scaramuzza wins ERC Starting Grant (1.5 million EUR)

Davide Scaramuzza wins ERC Starting Grant, through the Swiss National Science Foundation.


October 29, 2014

Best Paper Award Finalists

Our SSRR'14 paper on "Aerial-guided Navigation of a Ground Robot among Movable Obstacles" was selected as Finalist for the Best Paper Award. Our ICRA'14 paper "REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time" was nominated as Finalist for the NCCR Best PostDoc Paper Award.


October 21, 2014

RPG Master student wins Fritz Kutter Award!

Our former Master student Basil Huber won the 2014 Fritz Kutter Award for Industry Related Thesis in Computer Science. His thesis was on High-Speed Pose Estimation using a Dynamic Vision Sensor. Congratulations!


October 7, 2014

RPG research featured on IEEE Spectrum

Our latest work on event-based vision was featured on IEEE Spectrum. For more details, see the IROS'14 paper and the accompanying video.


October 1, 2014

New Postdoc and visiting students

We welcome Dr. Manuel Werlberger as new Postdoc in our lab! We are also happy to host Junije Zhang and Zichao Zhang as visiting PhD students.


September 9, 2014

New paper accepted for SSRR'14!

Check out our publication list.


September 6, 2014

Attend our live quadrotor demo at ECCV workshop on Sep. 6

We will demonstrate autonomous, vision-based flight and live dense 3D mapping with a quadrotor MAV at the ECCV workshop on Computer Vision in Vehicle Technology on Sep. 6 at 5:30pm. Watch the video preview here.


August 1, 2014

New Postdocs and visiting students

We welcome Dr. Guillermo Gallego and Dr. Jeff Delmerico as new Postdocs in our lab! We are also happy to host Antonio Toma, Gabriele Costante, Nathaly Gasparin, Ra'Eesah Mangera, Kumar Shaurya Shankar and Xin Yu as visiting students this year.


June 17, 2014

RPG gets Google Tango

Many thanks to Google Tango!


June 4, 2014

Davide Scaramuzza wins IEEE Robotics and Automation Early Career Award!


Davide Scaramuzza wins the 2014 IEEE Robotics and Automation Society Early Career Award "for his major contributions to robot vision and visually-guided micro aerial vehicles".


June 4, 2014

RPG wins KUKA Innovation Award!


The Robotics and Perception Group wins the KUKA Innovation Award (20,000 EUR) with its demonstration of collaborating flying and ground robots for search-and-rescue missions. Watch a video of the demo at AUTOMATICA.


June 2, 2014

Software Release: SVO - Semi-Direct Visual Odometry

The software corresponding to the paper SVO: Fast Semi-direct Monocular Visual Odometry can now be downloaded from our Github page. The source code is released under a GPLv3 license. A professional edition license for closed-source projects is also available.


June 1, 2014

RPG appears among the European robotics success stories

RPG appears among the 12 best European robotics success stories promoted by the EU Commission. Read the full article.


May 30, 2014

RPG research featured on MIT News

Our latest work on event-based vision was featured in the MIT News. For more details, see the ICRA'14 paper.


May 17, 2014

Two new papers accepted for RSS'14 and IROS'14!

Check out our publication list.


February 18, 2014

Davide Scaramuzza wins Google Faculty Research Award!

Many thanks to Google!


January 14, 2014

Eight new papers accepted for ICRA'14!

Check out our publication list.


December 2, 2013

Amazon Prime Air visits RPG

Daniel Buchmüller, co-founder and software engineer at Amazon Prime Air, visited us.

December 2, 2013

RPG featured on SRF 10vor10

The Robotics and Perception Group was featured in the news programme 10vor10 of the Swiss National TV (SRF). Check the video!

November 26, 2013

KUKA Best Student Project Award 2013!

Benjamin Keiser won the KUKA Best Student Project Award 2013 with his Master thesis Torque Control of a KUKA youBot Arm that he did with the Robotics and Perception Group. A demonstration of the capabilities of his controller is shown in this video.

November 20, 2013

RPG featured on arte X:enius

The Robotics and Perception Group was featured in a documentary on the German-French TV channel ARTE. Check both the German and French version!

November 19, 2013

Henri Seydoux, CEO of Parrot, visits RPG

Henri Seydoux, CEO and founder of Parrot, the company making the popular toy quadrocopter AR.Drone, visited us.

November 6, 2013

Open position in Embedded Computer Vision

Open position at the Robotics and Perception Group. Check here.

November 5, 2013

RPG organizes IROS'13 Workshop on MAVs

We are organizing the first international workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments, featuring amazing live flight demonstrations!

November 3, 2013

RPG presents four papers at IROS'13

We will be presenting papers in sessions Localization II (Nov. 4) and Unmanned Aerial Vehicles IV (Nov. 5).

August 31, 2013

RPG was featured in a report about Scientifica 2013 on Radio SRF 1

The report can be found here.

August 1, 2013

New Postdoc

We welcome Sergei Lupashin as a new Postdoc in our lab!

July 31, 2013

Five new papers accepted for IROS'13 and ECMR'13!

Check out our publication list.

June 1, 2013

New Postdoc

We welcome Reza Sabzevari as a new Postdoc in our lab!

May 4, 2013

Join RPG at ICRA'13

RPG will give two talks at ICRA'13: one by Christian Forster (ThDInt.14) and one by Davide Scaramuzza (link).

April 29, 2013

A Documentary about RPG in the Swiss TV

A documentary by the Swiss TV about our autonomous helicopters for search and rescue (link to original article).

April 20, 2013

RPG participates at the Swiss Robotics Festival

The Swiss Robotics Festival is the largest Swiss robotics exhibition, which this year attracted more than 20,000 participants. Watch some impressions from our demonstrations! (link)

April 1, 2013

New PhD student

Welcome to Flavio Fontana as a new PhD student in our lab!

March 31, 2013

RPG featured in the IEEE Spectrum News and Gizmodo

Our Easter video featuring a ground and an aerial robot was featured in IEEE Spectrum News and Gizmodo, the famous technology review blogs. Read the articles: IEEE Spectrum and Gizmodo.

February 28, 2013

RPG featured in the Weltwoche magazine

Weltwoche, a Swiss weekly magazine, talks about drones in the daily life. Read the article: "Der Spion von deinem Fenster" (German only).

December 1, 2012

Three new visiting PhD Students

Welcome to Chiara Troiani from INRIA, Volker Grabe from Max Planck Institute, and Damiano Verda from University of Genoa, who will join us for six months.

December 1, 2012

Two new PhD Students

Welcome to Matthias Faessler and Elias Mueggler as new PhD students in our lab!

December 19, 2012

Student Projects Available for Spring 2013

UZH and ETH Students are welcome to apply for a project at our lab. — More information

November 1, 2012

Four new members

We welcome Dr. Matia Pizzoli and Dr. Andras Majdik, our two new postdocs, and Yanhua Jiang and Volker Grabe, our new visiting PhD students!

September 16, 2012

New visiting postdoc

Welcome to Dr. Andrea Censi, our new visiting postdoc for three months!

September 1, 2012

Davide Scaramuzza at TEDxZurich

Prof. Davide Scaramuzza has been invited to give a talk at TEDxZurich on October 25, 2012.

July 7, 2012

RSS Workshop

Join the talk about Christian Forster's work on Collaborative Visual SLAM with Multiple MAVs at the RSS workshop on integration of perception with control and navigation for resource-limited, highly dynamic, autonomous systems.

May 17, 2012

Davide Scaramuzza wins European Young Researcher Award 2012

Congratulations! Download the press release.

May 1, 2012

New PhD Student

We welcome Christian Forster as new PhD student in our lab!

April 23, 2012

sFly on IEEE News

The European project sFly, coordinated by Davide Scaramuzza, gets a lot of media attention and is featured on IEEE News. Read it here.

Past Video Highlights

June 11, 2020

Watch our drone fly very agile acrobatic maneuvers! Read our Deep Drone Acrobatics paper for further details.

May 25, 2020

Watch our performance scoring second at the AlphaPilot Challenge! Read our RSS 2020 paper for further details.

March 18, 2020

Watch our drone play dodgeball using an event camera! Read our Science Robotics paper for further details.

Jan 14, 2020

Watch how an event camera is used to reconstruct video at an arbitrary frame rate and thus observe fast phenomena. Video reconstruction is done using a recurrent neural network trained only in simulation! Read our T-PAMI paper for further details.

Oct 7, 2019

Watch how we taught a drone to autonomously race a track it had never seen before! The system runs fully onboard and is powered by a neural network trained in a non-photorealistic simulator, which was deployed on the real drone without any fine-tuning! Read our T-RO paper for further details.

Oct 1, 2019

Watch this 27-gram nano drone avoid obstacles using a neural network running on a 63 mW Parallel Ultra-Low-Power Processor! Read our IEEE IoT journal paper for further details.

May 7, 2019

Watch an autonomous drone dodge a ball thrown at it at 10 m/s. An event-based camera is used to detect the ball with millisecond latency. Read the paper for further details.

December 13, 2018

We present the first foldable drone that can guarantee stable flight in any configuration. It can squeeze itself to fly through narrow gaps. Read the paper for further details.

December 6, 2018

We release the code of Perception-Aware Model Predictive Control (PAMPC). PAMPC allows drones to navigate a trajectory while keeping a point of interest (a gate, a gap, texture) visible. Paper.

October 16, 2018

We combined deep networks, local VIO, Kalman filtering, and optimal control to achieve ultimate speed for autonomous drone racing. Paper describing the approach.

October 3, 2018

Our performance at the IROS'18 Autonomous Drone Race Competition, where we won 1st place, passing all 8 gates in just 30 seconds and outracing the 2nd-place team by a factor of 2! We combined deep networks, local VIO, Kalman filtering, and optimal control. Paper describing the approach.

June 18, 2018

Event cameras allow predicting the steering angle of a car more robustly and accurately at night and in high-dynamic-range scenes than a standard camera. Paper.

March 23, 2018

Watch the first ever autonomous quadrotor flight with an event camera using our UltimateSLAM. RAL'18 paper.

March 22, 2018

UltimateSLAM combines images, events, and IMU to achieve the ultimate visual SLAM performance: up to 85% accuracy improvement over VIO with standard cameras! Paper. Project webpage.

January 23, 2018

DroNet is a Deep Neural Network architecture that makes drones able to fly autonomously and safely in the streets of a city, among other vehicles, by imitating the behavior of cars and bicycles! Video, Paper, Datasets.

May 19, 2017

Check out our latest work on active exposure control for robust visual odometry in high dynamic range environments: ICRA'17 paper.

May 5, 2017

We are awarded the 2017 Misha Mahowald Prize, which recognizes outstanding achievement in the field of neuromorphic engineering. Press release.

April 4, 2017

We release the first public, large-scale dataset recorded with a drone in an urban environment at low altitudes (5-15m). Dataset here.

December 20, 2016

Check out our latest work, EVO, on Event-based, 6-DOF Parallel Tracking and Mapping in Real-time: RA-L'16 paper.

December 20, 2016

Check out our latest work on rotational motion estimation with an Event Camera: RA-L'16 paper.

September 27, 2016

Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing: More info here.

September 12, 2016

Check out our latest work on Event-based Multi-View Stereo, which uses a single, continuously moving event camera for accurate 3D reconstruction! BMVC'16 paper.

August 19, 2016

Our latest work on search and rescue robotics is a system for training a terrain classifier "on-the-spot" in only 60 seconds. Our flying robot can then use this classifier to guide a ground robot through a disaster area. Details are in our ISER'16 paper.

July 19, 2016

We designed an event-based 6-DOF pose tracking pipeline with a latency of 1 microsecond using the DVS sensor for very high-speed (>500 deg/s) and high-dynamic-range (>130 dB) applications, where all standard cameras fail. All the details are in our arXiv paper.

July 19, 2016

We designed an event-based 6-DOF visual odometry pipeline with a latency of 1 microsecond using the DAVIS sensor. All the details are in our IROS'16 paper and EBCCSP'16 paper.

February 10, 2016

We used Deep Neural Networks to teach our drones to recognize and follow forest trails to search for missing people. Journal Paper. More info.

May 25, 2016

Our active volumetric reconstruction software framework is now released open source. More details in our ICRA'16 paper.

March 30, 2015

Our latest work on failure recovery from aggressive flight and how to launch a quadrotor by throwing it in the air! ICRA'15 paper.

March 30, 2015

Our latest work on autonomous landing-site detection and landing with onboard monocular vision! ICRA'15 paper.

February 1, 2015

To celebrate our lab's 3-year anniversary, we summarize in this clip our main achievements, projects, awards, exhibitions, and upcoming videos!

October 24, 2014

Our latest work on Aerial-guided Navigation of a Ground Robot among Movable Obstacles. More details in our SSRR'14 paper.

October 7, 2014

Our latest work on Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles. More details in our RSS'14 paper.

September 15, 2014

Our latest work on event-based vision: 6-DOF Pose Tracking for High-Speed Maneuvers. More details in our IROS'14 paper.

September 6, 2014

Our quadrotor demo trailer: autonomous navigation, live dense 3D reconstruction, and collaborative grasping.

June 6, 2014

Our demo at the KUKA Innovation Award, showing the collaboration of flying and ground robots for search-and-rescue missions.

February 25, 2014

Autonomous Vision-based Flight over a Disaster Zone using SVO (more details)

February 19, 2014

SVO - our new visual odometry pipeline for MAV state estimation. More details in our ICRA'14 paper.

February 19, 2014

Our latest work on probabilistic, monocular dense reconstruction in real time. More details in our ICRA'14 paper.

February 19, 2014

Our monocular pose estimation system that is released as open-source. More details in our ICRA'14 paper.

November 26, 2013

Torque Control of a KUKA youBot Arm (Master thesis of Benjamin Keiser)

November 20, 2013

RPG was featured on the German-French TV channel ARTE in their science programme X:enius. The French version is available here.

August 7, 2013

Watch the video for our new IROS'13 paper "Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction".

August 7, 2013

Check out our new IROS'13 paper "MAV Urban Localization from Google Street View Data".

August 7, 2013

Watch the video for our new IROS'13 paper "Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles".

October 25, 2012

Autonomous Vision-Controlled Micro Flying Robots: Davide Scaramuzza at TEDxZurich.