December 29, 2022
It's an honor to be featured in the top 10 robotics stories of 2022 by IEEE Spectrum! Kudos and congratulations to our team that made this possible!
December 27, 2022
We won the NCCR Robotics Most Impactful Paper Award with the paper "A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots". Congrats to Alessandro Giusti and his co-authors!
December 24, 2022
After 12 amazing years, NCCR Robotics, the Swiss National Centre of Competence in Research in Robotics, has come to an end. I'm very proud to have been part of this! This RoboHub article summarizes the key achievements: assistive technologies that allowed patients with completely paralyzed legs to walk again; winning the DARPA SubT Challenge; legged and flying robots with self-learning capabilities for use in disaster mitigation as well as in civil and industrial inspection; robotic startups that have become world leaders; the creation of Cybathlon, the world's first Olympic-style competition for athletes with disabilities supported by assistive devices; and educational robots, such as Thymio, that have been used by thousands of children around the world. Congrats to all NCCR Robotics members who have made this possible! NCCR Robotics will continue to operate in four different projects. Check out this article to learn more: link.
December 16, 2022
We present the first survey on visual SLAM for visually impaired people. This technology has tremendous potential to assist people, and it will be used for the first time in the next Cybathlon competition, in which we will participate. For more information, have a look at our paper and the Cybathlon website.
December 1, 2022
This week, we celebrate the 10th anniversary of RPG! This video celebrates our anniversary and the over 300 people who worked in our lab as BSc/MSc/PhD students, postdocs, and visiting researchers, as well as all our collaborators, our research sponsors, and the administrative staff at our university. We thank all of them for contributing to our research. And thank you as well for following our research. The lab made important contributions to autonomous, agile, vision-based navigation of micro aerial vehicles and to event cameras for mobile robotics and computer vision. Three startups and entrepreneurial projects came out of the lab: the first one, Zurich Eye, became Facebook-Meta Zurich, which contributed to the development of the VR headset Oculus Quest; the second one, Fotokite, makes tethered drones for first responders; the third one, SUIND, makes vision-based drones for precision agriculture. Our researchers have won over 50 awards, many of them paper awards, have published more than 100 scientific articles, which have been cited more than 35 thousand times, and have been featured in many media outlets, including The New York Times, Forbes, and The Economist (media page). We have also released more than 85 open-source software packages, datasets, and toolboxes to accelerate scientific progress and the reproducibility of our research (software page). Our algorithms have inspired and have been transferred to many products and companies, including NASA, DJI, Bosch, Nikon, Magic Leap, Meta-Facebook, Huawei, Sony, and Hilti. Thank you for making all this possible! Video.
November 30, 2022
Can you guess who wrote a paper, just by reading it? We present a transformer-based AI that achieves over 70% accuracy on the newly created, largest-to-date authorship-attribution dataset, with over 2000 authors. For more information, check out our paper and open-source code.
November 23, 2022
We introduce various design principles that push the limits of asynchronous graph-based object detection from events by allowing us to design deeper, more powerful models without sacrificing efficiency. While our smallest such model outperforms the best asynchronous methods by 7.4 mAP with 3.7 times higher efficiency, our largest model even outperforms dense, feedforward methods, a feat previously unattained by asynchronous methods. For more information, check out our paper.
November 7, 2022
In the recent NZZ Format documentary on military drones, our lab is featured in its role as a civil research institution working on possible dual-use technology. Our search-and-rescue technology is shown to underline the huge potential of drones in critical missions, possibly saving many lives. Link
November 7, 2022
Our autonomous vision-based drones were featured in the SRF Tagesschau (05.11.2022) report on the NCCR Swiss Robotics Day in Lausanne. We demonstrate how the technology we develop can be used in GPS-denied environments that are commonly encountered in, for example, search-and-rescue scenarios. YouTube [DE], YouTube [IT], SRF [DE], RSI [IT]
October 28, 2022
The Robotics and Perception Group participated in the parabolic flight campaign of the UZH Space Hub to study how gravity affects the decision-making of human drone pilots.
October 14, 2022
We released the code and datasets for our work "Data-Efficient Collaborative Decentralized Thermal-Inertial Odometry" with NASA JPL, extending the already-public JPL xVIO library. With this work, we unleash collaborative drone swarms in the dark, opening new challenging scenarios for the robotics community. For more details, visit the project page.
September 16, 2022
Congratulations to our former Master student Michelle Ruegg for winning the NCCR Robotics Master Thesis Award for her thesis on combining frames and events for asynchronous multi-modal monocular depth prediction! The thesis was supervised by Daniel Gehrig and Mathias Gehrig.
September 1, 2022
We warmly welcome Nikola Zubić as a new research assistant in our lab!
August 2, 2022
We are excited to announce our ECCV paper, which overcomes the lack of semantic segmentation datasets for event cameras by directly transferring the semantic segmentation task from existing labeled image datasets to unlabeled events. Our approach neither requires video data nor per-pixel alignment between images and events. For more details, check out the paper, video, code, and dataset.
August 1, 2022
We warmly welcome Vincenzo Polizzi as a new research assistant in our lab!
July 7, 2022
On June 10-11, we organized the first race between an AI-powered vision-based drone and human pilots. We invited two world champions and the Swiss champion. Read this report by Evan Ackerman of IEEE Spectrum, who witnessed the historic event in person.
July 6, 2022
We are releasing UltimateSLAM, which combines events, frames, and IMU measurements to achieve the ultimate SLAM performance in high-speed and high-dynamic-range scenarios. Paper Code Video Project Webpage
July 5, 2022
Do not miss our IROS2022 Workshop: Agile Robotics: Perception, Learning, Planning, and Control! Check out the agenda and join the presentations at our workshop website. Organized by Giuseppe Loianno, Davide Scaramuzza, and Shaojie Shen.
June 17, 2022
We are excited to announce that our paper A Comparative Study of Nonlinear MPC and Differential-Flatness-Based Control for Quadrotor Agile Flight was accepted at T-RO 2022. Our work empirically compares two state-of-the-art control frameworks, the nonlinear model predictive controller (NMPC) and the differential-flatness-based controller (DFBC), by tracking a wide variety of agile trajectories at speeds up to 72 km/h. Read the paper for further details.
June 16, 2022
We release the Hilti SLAM Challenge Dataset! The sensor platform used to collect this dataset contains a number of visual, lidar, and inertial sensors, which have all been rigorously calibrated. All data is temporally aligned to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor calibration data from twelve datasets in various environments is provided. Each environment represents common scenarios found in building construction sites in various stages of completion. For more details, check out the paper, video and talk.
June 1, 2022
We welcome Drew Hanover and Chao Ni as new PhD students in our lab!
May 27, 2022
We are honored that our IEEE Robotics and Automation Letters paper "Autonomous Quadrotor Flight Despite Rotor Failure With Onboard Vision Sensors: Frames vs. Events" was selected for the Best Paper Award. Congratulations to all collaborators!
May 20, 2022
We are looking forward to presenting these 9 papers on perception, learning, planning, and control in person next week at IEEE RAS ICRA! Additionally, we will be presenting in many workshops. A full list with links, times, and rooms can be found here
May 5, 2022
The University of Zurich celebrated its 189th birthday. During the celebrations, Rector Prof. Michael Schaepman named drones that fly faster than humans, used as a testbed for AI research and for search-and-rescue operations, one of three key UZH findings of 2021. A video of the speech can be found here (at 26:00 he starts to talk about drones).
May 4, 2022
We are excited to announce that our paper on Model Predictive Contouring Control for Time-Optimal Quadrotor Flight was accepted at T-RO 2022. Thanks to our Model Predictive Contouring Control, the problem of flying through multiple waypoints in minimum time can now be solved in real-time. Read the paper for further details.
May 2, 2022
We welcome Dr. Marco Cannici as a new postdoc in our lab!
February 28, 2022
Planning minimum-time trajectories for quadrotors in the presence of obstacles had so far remained unaddressed by the robotics community. We propose a novel method to plan such trajectories in cluttered environments using a hierarchical, sampling-based approach with an incrementally more complex quadrotor model. The proposed method is shown to outperform all related baselines in cluttered environments and is further validated in real-world flights at over 60 km/h. Check out our paper, video, and code.
February 17, 2022
In this work, we systematically compare the advantages and limitations of the discrete-time and continuous-time vision-based SLAM formulations. We perform an extensive experimental analysis, varying robot type, speed of motion, and sensor modalities. Our experimental analysis suggests that, independently of the trajectory type, continuous-time SLAM is superior to its discrete counterpart whenever the sensors are not time-synchronized. For more details, check out the paper and code.
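To make the discrete/continuous distinction concrete, here is a toy sketch of a continuous-time trajectory representation, using simple linear interpolation and slerp rather than the B-splines or Gaussian processes typically used in continuous-time SLAM; its point is that any (unsynchronized) sensor timestamp can be queried directly.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

# Toy continuous-time trajectory: linear interpolation for position, slerp for
# orientation. This is a simplification for illustration, not our system.
class ContinuousTrajectory:
    def __init__(self, stamps, positions, rotations):
        self.stamps = np.asarray(stamps, dtype=float)        # (N,) seconds
        self.positions = np.asarray(positions, dtype=float)  # (N, 3) meters
        self.slerp = Slerp(self.stamps, rotations)           # rotations: scipy Rotation with N poses

    def pose_at(self, t):
        # Query the pose at the exact timestamp of any sensor measurement.
        p = np.array([np.interp(t, self.stamps, self.positions[:, i]) for i in range(3)])
        return p, self.slerp([t])[0]

traj = ContinuousTrajectory(
    stamps=[0.0, 1.0],
    positions=[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
    rotations=Rotation.from_euler("z", [0.0, 90.0], degrees=True),
)
p, R = traj.pose_at(0.25)  # pose a quarter of the way along the segment
```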
February 15, 2022
Multirotor aerial robots are becoming widely used for the inspection of powerlines. To enable continuous, robust inspection without human intervention, the robots must be able to perch on the powerlines to recharge their batteries. This paper presents a novel perching trajectory generation framework that computes perception-aware, collision-free, and dynamically feasible maneuvers to guide the robot to the desired final state. For more details, check out the paper and video. The developed code is available online: code.
February 9, 2022
The mechanical simplicity, hover capabilities, and high agility of quadrotors have led to their fast adoption in industry for inspection, exploration, and urban aerial mobility. On the other hand, the unstable and underactuated dynamics of quadrotors render them highly susceptible to system faults, especially rotor failures. In this work, we propose a fault-tolerant controller using nonlinear model predictive control (NMPC) to stabilize and control a quadrotor subjected to the complete failure of a single rotor. Check out our paper and video.
February 4, 2022
We are delighted to announce the standing leaderboard of the UZH-FPV drone racing dataset. Participants submit the results of their VIO algorithms and receive the evaluation in a few minutes thanks to our automatic code evaluation. For more details, check out the website! We look forward to receiving your submissions to advance the state of the art of VIO for high-speed state estimation.
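As an illustration of the kind of metric such a benchmark computes, here is a minimal sketch of the absolute trajectory error on time-aligned positions (the actual evaluation pipeline, including its trajectory alignment, may differ):

```python
import numpy as np

def ate_rmse(p_est, p_gt):
    """Root-mean-square absolute trajectory error between time-aligned (N, 3)
    position estimates and ground truth. Real benchmarks usually also align
    the two trajectories (e.g. with a similarity transform) beforehand."""
    err = np.asarray(p_est) - np.asarray(p_gt)
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```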
February 2, 2022
To overcome the shortage of event-based datasets, we propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data. Our method transfers from single images to events and does not rely on paired sensor data. Thus, our approach unlocks the vast amount of image datasets for the training of event-based neural networks. For more details, check out the paper, video, and code.
January 31, 2022
Tired of tuning your controllers by hand? Check out our RAL22 paper "AutoTune: Controller Tuning for High Speed Flight". We propose a gradient-free method based on Metropolis-Hastings sampling that automatically finds parameters maximizing the performance of a controller during high-speed flight. We outperform both existing methods and human experts! Check out the paper, video, and code.
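A minimal sketch of such a Metropolis-Hastings parameter search follows; it assumes a hypothetical rollout_cost(theta) function that flies (or simulates) the controller and returns a tracking cost, and illustrates the general technique rather than AutoTune's exact implementation.

```python
import numpy as np

def mh_tune(rollout_cost, theta0, n_iters=200, step=0.05, temperature=1.0):
    """Gradient-free Metropolis-Hastings search over controller parameters.
    rollout_cost(theta) -> scalar cost; a user-supplied evaluation function."""
    rng = np.random.default_rng(0)
    theta, cost = np.asarray(theta0, dtype=float), rollout_cost(theta0)
    best_theta, best_cost = theta.copy(), cost
    for _ in range(n_iters):
        proposal = theta + step * rng.standard_normal(theta.shape)  # random-walk proposal
        c = rollout_cost(proposal)
        # Accept better samples always; worse ones with Boltzmann probability.
        if c < cost or rng.random() < np.exp((cost - c) / temperature):
            theta, cost = proposal, c
            if cost < best_cost:
                best_theta, best_cost = theta.copy(), cost
    return best_theta, best_cost
```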
January 28, 2022
Excited to see our research on event cameras featured in The Economist! Check it out!
January 10, 2022
Our press release on time-optimal trajectory planning from July 2021 made it into the top 10 most successful media releases of UZH in 2021, just behind the media release on the FDA-approved Alzheimer's drug! Check it out!
January 10, 2022
We propose E-RAFT, a novel method to estimate dense optical flow from events only, alongside DSEC-Flow, an extension of DSEC for optical flow estimation. Download the datasets and submit to the DSEC-Flow benchmark that automatically evaluates your submission. For more details, check out the paper, video, and project webpage. Our code is available on GitHub.
December 20, 2021
Congratulations to Philipp Foehn, who successfully defended his PhD dissertation, titled "Agile Aerial Autonomy: Planning and Control", on December 14, 2021. We thank the reviewers: Prof. Moritz Diehl, Prof. Luca Carlone, and Prof. Roland Siegwart!
The full video of the PhD defense presentation is on YouTube.
December 15, 2021
We propose a novel method to merge reinforcement learning and model predictive control. Our approach enables a quadrotor to fly through dynamic gates. The paper has been accepted for publication in the IEEE Transactions on Robotics (T-RO), 2022. Check out our paper and the code.
December 9, 2021
We release the code of our ICRA 2019 paper Event-based, Direct Camera Tracking from a Photometric 3D Map using Nonlinear Optimization. The code is implemented in C++ and runs in real-time on a laptop. Try it out for yourself on GitHub!
November 1, 2021
The goal of the Tartan SLAM Series is to expand the understanding of those both new and experienced with SLAM. Sessions include research talks, as well as introductions to various themes of SLAM and thought-provoking open-ended discussions. The lineup of events aims to foster fun, provocative discussions on robotics. In his talk, Davide Scaramuzza speaks about our lab's main advances in SLAM over the past years. He also introduces event cameras and speaks about their potential applications in visual SLAM. Check out the slides and the video on YouTube!
October 21, 2021
We are excited to release the fully open-source SVO Pro! SVO Pro is the latest version of SVO, developed over the past few years in our lab. SVO Pro features support for different camera models, active exposure control, a sliding-window-based backend, and global bundle adjustment with loop closure. Check out the project page and the code on GitHub!
October 20, 2021
We present an efficient bio-inspired event-camera-driven depth sensing algorithm. Instead of uniformly sensing the depth of the scene, we dynamically illuminate areas of interest densely, depending on the scene activity detected by the event camera, and sparsely illuminate areas in the field of view with no motion. We show that, in natural scenes like autonomous driving and indoor environments, moving edges correspond to less than 10% of the scene on average. Thus our setup requires the sensor to scan only 10% of the scene, which could lead to almost 90% less power consumption by the illumination source. For more details, check out the paper and video.
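A toy sketch of the activity measurement underlying this idea: accumulate events into a per-pixel count and threshold it to find the (typically small) fraction of the field of view that actually needs dense illumination. This is an illustrative simplification, not the paper's algorithm.

```python
import numpy as np

def active_fraction(events_xy, height, width, min_events=2):
    """Fraction of pixels that saw motion. events_xy is an (N, 2) integer
    array of (x, y) pixel coordinates of the events in a time window."""
    counts = np.zeros((height, width), dtype=np.int32)
    np.add.at(counts, (events_xy[:, 1], events_xy[:, 0]), 1)  # index as (row=y, col=x)
    active = counts >= min_events
    # In natural scenes, this fraction is often around 0.1, as reported above.
    return active.mean(), active
```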
October 20, 2021
We have three fully-funded openings for PhD students and Postdocs in computer vision and machine learning to contribute to the areas of:
October 10, 2021
Check out the interview from the Swiss Italian TV LA1 on our research on drone racing and high-speed navigation. We explain why high-speed drones could make a difference in the future of search and rescue operations. In Italian with English subtitles!
October 6, 2021
We are excited to share our latest Science Robotics paper, done in collaboration with Intel! An end-to-end policy trained in simulation flies vision-based drones in the wild at up to 40 km/h! In contrast to classic methods, our approach uses a CNN to directly map images to collision-free trajectories. This radically reduces latency and sensitivity to sensor noise, enabling high-speed flight. The end-to-end policy has taken our drones on many adventures in Switzerland! Check out the video on YouTube! We also release the code and datasets on GitHub!
October 1, 2021
We are excited to release the code accompanying our latest Science Robotics paper on time-optimal quadrotor trajectories! This provides an example implementation of our novel progress-based formulation to generate time-optimal trajectories through multiple waypoints while exploiting, but not violating, the quadrotor's actuation constraints. Check out our real-world agile flight footage with explanations, find the details in the paper on Science Robotics, and find the code on GitHub.
October 1, 2021
Do not miss our IROS2021 Workshop: Integrated Perception, Learning, and Control for Agile Super Vehicles! Check out the agenda and join the presentations at our workshop website. Organized by Giuseppe Loianno, Davide Scaramuzza, and Sertac Karaman.
The workshop is today, October 1st, and starts at 3 pm Zurich time (GMT+2).
October 1, 2021
We release the Hilti SLAM Challenge Dataset! The sensor platform used to collect this dataset contains a number of visual, lidar, and inertial sensors, which have all been rigorously calibrated. All data is temporally aligned to support precise multi-sensor fusion. Each dataset includes accurate ground truth to allow direct testing of SLAM results. Raw data as well as intrinsic and extrinsic sensor calibration data from twelve datasets in various environments is provided. Each environment represents common scenarios found in building construction sites in various stages of completion. For more details, check out the paper and video.
September 26, 2021
Our work on controlling a quadrotor after motor failure with only onboard vision sensors (paper) is the winner of the Aerospace and Defense category in the 2021 Tech Briefs "Create the Future" contest, out of over 700 participants worldwide! Watch the announcement of all the winners and finalists here.
September 13, 2021
Honored that our IEEE Transactions on Robotics 2020 paper "Deep Drone Racing: From Simulation to Reality with Domain Randomization" was selected as a Best Paper Award finalist! Congratulations to all collaborators on this great achievement! PDF YouTube 1 YouTube 2 Code
September 2, 2021
RPG and HILTI are organizing the IROS2021 HILTI SLAM Challenge! Participants can win up to $10,000 in prize money and a keynote invitation to an IROS workshop! Instructions here. The HILTI SLAM Challenge dataset is a real-life, multi-sensor dataset with accurate ground truth to advance the state of the art in highly accurate state estimation in challenging environments. Participants will be ranked by the completeness of their trajectories and by the achieved accuracy. HILTI is a multinational company that offers premium products and services for professionals on construction sites around the globe. Behind this vast catalog is a global team of 30,000 members of 133 different nationalities, located in more than 120 countries.
August 29, 2021
We propose a novel method to estimate dense optical flow from events only, alongside an extension of DSEC for optical flow estimation. Our approach takes inspiration from frame-based methods and outperforms previous event-based approaches with up to 66% EPE reduction. For more details, check out the paper and video.
August 20, 2021
We propose a method that uses event cameras to robustly track lines and show an application for powerline tracking. Our method identifies lines in the stream of events by detecting planes in the spatio-temporal signal, and tracks them through time. For more details, check out the paper and video. We release the code fully open source.
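A minimal sketch of the plane-fitting idea (illustrative, not the released implementation): a line moving with roughly constant velocity traces a plane in the (x, y, t) event volume, so a least-squares plane fit both models the line and, via its residuals, selects which events belong to it.

```python
import numpy as np

def fit_event_plane(x, y, t):
    """Least-squares plane t = a*x + b*y + c through events (x_i, y_i, t_i).
    (a, b) encode the apparent motion of the line in the image plane."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, t, rcond=None)
    residuals = t - A @ coeffs  # small residuals -> event consistent with the line
    return coeffs, residuals
```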
August 17, 2021
The series Real Roboticist, produced by the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), shows the people at the forefront of robotics research from a more personal perspective. In his talk, Davide Scaramuzza explains his journey from electronics engineering to leading a top robotics vision research group developing a promising technology: event cameras. He also speaks about the challenges he faced along the way, and even how he combines robotics research with another of his passions: magic. Read the article and watch the talk. Enjoy!
August 6, 2021
CARLA is the world-leading simulator for autonomous driving, developed by Intel. Our lab contributed to the implementation of the optical flow camera, requested by the community since the inception of the simulator. Check out the release video for a short teaser and the documentation for more information on how to use it.
July 21, 2021
We are excited to announce our latest work on agile flight, allowing us to generate time-optimal quadrotor trajectories that are faster than human drone racing pilots! Our novel algorithm, published in Science Robotics, uses a progress-based formulation to generate time-optimal trajectories through multiple waypoints while exploiting, but not violating, the quadrotor's actuator constraints. Check out our real-world agile flight footage with explanations and find the details in the paper on Science Robotics.
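To give a flavor of what "progress-based" means, here is a schematic of such a formulation in our own notation (not necessarily the paper's exact one): the final time t_N is minimized subject to the discretized dynamics f, input bounds, and per-waypoint progress variables λ_k^j that can only be consumed (by μ_k^j) when the position p_k is within a tolerance d_tol of waypoint w_j.

```latex
\begin{aligned}
\min_{\mathbf{x},\,\mathbf{u},\,t_N}\quad & t_N \\
\text{s.t.}\quad
& \mathbf{x}_{k+1} = f(\mathbf{x}_k, \mathbf{u}_k), \qquad \mathbf{u}_{\min} \le \mathbf{u}_k \le \mathbf{u}_{\max}, \\
& \lambda^j_{k+1} = \lambda^j_k - \mu^j_k, \qquad \lambda^j_0 = 1, \quad \lambda^j_N = 0, \\
& \mu^j_k \left( \lVert \mathbf{p}_k - \mathbf{w}_j \rVert^2 - d_{\mathrm{tol}}^2 \right) \le 0 .
\end{aligned}
```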
June 30, 2021
We are excited to announce our new indoor drone-testing arena! Equipped with a real-time motion-capture system consisting of 36 Vicon cameras and with a flight space of over 30x30x8 meters (7,000 cubic meters), this large research infrastructure allows us to deploy our most advanced perception, learning, planning, and control algorithms to push vision-based agile drones to speeds over 60 km/h and accelerations over 5g. It also allows us to fly in an unlimited number of virtual environments using hardware-in-the-loop simulation. Among the many projects we are currently working on, we aim to beat the best professional human pilot in a drone race. Turn up the volume and enjoy the video! And stay tuned... the best is yet to come... very soon!
June 30, 2021
We release EVO, an Event-based Visual Odometry algorithm from our RA-L paper EVO: Event-based, 6-DOF Parallel Tracking and Mapping in Real-Time. The code is implemented in C++ and runs in real-time on a laptop. Try it out for yourself on GitHub!
June 25, 2021
We are happy to announce the release of the full dataset associated with our upcoming RSS paper NeuroBEM: Hybrid Aerodynamic Quadrotor Model. The dataset features over 1 h 15 min of highly aggressive maneuvers recorded at high accuracy in one of the world's largest optical tracking volumes. We provide time-aligned quadrotor states and motor commands recorded at 400 Hz in a curated dataset. For more details, check out our paper, dataset, and video.
June 25, 2021
Our work on GPU-optimized feature detection and tracking is now available as a simple ROS node. It implements GPU-optimized FAST, Harris, and Shi-Tomasi detectors and KLT tracking, running at hundreds of FPS on a Jetson TX2. For more details, check out our paper Faster than FAST and the code.
June 11, 2021
TimeLens is a new event-based video frame interpolation method that generates high speed video from low framerate RGB frames and asynchronous events. Learn more about TimeLens over at our project page where you can find code, datasets and more! We also release a High-Speed Event and RGB dataset which features complex scenarios like bursting balloons and spinning objects!
June 10, 2021
On June 4, 2021, Antonio Loquercio (RPG), Davide Scaramuzza (RPG), Luca Carlone (MIT), and Markus Ryll (TUM) organized the 1st International Workshop on Perception and Action in Dynamic Environments at ICRA.
May 18, 2021
Do not miss our #ICRA2021 workshop on Perception and Action in Dynamic Environments! Check out the agenda and join the presentations at our workshop website. Organized by Antonio Loquercio, Davide Scaramuzza, Markus Ryll, and Luca Carlone.
The workshop is on June 4th and starts at 4 pm Zurich time (GMT+2).
May 18, 2021
We are delighted to announce our CVPR event-based vision workshop competition on disparity/depth prediction on the new DSEC dataset. Visit our website for more details about the competition. Submission deadline is the 11th of June.
May 18, 2021
Congratulations to our lab director, Davide Scaramuzza, for being listed among the 100 most influential robotics scholars by AMiner [ Link ].
May 11, 2021
Congratulations to Antonio Loquercio, who successfully defended his PhD dissertation, titled "Agile Autonomy: Learning Tightly-Coupled Perception-Action for High-Speed Quadrotor Flight in the Wild", on May 10, 2021. We thank the reviewers: Prof. Pieter Abbeel, Prof. Angela Schoellig, and Prof. Roland Siegwart!
The full video of the PhD defense presentation is on YouTube.
May 10, 2021
Our paper Deep Drone Racing: from Simulation to Reality with Domain Randomization wins the prestigious IEEE Transactions on Robotics Best Paper Award Honorable Mention: PDF YouTube 1 YouTube 2 Code
May 7, 2021
We propose a generic event camera calibration framework using image reconstruction. Check out our Code and PDF.
April 30, 2021
We have organized a challenge to push the current state of the art in agile navigation in dynamic environments. In this challenge, drones have to avoid moving boulders while flying in a forest! The deadline for submission is June 1st! The winner will be awarded a Skydio 2! Participate now at https://uzh-rpg.github.io/PADE-ICRA2021/ddc/!
April 26, 2021
Our research inspired the design of the vision-based navigation technology behind the Ingenuity helicopter that flew on Mars. Read the full article on SwissInfo [ English ], [ Italian ].
April 23, 2021
Our lab is collaborating with NASA/JPL to investigate event cameras for the next Mars helicopter missions! Read the full interview on SwissInfo with Davide Scaramuzza [ Link ].
April 23, 2021
Davide Scaramuzza talks about "Autonomous, Agile Micro Drones: Perception, Learning, and Control" at GRASP on Robotics seminar series organized by the GRASP laboratory at University of Pennsylvania. In this talk, he shows how the combination of both model-based and machine learning methods united with the power of new, low-latency sensors, such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing. Watch the presentation! Enjoy!
April 19, 2021
We present Super-Human Performance in GTS Using Deep RL and Autonomous Overtaking in GTS Using Curriculum RL. Check out the Website.
April 14, 2021
DSEC is a new driving dataset with stereo VGA event cameras, RGB global shutter cameras, and disparity ground truth from lidar. Download DSEC now to reap the benefits of this multi-modal dataset with high-quality calibration. We also accompany the dataset with code and documentation. Check out our video and paper too! Stay tuned for more!
March 18, 2021
We present Autonomous Drone Racing with Deep RL, the first learning-based method that can achieve near-time-optimal performance in drone racing. Check out the Preprint and the Video.
March 15, 2021
We are organizing a #ICRA2021 workshop on perception and action in dynamic environments! We brought together amazing keynote speakers and also organized a competition on drone navigation in a forest (the prize is a Skydio 2)! All we need is you! Check out our website here for more info and the current list of invited speakers.
March 8, 2021
Our work on Visual Processing and Control in Human Drone Pilots has been accepted in the IEEE Robotics and Automation Letters. Check out our Video, the Paper, and Open-Source Dataset too!
February 19, 2021
Our event camera simulator ESIM now features python bindings and GPU support for fully parallel event generation! Check out our project page, code and paper.
February 12, 2021
Our work on combining events and frames using recurrent asynchronous multimodal networks has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the project page, and the source code.
February 12, 2021
Our work on data-driven MPC for quadrotors has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the video, and the source code.
February 09, 2021
Our latest work on autonomous quadrotor flight despite rotor failure with onboard vision sensors (frames or event cameras) was featured on IEEE Spectrum. For more details, read the paper here and watch the video here. Source code here.
January 25, 2021
We are organizing the "3rd Workshop on Event-based Vision", which will take place in June at CVPR2021. The paper submission deadline is March 27. Check out our website here for more info and the current list of invited speakers.
January 14, 2021
Davide Scaramuzza and some of the lab's members talk about our work on drone racing in the new Flying Arena. Watch Davide Scaramuzza's interview here. Watch Elia Kaufmann's interview here. Watch Christian Pfeiffer's interview here.
January 13, 2021
Our work on controlling a quadrotor after motor failure with only onboard vision sensors has been accepted in the IEEE Robotics and Automation Letters. Check out the paper, the video, and the source code.
January 12, 2021
Our work on generating accurate reference poses for visual localization datasets has been accepted in the International Journal of Computer Vision. Check out the paper here; the Aachen Day-Night v1.1 dataset used in the paper can be accessed via the online visual localization benchmark service.
January 11, 2021
We are super excited to announce SUIND, our latest spin-off! Leveraging years of research in our lab, SUIND is building a groundbreaking safety suite for drones. Proud to see our former members Kunal Shrivastava and Kevin Kleber making a true impact in the industry! Read more here.
December 4, 2020
Congratulations to Titus Cieslewski, who has successfully defended his PhD dissertation titled "Decentralized Multi-Agent Visual SLAM", on Nov. 30, 2020. We thank the reviewers: Prof. Marc Pollefeys and Prof. Torsten Sattler!
Titus' major contributions have been:
November 30, 2020
Autonomous cars equipped with event cameras are now possible with our new CARLA simulator plugin! It is based on ESIM and available since CARLA 0.9.10. The sensor generates synthetic events in photorealistic self-driving scenarios at any temporal resolution. You can use our script here to generate reliable data in CARLA and create your own dataset. A step forward in bringing event cameras into autonomous driving research!
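A minimal usage sketch, following the CARLA documentation for the DVS sensor (sensor.camera.dvs) introduced in 0.9.10; it assumes a CARLA server is running and a vehicle has already been spawned.

```python
import numpy as np
import carla

# Connect and attach the DVS (event camera) sensor to an existing vehicle.
client = carla.Client("localhost", 2000)
world = client.get_world()
bp = world.get_blueprint_library().find("sensor.camera.dvs")
transform = carla.Transform(carla.Location(x=1.5, z=2.4))  # roughly on the windshield
vehicle = world.get_actors().filter("vehicle.*")[0]        # assumes a vehicle exists
dvs = world.spawn_actor(bp, transform, attach_to=vehicle)

def on_events(data):
    # Per the CARLA docs, each element of the buffer is an event (x, y, t, pol).
    events = np.frombuffer(
        data.raw_data,
        dtype=np.dtype([("x", np.uint16), ("y", np.uint16),
                        ("t", np.int64), ("pol", np.bool_)]),
    )
    print(f"{len(events)} events in this packet")

dvs.listen(on_events)
```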
November 28, 2020
We have released the code of our 3DV 2020 paper Learning Monocular Dense Depth from Events, where we propose a supervised learning approach using a recurrent network to leverage the temporal consistency in the events and estimate single-camera dense depth! The code is available on GitHub!
November 18, 2020
Davide Scaramuzza talks about Learning to Fly at the Robotics Today seminar series organized by MIT and Stanford. He covers topics ranging from perception to planning and control, and from model-based to model-free autonomy. He shows how to learn sensorimotor policies end-to-end directly in simulation, which transfer to real drones without any fine-tuning thanks to the use of appropriate sensory abstractions. He also talks about the role of simulation. Finally, he shows the latest and greatest on event cameras to enable low-latency agile flight. Watch the presentation! Enjoy!
November 18, 2020
Are you curious about the people behind the robots? The 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) features a new Original Series called Real Roboticist hosted by Sabine Hauert, President of Robohub and faculty at University of Bristol, where she interviews different roboticists individually. Watch the one with Davide Scaramuzza!
November 2-3, 2020
Join our IROS 2020 workshop "Perception, Learning, and Control for Autonomous Agile Vehicles" on November 2 and 3 via Zoom, organized by Giuseppe Loianno, Davide Scaramuzza, and Sertac Karaman. It will cover both ground and flying super-agile vehicles. We have an incredible line of speakers from both academia and industry! We will award $500 USD to the best workshop paper. A Field Robotics special issue will be organized after the workshop (submissions open to everyone). WHEN: November 2nd and November 3rd, 2020, from 16:00 to 20:00 Zurich time (10 am to 2 pm New York time). WHERE: Zoom. Workshop webpage.
October 26, 2020
We have just released the code of our NeurIPS 2020 paper Primal-Dual Mesh Convolutional Neural Networks. Our network architecture obtains state-of-the-art results in the tasks of shape classification and segmentation! If you are interested in 3D data processing and geometric deep learning, you should try our code out! The code is available at this page!
October 05, 2020
We release the open-source code for High-MPC: Learning High-level Policies for Model Predictive Control. Check out the video and paper for more details.
September 26, 2020
We have three fully-funded openings for PhD students and Postdocs in computer vision and machine learning to contribute to the areas of:
September 21, 2020
The UZH FPV dataset has a new home at fpv.ifi.uzh.ch! This new website features an automatically evaluated benchmark. Submit your VIO output until September 27th to participate in the IROS 2020 FPV VIO competition!
September 08, 2020
We release the open-source code for the Fisher Information Field, an efficient and differentiable map for perception-aware planning. Check out the video and paper for more details.
September 03, 2020
We release our open-source quadrotor simulator Flightmare: A Flexible Quadrotor Simulator. Check out the Website for more details.
July 27, 2020
Event cameras enable fast image processing and nimble movement. Here is the Link to the article.
July 26, 2020
Do you want to know how to make autonomous drones fly acrobatic maneuvers? Check out this blog post! Teaser: all you need is a drone simulator!
July 23, 2020
We released our open-source example implementation of VIMO, which extends VINS-Mono to estimate external forces along with the robot state. Instructions to run the code are in the GitHub repository.
July 16, 2020
Our paper AlphaPilot: Autonomous Drone Racing (Paper, Presentation) won the Best System Paper Award at RSS! Additionally, our paper Deep Drone Acrobatics (Paper, Presentation) was a finalist for the Best Paper Award.
June 24, 2020
Is "robotics research over-reliant on benchmark datasets and simulation"? Check out here what RPG director Davide Scaramuzza has to say: Video. The debate had over 1100 online viewers!
June 11, 2020
Drones with on-board sensing and computation can now fly agile acrobatic maneuvers! Check out our RSS 2020 paper Deep Drone Acrobatics to understand what made this possible! A video of the experiments is available at this link. If you also want to do acrobatics with drones, please check out the project's code!
June 2, 2020
We have several fully-funded openings for PhD students and Postdocs in control, path planning, aerodynamic modelling, numerical optimization, computer vision, and machine learning to contribute to the areas of:
June 2, 2020
The winner is OKVIS 2.0 by the Smart Robotics Lab, Imperial College London, closely followed by OpenVINS from the University of Delaware, with OSU-ETHZ, a joint team from the Ohio State University and ETH Zurich, on a close third rank. See detailed results on the dataset page.
May 28, 2020
Philipp Foehn and Davide Scaramuzza from RPG, along with Varun Murali and Sertac Karaman from MIT, organize the 2nd RSS Workshop on Perception and Control for Fast and Agile Super-Vehicles. We will have a great line of speakers from academia and industry, but we also accept manuscripts.
Abstract deadline: June 14.
Notification of acceptance: June 21.
Workshop: July 12, 2020, virtual event.
Workshop webpage: https://mit-fast.github.io/WorkshopRSS20SuperVehicles/
Please email all submissions to super-vehicles-rss20-submit@mit.edu with RSS20 Super Vehicles in the subject line.
May 28, 2020
The competition will be held jointly with the 6th edition of the IROS 2020 Workshop on "Perception, Learning, and Control for Autonomous Agile Vehicles".
The participants will run their VIO algorithms on datasets (including images, IMU measurements, and event data) recorded with an FPV drone racing quadrotor flown by an expert pilot at speeds up to and over 20 m/s. More information at https://fpv.ifi.uzh.ch.
May 27, 2020
Tim Taubner, who did his Master's thesis, Competitive Drone Racing via Pass-Block Games, at both Stanford University and RPG, received the ETH Medal 2020 and the Willi Studer Prize for the best student in the ETH Master's program in Robotics, Systems and Control in the period March 2019-2020.
May 25, 2020
Check out our performance at the 2019 AlphaPilot Challenge, now on video. Additionally, we present our paper at RSS 2020, describing the approach combining learning and model-based techniques with which we ranked second in the 2019 AlphaPilot Challenge.
May 24, 2020
We released the code of our paper "Event-Based Angular Velocity Regression with Spiking Networks". The spiking neural network, implemented in PyTorch and CUDA, regresses the angular velocity of an event camera. Along with the code, there are instructions to download the datasets. Check out the github repository.
April 1, 2020
We have just released the code of our paper "Video to Events: Recycling Recycling Video Dataset for Event Cameras". Use our code implemented in CPP and Python to generate artificial events from standard video. The code can be found here.
March 30, 2020
Our new GPU-optimized FAST detector is available on GitHub. It implements a GPU-specific novel non-maximum suppression and an enhanced FAST detector, achieving over 1000 fps on a Jetson TX2. Check out our paper Faster than FAST.
March 20, 2020
This documentary (in German only) demonstrates our research on how autonomous drones can be used for search and rescue (the video starts at 15:50).
March 18, 2020
Our Science Robotics paper on "Dynamic obstacle avoidance for quadrotors with event cameras" conquers the coverpage of Science Robotics March issue! PDF, Video.
March 8, 2020
Henri Rebecq is a finalist in the 2019 edition of the George Giralt European PhD Thesis Award, out of 60 applications.
February 18, 2020
We have openings at both PhD and Postdoctoral levels in Vision-based Control for Agile Flight, such as autonomous drone racing. More info here.
February 13, 2020
We have just released the code of our RA-L and ICRA paper A General Framework for Uncertainty Estimation in Deep Learning. Our framework can compute uncertainties for every network architecture, does not require changes in the optimization process, and can be applied to already trained architectures. Our framework's code is available at this page!
February 7, 2020
We are excited to release many driving datasets recorded in the context of our T-PAMI paper High Speed and High Dynamic Range Video with an Event Camera. The datasets consist of a number of sequences that were recorded with a VGA (640x480) event camera (Samsung DVS Gen3) and a conventional RGB camera (Huawei P20 Pro) placed on the windshield of a car driving through Zurich. The driving datasets are available at this page.
December 10, 2019
The European Research Council awards Davide Scaramuzza a Consolidator Grant (2 million Euros) for a research project that will use event cameras to improve the performance of flying robots in rescue operations. Press release by UZH Press release by the EU commission List of ERC grantees Description of the ERC program
December 6, 2019
The team composed of Dario Brescianini, Philipp Foehn, Elia Kaufmann, and Mathias Gehrig ranks 2nd in the AlphaPilot Autonomous Drone Racing World Championship! Congratulations! (UZH News, HeroX)
December 5, 2019
Congratulations to Davide Falanga, who has successfully defended his PhD dissertation titled "Agile, Vision-Based Quadrotor Flight: from Active, Low-Latency Perception to Adaptive Morphology", on Dec. 2, 2019. We thank the reviewers: Prof. Nathan Michael, Prof. Sami Haddadin and Prof. Roland Siegwart!
December 1, 2019
Kunal Shrivastava and Kevin Kleber receive 140k CHF from the SNSF BRIDGE and Venture Kick funds for translating their research into a product! Congratulations!
November 21, 2019
Congratulations to Henri Rebecq, who has successfully defended his PhD dissertation titled "Event Cameras, from SLAM to High Speed Video", on Nov. 18, 2019. We thank the reviewers: Prof. Andrew Davison, Prof. Tobi Delbruck and Prof. Bernt Schiele!
October 28, 2019
We are proud to announce that on November 5, 2019, Davide Scaramuzza will deliver a keynote talk at IROS 2019 in Macau.
October 28, 2019
EKLT, our event-based feature tracking method, is now available open source. By leveraging the complementarity of event cameras and standard cameras, EKLT achieves unprecedented tracking accuracy with high temporal resolution. https://github.com/uzh-rpg/rpg_eklt
October 15, 2019
Davide Scaramuzza and Giuseppe Loianno organize the 5th IROS workshop on vision-based drones! We have a great line of speakers from academia and industry! Workshop webpage.
October 12, 2019
Oculus Zurich (formerly Zurich Eye) is soon expanding to 200 employees (currently 80)! Very proud of you guys! Article in the Handelszeitung.
September 4, 2019
Our foldable drone, the first quadrotor that can change its shape and size in flight, is the winner of the Aerospace and Defense category in the 2019 NASA Tech Briefs "Create the Future" contest. Check out the winners list.
September 4, 2019
We are proud to announce that Prof. Davide Scaramuzza has been elected to the grade of Senior Member of the IEEE. Congratulations!
September 2, 2019
We congratulate our former postdoc Guillermo Gallego, who today starts as an Associate Professor at TU Berlin.
Guillermo worked on event-based algorithms. His major contributions are the release of the first event-camera dataset, which has become a standard tool in the computer vision community, and a method called "focus maximization", which solves multiple computer vision and machine learning problems with event cameras. He was also the main author of the survey paper on event-based vision. Guillermo's personal homepage.
August 20, 2019
Davide Scaramuzza will deliver a keynote speech at the international conference on Field and Service Robotics, in Tokyo, on September 31. More information here.
August 26, 2019
Learn how Zurich-Eye, co-founded by former RPG members Christian Forster, Matia Pizzoli, and Manuel Werlberger, contributed to the newly announced Oculus Insight and Oculus Quest (link)!
August 20, 2019
Our recent work on interest points, which focuses on minimal representations for relative pose estimation, is now available open source.
SIPs achieves a high matching score at low point counts but uses existing descriptors for matching: https://github.com/uzh-rpg/sips2_open
IMIPs instead provides a set of points that implicitly match between views, without the need for descriptors: https://github.com/uzh-rpg/imips_open
August 13, 2019
Our foldable drone was featured in a documentary by BBC News Arabic. Check out the video here.
July 29, 2019
The competition will be held jointly with the 5th edition of the IROS 2019 Workshop on "Challenges in Vision-based Drone Navigation", which will take place on November 8, 2019, in Macau.
The participants will run their VIO algorithms on datasets (including images, IMU measurements, and event data) recorded with an FPV drone racing quadrotor flown by an expert pilot at speeds up to and over 20 m/s. More information here.
July 29, 2019
The Swiss National Centre of Competence in Research (NCCR) Robotics published a joint paper on the current state and future outlook of rescue robotics in the Journal of Field Robotics. Paper: PDF.
July 29, 2019
Our lab's work towards the AlphaPilot autonomous drone racing competition was featured in the Neue Zürcher Zeitung (NZZ). Check out the article here for more details (in German only).
July 9, 2019
On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) organized the 2nd International Workshop on Event-based Vision and Smart Cameras at CVPR, Long Beach.
July 8, 2019
Manasi Muglikar, PhD student in our lab, won the ETH Robotics Summer School Robot Competition. Congratulations!
July 8, 2019
Game of Drones is a NeurIPS 2019 competition with the goal of pushing the boundary of building competitive autonomous systems through head-to-head drone races. Check out the official webpage for further details.
June 10, 2019
On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) are organizing the 2nd International Workshop on Event-based Vision and Smart Cameras at CVPR in Long Beach.
Check out the schedule, accepted papers and live demos for this full-day workshop. We will have top speakers from both academia and industry (Samsung, Intel, Prophesee, iniVation, Insightness, CelePixel).
June 10, 2019
We are proud to announce that our lab is one of the nine teams that were accepted into the 2019 AlphaPilot Innovation Challenge, where we will compete to design an AI framework capable of piloting racing drones through high-speed aerial courses without any GPS, data relay or human intervention. The competition has a $1 million cash prize, sponsored by Lockheed Martin. Check out the official press release for further information.
June 5, 2019
Check out Prof. Davide Scaramuzza's interview with IEEE Spectrum about the Alphapilot competition, why drone racing matters for robotic research, and our recently released UZH FPV Drone Racing Dataset.
May 24, 2019
Zichao Zhang, PhD student in our lab, received the Best Paper Award at the ICRA 2019 Workshop on SLAM Benchmarking in Montreal on May 24 with his paper titled "Rethinking Trajectory Evaluation for SLAM: a Probabilistic, Continuous-Time Approach"! Congratulations!
May 22, 2019
Our paper UltimateSLAM received the IEEE Robotics and Automation Letters 2018 Best Paper Award Honourable Mention during the award session at the ICRA 2019 conference in Montreal. It ranked in the top 3 out of 520 papers published by RAL in 2018. Read the paper here and watch the video here for more details.
May 21, 2019
We are happy to announce the release of the UZH-FPV Drone Racing Dataset. It contains over 30 sequences of data from event cameras, standard cameras, IMU, and ground truth, recorded by an FPV drone flown at up to over 20 m/s by professional drone pilots in real-world scenarios! Check out the official web page for more details.
May 13, 2019
Our latest work on quadrotor flight with event cameras was featured on IEEE Spectrum. For more details, read the paper here and watch the video here.
May 13, 2019
In collaboration with researchers at the Digital Circuits and Systems lab at ETH, we have devised a nano-drone (a few centimeters in diameter) that can navigate in indoor environments with only onboard sensing and computing. Video, Paper, Code.
May 10, 2019
We welcome Dr. Dimche Kostadinov as new Postdoc and Thomas Längle as new Drone Engineer in our lab!
May 8, 2019
Daniel Gehrig, former Master student and current PhD student in our lab, won the ETH Medal for his outstanding Master thesis! Congratulations! Check out his ECCV'18 paper here, which is based on his Master thesis.
May 8, 2019
Our paper "Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios" was nominated finalist for the 2018 IEEE Robotics and Automation Letters Best Paper Award. Our paper is in the top 3 out of 520 papers published by RAL in 2018. Read the paper here and watch the video here.
May 7, 2019
Our paper "How Fast is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid" has been accepted for publication in the Robotics and Automation Letters (RA-L) 2019. We analyze the role of perception latency, sensing range and actuation limitatations on the maximum speed a robot can reach to safely navigate in an unknown environment. Our analysis is supported by experimental evaluation, where a quadrotor equipped with an event camera is able to avoid an obstacle moving towards it at 10 m/s. Read the paper here and watch the video here.
April 23, 2019
Guillermo Gallego, Davide Scaramuzza and 10 other international experts wrote a joint, 25-page-long survey paper on event-based cameras, from their working principle to algorithms and applications. Read the paper here.
April 09, 2019
Our work on the foldable drone, the first quadrotor able to change morphology in flight to adapt its shape and size to different tasks, won the Drone Hero Award Contest 2019 for the category Innovative Drone. Read the paper here and watch the video here.
March 26, 2019
Our research on autonomous drone racing was featured on The New York Times: "A drone from the University of Zurich is an engineering and technical marvel...". Check out the article!
March 15, 2019
Call for papers and demos! On June 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG), and Kostas Daniilidis (UPenn) will organize the 2nd Workshop on Event-based Vision and Smart Cameras at CVPR in Long Beach.
Check out our speakers lineup.
March 11, 2019
Our lab received impressive media coverage during 2018, reaching more than 200 million readers across the world.
February 25, 2019
We release a framework to evaluate feature tracking for an event camera. The code provided is implemented in Python and produces paper-ready plots and videos for event-based feature tracks. Paper, YouTube, Code on Github.
January 29, 2019
Our paper titled "The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly" received great attention from the media. It was covered by several newspaper and magazines, among which: TechCrunch, The Verge, CNET, La Repubblica, Tages Anzeiger, Popular Mechanics and IEEE Spectrum. Read the paper here and watch the video here.
January 18, 2019
On June 16 and 17, 2019, Davide Scaramuzza (RPG), Guillermo Gallego (RPG) and Kostas Daniilidis (UPenn) will organize a workshop at CVPR in Long Beach about Event-based Vision.
Check out the speakers lineup on the workshop website.
December 18, 2018
We release the code for Event-based Multi-View Stereo (EMVS): 3D reconstruction with an event camera. The code provided is implemented in C++ and produces accurate, semi-dense depth maps without requiring any explicit data association or intensity estimation. The code runs in real-time on a CPU. Paper, YouTube, Code on Github.
December 13, 2018
Our paper The Foldable Drone: A Morphing Quadrotor that can Squeeze and Fly has been accepted for publication in the Robotics and Automation Letters. Read the paper here and watch the video here.
December 6, 2018
We are excited to announce that our RPG control framework gets a new addition: our Perception-Aware Model Predictive Control (PAMPC) is open source, and the source code is available here. PAMPC combines control and planning in one solution and allows the robot not only to achieve an action objective but also to trade it off against a perception objective. Further details are available in our paper presented at IROS 2018. Video.
November 15, 2018
We have several openings at both PhD and Postdoctoral levels in Robotics, Machine learning, Reinforcement Learning, Control, Computer Vision, Event Cameras, and beyond. Info and how to apply here.
November 3, 2018
We release ESIM, our new event camera simulator. ESIM can simulate events accurately and efficiently, as well as other sensors such as a conventional camera (including motion blur!) and an inertial measurement unit (IMU). ESIM readily provides ground-truth depth and optic flow maps. Multiple rendering engines are available, including a photorealistic rendering engine based on Unreal Engine and a fast 3D engine based on OpenGL that can simulate events in real-time. Paper, YouTube, Project Page, Code on Github.
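To illustrate the core principle of event simulation, here is a toy sketch that generates events from two consecutive grayscale frames by thresholding log-intensity changes; ESIM itself adaptively samples the rendering engine and interpolates per-pixel event timestamps, which this simplification omits.

```python
import numpy as np

def events_from_frames(frame0, frame1, t0, t1, C=0.15):
    """Toy event generation: emit events wherever the log-intensity change
    between the two frames crosses the contrast threshold C. All events are
    assigned the midpoint time, a crude stand-in for proper interpolation."""
    logL0 = np.log(frame0.astype(np.float64) + 1e-3)
    logL1 = np.log(frame1.astype(np.float64) + 1e-3)
    diff = logL1 - logL0
    n = np.floor(np.abs(diff) / C).astype(int)  # number of events per pixel
    ys, xs = np.nonzero(n)
    events = [(x, y, 0.5 * (t0 + t1), int(np.sign(diff[y, x])))
              for y, x in zip(ys, xs) for _ in range(n[y, x])]
    return events  # list of (x, y, t, polarity)
```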
November 1, 2018
Our paper Deep Drone Racing: Learning Agile Flight in Dynamic Environments won the Best Systems Paper Award at the Conference on Robot Learning (CoRL) 2018!
October 16, 2018
Upon popular request, we decided to release on arXiv the PDF of the paper describing the approach with which we won the IROS 2018 Autonomous Drone Race. Our approach fuses deep learning and optimal control to achieve the ultimate flight performance. For this reason, we titled the paper Beauty and the Beast. Who is the Beauty and who is the Beast? ;-) Paper, YouTube.
October 3, 2018
We are proud to announce that our team won the IROS Autonomous Drone Race Competition, passing all 8 gates in just 30 seconds! In order to succeed, we combined deep networks, local VIO, Kalman filtering, and optimal control. Watch our performance here.
September 26, 2018
Mark Zuckerberg just announced the new Oculus VR headset, called Oculus Quest. This is what our former lab startup Zurich Eye, now Oculus Zurich, has been working on for the past two years. Watch the video.
September 21, 2018
On October 5, 2018, Giuseppe Loianno (New York University), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize a workshop at IROS in Madrid about "Vision-based Drones: What's Next?". Check out the speakers lineup on the workshop website.
September 21, 2018
Our lab will participate in the IROS 2018 Autonomous Drone Race in Madrid. Further details are available here.
September 2, 2018
We performed a live quadrotor demo at the Zurich Kunsthalle during the Lange Nacht der Zürcher Museen, as part of the 100 Ways of Thinking show, in front of more than 200 people. Check out the media coverage here.
August 30, 2018
Our research was featured in the Neue Zürcher Zeitung. Check out the article here.
August 28, 2018
Our lab received great Swiss media attention (NZZ, SwissInfo, SRF) for our live flight demonstration of a quadrotor entering a collapsed building to simulate a search and rescue operation. Check out the video here.
August 7, 2018
Our live demo of a quadrotor entering a collapsed building through a narrow gap was featured on the website of the Swiss Federal Office for Defence Procurement (armasuisse). More details are available here.
July 5, 2018
Facebook-Oculus Zurich, formerly Zurich-Eye, keeps expanding in Zurich: 35 employees and growing at a rate of 3 new people per month. RPG is very proud of them! More info here.
July 25, 2018
Our papers on Asynchronous feature tracking using events and frames and on Stereo 3D reconstruction for SLAM have been accepted at ECCV 2018 in Munich! Check out our research page on event-based vision.
July 10, 2018
Our research on autonomous drone racing was featured on NewScientist. Check out the article here.
June 13, 2018
Our paper about safe quadrotor navigation computing forward reachable sets was accepted for publication in the Robotics and Automation Letters (RA-L) 2018. Check out the PDF.
June 11, 2018
Our paper about drone racing was accepted to RSS 2018 in Pittsburgh! Check out the long version, short version and the video!
June 10, 2018
Our paper on Continuous-Time Visual-Inertial Odometry for Event Cameras has been accepted for publication in the IEEE Transactions on Robotics. Check out the paper.
June 1, 2018
We welcome Dr. Dario Brescianini as new Postdoc in our lab!
May 28, 2018
Our paper on IMU pre-integration received the 2017 IEEE Transactions on Robotics (TRO) Best Paper Award at ICRA 2018 in Brisbane, Australia. Check out the paper here. Press coverage!
May 14, 2018
We are proud to announce that our paper on IMU pre-integration will receive the 2017 IEEE Transactions on Robotics (TRO) best paper award. On this occasion, IEEE made the article open access for the next ten years! Press coverage
May 11, 2018
Our papers on A unifying contrast maximization framework for event cameras and on Steering angle prediction for self-driving cars with event cameras have been accepted at CVPR 2018 in Salt Lake City! Check out our research page on event-based vision.
May 10, 2018
Henri Rebecq, a PhD student in our lab, won a Qualcomm Innovation Fellowship with his proposal "Learning Representations for Low-Latency Perception with Frame and Event-based Cameras"!
April 26, 2018
We are happy to announce a Python/Tensorflow port of the FULL NetVLAD network, approved by the original authors and available here (see also our software/datasets page). The repository contains code that allows plug-and-play Python deployment of the best off-the-shelf model made available by the authors. We have thoroughly tested that the ported model produces output similar to the original Matlab implementation, as well as excellent place recognition performance on KITTI 00.
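As a rough illustration of how a NetVLAD-style global descriptor is used for place recognition, here is a self-contained sketch. `describe` is only a stand-in for the network's forward pass, and none of these names are the repository's actual API.

```python
# Illustrative place recognition with L2-normalized global descriptors.
import numpy as np

def describe(image):
    # Stand-in for the NetVLAD forward pass: image -> normalized descriptor.
    rng = np.random.default_rng(int(image.sum()))  # fake but deterministic
    d = rng.standard_normal(4096)
    return d / np.linalg.norm(d)

def best_match(query_desc, db_descs):
    # With unit-norm descriptors, maximizing the dot product is equivalent
    # to minimizing Euclidean distance.
    sims = db_descs @ query_desc
    return int(np.argmax(sims)), float(sims.max())

# Toy database standing in for a sequence of frames (e.g., KITTI 00).
db = np.stack([describe(np.full((8, 8), i, np.uint8)) for i in range(10)])
idx, score = best_match(describe(np.full((8, 8), 3, np.uint8)), db)
print(idx, score)  # -> 3, 1.0
```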
April 24, 2018
We provide the code accompanying our recent Decentralized Visual SLAM paper. The code contains a C++/Matlab simulation containing all building blocks for a state-of-the-art decentralized visual SLAM system. Check out the paper, the Video Pitch, the presentation and the code.
April 23, 2018
We provide the code of our FAST event-based corner detector. Our implementation is capable of processing millions of events per second on a single core (less than a micro-second per event) and reduces the event rate by a factor of 10 to 20. Check out our Paper, video, and code.
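To give an idea of how event-based corner tests of this kind work, below is a heavily simplified, single-circle sketch of a timestamp-based test on the Surface of Active Events (the per-pixel map of latest event timestamps). The released detector differs in its details (e.g., it checks two circles with tuned arc lengths); all names here are illustrative.

```python
# Illustrative single-circle event corner test on the Surface of Active
# Events (SAE): an event is a corner candidate if the newest timestamps
# on a circle around it form one contiguous arc.
import numpy as np

CIRCLE3 = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
           (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_corner(sae, x, y, arc_lengths=range(3, 7)):
    ts = np.array([sae[y + dy, x + dx] for dx, dy in CIRCLE3])
    n = len(ts)
    for arc_len in arc_lengths:
        for start in range(n):
            arc = [(start + i) % n for i in range(arc_len)]
            rest = [i for i in range(n) if i not in arc]
            # A contiguous arc strictly newer than the rest of the circle.
            if ts[arc].min() > ts[rest].max():
                return True
    return False

# Update the SAE with each incoming event (t, x, y), then test the event.
sae = np.zeros((180, 240))
for t, x, y in [(0.001, 120, 90), (0.002, 121, 90), (0.003, 120, 91)]:
    sae[y, x] = t
    print(t, is_corner(sae, x, y))
```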
April 23, 2018
We provide a complete framework for flying quadrotors based on control algorithms developed by the Robotics and Perception Group. We also provide an interface to the RotorS Gazebo plugins to use our algorithms in simulation. Check out our software page and the Github repository for more details.
March 14, 2018
Henri Rebecq, PhD student in our lab, was a finalist for the Qualcomm Innovation Fellowship.
March 14, 2018
Christian Forster, the first PhD student to graduate from RPG, ranked second in the 2017 edition of the Georges Giralt European PhD Thesis Award, out of 41 applications.
March 14, 2018
Our recent work on how to teach a drone to fly autonomously and safely in the streets of a city (PDF) received huge media coverage. Check out our media page.
March 07, 2018
Check out our publications page. Link from the PATENTSCOPE database.
March 02, 2018
RPG impresses Swiss minister Schneider-Ammann and other federal and cantonal representatives at the inauguration of the Innovation Park Switzerland. Check it out here.
January 23, 2018
Check out the video here.
January 23, 2018
We have devised a Deep Neural Network, called DroNet, that teaches a drone how to fly autonomously and safely in the streets of a city, among other vehicles, by imitating the behavior of cars and bicycles! Video, Paper, Datasets.
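In the paper, DroNet has two output heads, a steering angle and a collision probability, which are turned into low-pass-filtered velocity commands (forward speed shrinks as the predicted collision probability grows). The sketch below illustrates that mapping with purely illustrative constants; it is not the released controller.

```python
# Illustrative mapping from DroNet-style outputs to velocity commands.
def dronet_to_command(steering, p_collision, prev_v=0.0, prev_yaw=0.0,
                      v_max=1.0, alpha=0.7):
    # Forward speed is modulated by the predicted collision probability,
    # and both commands are low-pass filtered for smooth flight.
    v = (1.0 - alpha) * prev_v + alpha * (1.0 - p_collision) * v_max
    yaw_rate = (1.0 - alpha) * prev_yaw + alpha * steering
    return v, yaw_rate

# A confident "turn right, low collision risk" prediction.
print(dronet_to_command(steering=0.3, p_collision=0.1))  # (0.63, 0.21)
```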
January 1, 2018
We welcome Dr. Peng Lu as new Postdoc in our lab!
December 14, 2017
Our work on differential flatness of quadrotor dynamics subject to rotor drag has been accepted for publication in the Robotics and Automation Letters. Read the paper here and watch the video here.
December 14, 2017
Our group took part in the NCCR Robotics integrative demo of aerial and terrestrial robots for rescue missions. Check out the video here.
December 4, 2017
We welcome Mathias Gehrig as new PhD student in our lab!
November 24, 2017
In this clip, we summarize our main achievements, projects, awards, exhibitions, and upcoming videos! Watch our YouTube video!
November 7, 2017
Our work on 3D reconstruction with an event camera in real-time has been accepted for publication in the International Journal of Computer Vision. Read the paper here.
October 31, 2017
Our work on 6-DOF ego-motion estimation with an event camera has been accepted for publication in IEEE Trans. Pattern Analysis and Machine Intelligence. Check out the paper here.
October 11, 2017
Our recent work on decentralized visual place recognition from full-image descriptors was accepted to MRS 2017, the first international symposium on multi-robot and multi-agent systems!
Check out the paper here.
October 9, 2017
We ranked 2nd at the IROS 2017 Autonomous Drone Race in Vancouver. Check out the video of our performance here and the official website of the competition here.
October 9, 2017
Our paper "Rapid Exploration with Multi-Rotors: A Frontier Selection Method for High Speed Flight" was nominated finalist for the Best Paper Award on Safety Security and Rescue Robotics at IROS 2017 in Vancouver. Check out the paper here.
October 6, 2017
Check out the article here.
October 5, 2017
Our latest work on quadrotor flight with event cameras was featured on IEEE Spectrum. For more details, check our research page.
September 22, 2017
Davide Scaramuzza talks about autonomous, agile, vision-controlled drones and event cameras at the GeorgiaTech robotics seminar series: check out the video here.
September 20, 2017
This is the first ever autonomous flight with an event camera, which demonstrates agile maneuvers and flying in low-light environments. Possible applications could include supporting rescue teams with search missions at dusk or dawn. Check out the press release, our video and our paper.
September 18, 2017
Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the fourth international workshop on Vision-based High Speed Autonomous Navigation of UAVs.
August 30, 2017
Davide Scaramuzza was appointed Tenured Associate Professor with double affiliation with the Institute of Neuroinformatics of the University of Zurich and ETH Zurich. Check out the news here.
August 29, 2017
We have a new opening in our team for a Drone Research Engineer. See our open positions for more details.
August 29, 2017
We are happy to announce that our lab is now affiliated with the Institute of Neuroinformatics (INI), a joint institute belonging to the University of Zurich and ETH Zurich.
August 11, 2017
Philipp Foehn, PhD student in our lab, and Naveen Kuppuswamy, former visiting researcher, received the Best Student Paper Award Finalist prize at RSS 2017 in Boston for our work on trajectory optimization for agile quadrotor maneuvers with cable-suspended payloads. Check out the paper here and the video here.
July 14, 2017
We are happy to release the binaries of our Semi-Direct Visual Odometry, SVO 2.0. It can run at up to 400 frames per second on a modern laptop and execute in real-time on a smartphone processor. The binaries can be found here.
July 12, 2017
We are happy to start The List of Event-based Vision Resources, which contains links to event camera devices as well as papers, videos, code, etc. describing the algorithms and systems developed using this exciting technology. We hope the list will help newcomers to the field get started with this technology by directing them to the appropriate references. Help us improve the list by adding more entries! (Please follow the "Contributing" guidelines).
June 21, 2017
We are happy to announce the release of the code for recovering the brightness map that caused the events to be triggered. The code can be found here.
June 20, 2017
Our paper Fast Trajectory Optimization for Agile Quadrotor Maneuvers with a Cable-Suspended Payload, accepted for oral presentation at RSS'17, was nominated as a finalist for the Best Student Paper Award! Check out the paper here and the video here.
June 15, 2017
The slides and videos of the talks of the International Workshop on Event-based Vision are now available at the workshop website and the RPG Workshops Youtube channel.
June 15, 2017
We are happy to announce the release of the code for event lifetime. The lifetime of an event is the time that it takes for the moving brightness gradient causing the event to travel a distance of 1 pixel. More details in the following ICRA publication. The code can be found here.
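The stated definition translates directly into code: if the brightness gradient that triggered an event moves with optical flow v (in pixels per second), the event's lifetime is the time it takes to travel one pixel. A minimal sketch follows; note that the released code also estimates the flow itself from the event stream, which is the hard part.

```python
# Lifetime of an event given the local optical flow, per the definition
# above: the time for the moving gradient to travel one pixel.
import math

def event_lifetime(vx, vy):
    speed = math.hypot(vx, vy)  # flow magnitude in pixels/second
    if speed == 0.0:
        return float("inf")     # a static gradient triggers no new events
    return 1.0 / speed

print(event_lifetime(200.0, 0.0))  # 200 px/s -> 5 ms lifetime
```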
May 12, 2017
Prof. Davide Scaramuzza gave a talk at the Maker Festival in Terni about the role of computer vision in autonomous cars and drones. Check the news here (Italian).
May 5, 2017
The Robotics and Perception Group wins the 2017 Misha Mahowald Prize, which recognizes outstanding achievement in the field of neuromorphic engineering. Check out the press release here. Watch our summary video on Event-based Vision for Autonomous High-Speed Robotics.
April 3, 2017
Our recent work comparing volumetric information gain metrics for object reconstruction is part of the Autonomous Robots special issue on Active Vision.
Check out the paper here.
April 20, 2017
Antonio Loquercio, PhD student in our lab, won the ETH Medal for his outstanding Master thesis! Congratulations!
April 4, 2017
We are happy to announce the release of the first public, large-scale dataset recorded with a drone in an urban environment at low altitudes (5-15m). The 2 km dataset consists of time synchronized aerial high-resolution images, GPS and IMU sensor data, ground-level street view images, and ground truth data. The dataset is ideal to evaluate and benchmark appearance-based localization, monocular visual odometry, simultaneous localization and mapping (SLAM), and online 3D reconstruction algorithms for MAVs in urban environments. Go to the dataset webpage.
April 3, 2017
Our paper on Collaborative Transport with Multiple MAVs appeared on Discovery Channel Canada. Check out the paper here.
March 31, 2017
We welcome Dr. Suseong Kim, as new Postdoc, and Philipp Foehn, as new PhD student, in our lab!
March 24, 2017
Fotokite, the Swiss startup developing tethered drones, which was incubated within RPG in 2014 through the NCCR Spin Fund, won the 2017 EUrobotics Tech Transfer Award! Congratulations! We are very proud of you!
February 24, 2017
Davide Scaramuzza's seminar on visual-inertial state estimation, active vision, and event-based vision at CMU was featured on IEEE Spectrum. Watch the video on YouTube.
February 17, 2017
Davide Scaramuzza talks about visual-inertial state estimation, active vision, and event-based vision at CMU Robotics Institute Seminar Series: YouTube, Abstract.
February 16, 2017
Our research on autonomous drones was featured in the University of Zurich Journal, ranking 3rd on the list of the most popular news releases of 2016. Check out the UZH Journal on Page 5.
February 15, 2017
Dacuda's 3D Division, a long-standing collaborator of RPG, was acquired by Magic Leap, the unicorn of Augmented Reality (news here). One amazing result of our collaboration was software that runs on a smartphone and delivers an immersive virtual-reality experience on par with high-end VR headsets. This software was the result of a great project with RPG, which was demoed at CES 2017 (LINK). Congratulations, guys!
February 15, 2017
Davide Scaramuzza, Andrea Censi (MIT), and Guillermo Gallego (RPG-UZH) are organizing the first International Workshop on Event-based Vision.
February 6, 2017
For info and applications, please see here.
February 1, 2017
Welcome to Antonio Loquercio as a new PhD student in our lab!
January 9, 2017
Our recent work on decentralized visual place recognition using a distributed inverted index was accepted to RA-L!
Check out the paper here.
December 20, 2016
Our recent work on real-time parallel tracking and mapping with an event camera was accepted to RA-L!
Check out the video here, and the paper here.
December 20, 2016
Our recent work on motion estimation with an event camera by contrast maximization was accepted to RA-L!
Check out the video here, and the paper here.
December 14, 2016
Our recent work on quadrotor flight through narrow gaps using only onboard sensing and computing is featured on MIT Technology Review. Click here to read the article.
December 3, 2016
Our recent work "Aggressive Quadrotor Flight through Narrow Gaps witn Onboard Sensing and Computing" is available on Arxiv for download. [Link]
November 19, 2016
Our lab was featured in the 2016 World Robotics report of the International Federation of Robotics as an outstanding example of a research lab in service robotics. Check out the report here.
November 14, 2016
We have several open PhD student and Postdoc positions in Deep Learning, Control, and Robot Vision for Agile, Vision-based Quadrotor Flight. For more info and applications, please see here.
November 11, 2016
Zurich-Eye, the Wyss-Zurich project co-founded in Sep. 2015 by former RPG members Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger, gets featured in the Swiss news.
October 27, 2016
We are happy to announce the release of the first public datasets recorded with an event camera (DAVIS) for pose estimation, visual odometry, and SLAM applications! The data also include intensity images, inertial measurements, ground truth from a motion-capture system, synthetic data, as well as an event camera simulator! We believe that event cameras will allow future robots to move faster and more agilely. Find out more on the dataset website!
October 24, 2016
Our former student and current Research Assistant Timo Horstschäfer won the Fritz Kutter Award for Industry Related Thesis in Computer Science with his Master Thesis "Parallel Tracking, Depth Estimation, and Image Reconstruction with an Event Camera". This is the second time in two years that an RPG master student wins this prestigious award! Congratulations!
October 14, 2016
Our paper Low-Latency Visual Odometry using Event-based Feature Tracks was nominated as Finalist for the Best Application Paper Award at IROS 2016. Also, it was selected as highlight oral talk, with an acceptance rate of 0.4%. Congratulations to Beat Kueng, Elias Mueggler and Guillermo Gallego!
October 14, 2016
Our third international workshop on Vision-based High Speed Autonomous Navigation of UAVs, co-organized by Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn), featured an impressive lineup of renowned speakers, live demos, and industry participants, and attracted more than 200 people worldwide!
October 5, 2016
Our latest work on quadrotor flight through narrow gaps was featured on IEEE Spectrum and Robohub. For more details, check our research page.
October 3, 2016
We welcome Rubén Gómez Ojeda from University of Málaga as new visiting PhD student in our lab!
September 27, 2016
Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing: LINK
September 22, 2016
Our paper EMVS: Event-based Multi-View Stereo receives the BMVC'16 Best Industry Paper Award! Congratulations to Henri Rebecq and Guillermo Gallego!
September 14, 2016
Zurich-Eye, the Wyss-Zurich project co-founded in Sep. 2015 by former RPG members Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger, is now part of Oculus VR Zurich. RPG is very proud of them! This highlights the importance and impact of the great work they have done!
September 12, 2016
Our paper "EMVS: Event-based Multi-View Stereo" about monocular
3D reconstruction using an
event camera has been accepted for oral presentation at BMVC'16!
Check out our publication list.
September 2, 2016
Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the third international workshop on Vision-based High Speed Autonomous Navigation of UAVs.
September 1, 2016
Welcome to Alessandro Simovic as a new drone engineer in our lab!
September 1, 2016
Welcome to Stefano Ghidoni from the University of Padua as a visiting assistant professor in our lab!
August 19, 2016
RPG and IDSIA have collaborated on a new paper, which will be presented at ISER 2016. We show that we can train a terrain classifier for search and rescue scenarios while our quadrotor is in flight, in only one minute! Results can be seen in this YouTube video, and details can be found in our publication list.
August 10, 2016
Cesar Cadena, Luca Carlone, Henry Carrillo, Yasir Latif, Davide Scaramuzza, Jose Neira, Ian Reid, and John Leonard have co-authored a paper on Simultaneous Localization And Mapping: Present, Future, and the Robust-Perception Age. Check it out!
July 25, 2016
Zichao Zhang and Titus Cieslewski were in the reading group that won the reading group competition at the International Computer Vision Summer School. One of the main ingredients for this success was an experimental literature visualization tool developed at the RPG group retreat.
July 20, 2016
We have an open position in our team for a Drone Research Engineer. Check out our open positions.
May 25, 2016
The software package corresponding to the paper An Information Gain Formulation for Active Volumetric 3D Reconstruction is now available from our Github page. This is a general, open-source framework for volumetric reconstruction that is object-, sensor-, and robot-agnostic. Some results can be seen in this YouTube video.
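To convey the flavor of volumetric information gain, here is a hedged sketch of the simplest entropy-based variant: a candidate view is scored by accumulating the occupancy entropy of the voxels its rays would observe, stopping at the first likely-occupied voxel. The released framework implements several, more refined gain formulations; all names here are illustrative.

```python
# Illustrative entropy-based information gain for next-best-view selection.
import math

def voxel_entropy(p_occ):
    # Shannon entropy of a voxel's occupancy probability (max at p = 0.5).
    if p_occ <= 0.0 or p_occ >= 1.0:
        return 0.0
    return -(p_occ * math.log2(p_occ) + (1 - p_occ) * math.log2(1 - p_occ))

def view_gain(ray_voxel_probs):
    gain = 0.0
    for ray in ray_voxel_probs:          # each ray: occupancy probs along it
        for p in ray:
            gain += voxel_entropy(p)
            if p > 0.9:                  # occluded: voxels behind are unseen
                break
    return gain

# A view into unknown space scores higher than one into mapped space.
unknown_view = [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]]
known_view = [[0.05, 0.05, 0.95], [0.05, 0.05, 0.05]]
print(view_gain(unknown_view) > view_gain(known_view))  # True
```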
May 23, 2016
Welcome to Dr. Naveen Kuppuswamy, our new visiting researcher from Toyota Research Institute Boston (MA)!
May 10, 2016
Elias Mueggler, a PhD student in our lab, won a Qualcomm Innovation Fellowship with his proposal "Event-based Vision for High-Speed Robotics"!
February 12, 2016
We received huge media coverage for our research on autonomous drone navigation in the forests using Deep Neural Network; among these, Discovery Channel Canada and NBC News.
February 11, 2016
Our recent work on autonomous navigation in the forests using Deep Neural Networks makes it to IEEE Spectrum and Robohub.
February 10, 2016
Our recent work on autonomous navigation in the forests using Deep Neural Networks makes it to the Swiss National TV News channel: SRF Tagesschau.
February 10, 2016
This research appeared in the IEEE Robotics and Automation Letters, will be presented at the IEEE International Conference on Robotics and Automation (ICRA'16), and is nominated for the AAAI Best Video Award. Journal paper. More info. YouTube video.
January 15, 2016
Davide Scaramuzza was appointed Associate Faculty at Wyss Zurich, the new translational center of UZH and ETH Zurich dedicated to regenerative and robotics technologies.
January 14, 2016
Titus Cieslewski, a PhD student in our lab, has received a prize from Homegate during the HackZurich hackathon, for the project Wonsch. He and his team were among the top 3 of the 20 teams that participated in the Homegate challenge during the hackathon.
January 1, 2016
Welcome to Titus Cieslewski as a new PhD student in our lab!
January 1, 2016
NCCR Robotics interviews Davide Scaramuzza about his academic path, from working as a magician in theaters and public squares to pay for his undergraduate studies, to becoming a robotics professor (LINK).
November 4, 2015
NCCR Robotics organized the Swiss Robotics Industry day at EPFL Lausanne. We showed the collaboration of a flying robot with a legged robot from ETH Zurich. Some highlights are shown in a video by Le Matin.
October 31, 2015
An interview with Davide Scaramuzza was published on Robots Podcast and Robohub. Check out our media page!
October 2, 2015
Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the third international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs at IROS'15.
October 2, 2015
Our second international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs, co-organized by Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn), featured an impressive lineup of renowned speakers, live demos, and industry participants, and attracted more than 200 people worldwide!
October 2, 2015
Davide Scaramuzza gives a tutorial on event-based vision at the IROS'15 workshop on Alternative Sensing for Robot Perception: Beyond Laser and Vision. The slides can be downloaded from here.
September 15, 2015
Welcome to Michael Gassner as a new research assistant in our lab!
September 5 and 6, 2015
The Robotics and Perception Group showed their research at Scientifica, the science fair of ETH and University of Zurich. 25,000 visitors attended the event. Have a look at the gallery!
September 2, 2015
Giuseppe Loianno (UPenn), Davide Scaramuzza (RPG), and Vijay Kumar (UPenn) will organize the second international workshop on Vision-based Control and Navigation of Small, Lightweight UAVs at IROS'15.
September 1, 2015
Christian Forster (author of SVO), Matia Pizzoli (author of REMODE), and Manuel Werlberger create Zurich-Eye, a spinoff project of Wyss-Zurich dedicated to the commercialization of visual-inertial navigation solutions.
July 13, 2015
Christian Forster's paper is a Best Paper Award Finalist at RSS'15!
June 15, 2015
Welcome to Henri Rebecq as a new PhD student in our lab!
May 30, 2015
Andrea Censi (MIT) and Davide Scaramuzza organized a workshop on Innovative Sensing for Robotics at ICRA'15!
April 14, 2015
Our latest work on failure recovery from aggressive flight was featured on IEEE Spectrum. For more details, see the ICRA'15 paper and the accompanying video.
April 14, 2015
Welcome to Davide Falanga as a new PhD student in our lab!
March 20, 2015
RPG showcased its autonomous quadrotors and live 3D reconstruction at this year's CeBIT, the world's largest computer expo! See some pictures in our gallery.
February 1, 2015
In this clip, we summarize our main achievements, projects, awards, exhibitions, and upcoming videos! Watch our YouTube video!
November 21, 2014
Davide Scaramuzza wins an ERC Starting Grant through the Swiss National Science Foundation.
October 29, 2014
Our SSRR'14 paper on "Aerial-guided Navigation of a Ground Robot among Movable Obstacles" was selected as Finalist for the Best Paper Award. Our ICRA'14 paper "REMODE: Probabilistic, Monocular Dense Reconstruction in Real Time" was nominated as Finalist for the NCCR Best PostDoc Paper Award.
October 21, 2014
Our former Master student Basil Huber won the 2014 Fritz Kutter Award for Industry Related Thesis in Computer Science. His thesis was on High-Speed Pose Estimation using a Dynamic Vision Sensor. Congratulations!
October 7, 2014
Our latest work on event-based vision was featured on IEEE Spectrum. For more details, see the IROS'14 paper and the accompanying video.
October 1, 2014
We welcome Dr. Manuel Werlberger as new Postdoc in our lab! We are also happy to host Junije Zhang and Zichao Zhang as visiting PhD students.
September 6, 2014
We will demonstrate autonomous, vision-based flight and live dense 3D mapping with a quadrotor MAV at the ECCV workshop on Computer Vision in Vehicle Technology on Sep. 6 at 5:30pm. Watch here the video preview.
August 1, 2014
We welcome Dr. Guillermo Gallego and Dr. Jeff Delmerico as new Postdocs in our lab! We are also happy to host Antonio Toma, Gabriele Costante, Nathaly Gasparin, Ra'Eesah Mangera, Kumar Shaurya Shankar and Xin Yu as visiting students this year.
June 4, 2014
Davide Scaramuzza wins the 2014 IEEE Robotics and Automation Society Early Career Award "for his major contributions to robot vision and visually-guided micro aerial vehicles".
June 4, 2014
The Robotics and Perception Group wins the KUKA Innovation Award (20,000 EUR) with its demonstration of collaboration of flying and ground robots for search-and-rescue missions. Watch a video of the demo at AUTOMATICA.
June 2, 2014
The software corresponding to the paper SVO: Fast Semi-direct Monocular Visual Odometry can now be downloaded from our Github page. The source code is released under a GPLv3 license. A professional edition license for closed-source projects is also available.
June 1, 2014
RPG appears among the 12 best European robotics success stories advertised by the EU Commission. Read the full article.
May 30, 2014
Our latest work on event-based vision was featured in the MIT News. For more details, see the ICRA'14 paper.
February 18, 2014
Many thanks to Google!
December 2, 2013
Daniel Buchmüller, co-founder and software engineer at Amazon Prime Air, visited us.
December 2, 2013
The Robotics and Perception Group was featured in the news programme 10vor10 of the Swiss National TV (SRF). Check the video!
November 26, 2013
Benjamin Keiser won the KUKA Best Student Project Award 2013 with his Master thesis Torque Control of a KUKA youBot Arm that he did with the Robotics and Perception Group. A demonstration of the capabilities of his controller is shown in this video.
November 20, 2013
The Robotics and Perception Group was featured in a documentary on the German-French TV channel ARTE. Check both the German and French version!
November 19, 2013
Henri Seydoux, CEO and founder of Parrot, the company making the popular toy quadrocopter AR.Drone, visited us.
November 6, 2013
Open position at the Robotics and Perception Group. Check here.
November 5, 2013
We are organizing the first international workshop on Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments, featuring amazing live flight demonstrations!
November 3, 2013
We will be presenting papers in sessions Localization II (Nov. 4) and Unmanned Aerial Vehicles IV (Nov. 5).
August 31, 2013
The report can be found here.
May 4, 2013
RPG will give two talks at ICRA'13: one by Christian Forster (ThDInt.14) and one by Davide Scaramuzza (link)
April 29, 2013
A documentary by the Swiss TV about our autonomous helicopters for search and rescue (link to original article).
April 20, 2013
The Swiss Robotics Festival is the largest Swiss robotics exhibition, which this year attracted more than 20,000 participants. Watch some impressions from our demonstrations! (link)
April 1, 2013
Welcome to Flavio Fontana as a new PhD student in our lab!
March 31, 2013
Our Easter video featuring a ground and an aerial robot was featured in IEEE Spectrum News and Gizmodo, the famous technology review blogs. Read the articles: IEEE Spectrum and Gizmodo.
February 28, 2013
Weltwoche, a Swiss weekly magazine, talks about drones in daily life. Read the article: "Der Spion von deinem Fenster" ("The spy from your window"; German only).
December 1, 2012
Welcome to Chiara Troiani from INRIA, Volker Grabe from Max Planck Institute, and Damiano Verda from University of Genoa, who will join us for six months.
December 1, 2012
Welcome to Matthias Faessler and Elias Mueggler as new PhD students in our lab!
December 19, 2012
UZH and ETH students are welcome to apply for a project at our lab. — More information
November 1, 2013
We welcome Dr. Matia Pizzoli and Dr. Andras Majdik, our two new postdocs, and to Yanhua Jiang and Volker Grabe, our new visiting PhD students!
September 16, 2013
Welcome to Dr. Andrea Censi, our new visiting postdoc for three months!
September 1, 2012
Prof. Davide Scaramuzza has been invited to give a talk at TEDxZurich on October 25, 2012.
July 7, 2012
Join the talk about Christian Forster's work on Collaborative Visual SLAM with Multiple MAVs at the RSS workshop on integration of perception with control and navigation for resource-limited, highly dynamic, autonomous systems.
May 17, 2012
Congratulations! Download the press release.
May 1, 2012
We welcome Christian Forster as new PhD student in our lab!
April 23, 2012
The European project sFly, coordinated by Davide Scaramuzza, gets a lot of media attention and is featured on IEEE News. Read it here.
June 11, 2020
Watch our drone fly very agile acrobatic maneuvers! Read our Deep Drone Acrobatics paper for further details.
May 25, 2020
Watch our performance scoring second place at the AlphaPilot Challenge! Read our RSS 2020 paper for further details.
March 18, 2020
Watch our drone play dodgeball using an event camera! Read our Science Robotics paper for further details.
Jan 14, 2020
Watch how an event camera is used to reconstruct video at arbitrary frame rates and thus observe fast phenomena. Video reconstruction is done using a recurrent neural network trained only in simulation! Read our T-PAMI paper for further details.
Oct 7, 2019
Watch how we taught a drone to autonomously race a track it had never seen before! The system runs fully onboard and is powered by a neural network trained in a non-photorealistic simulator and deployed on the real drone without any fine-tuning! Read our T-RO paper for further details.
Oct 1, 2019
Watch this 27-gram nano drone avoid obstacles using a neural network running on a 63 mW Parallel Ultra-Low-Power processor! Read our IEEE IoT journal paper for further details.
May 7, 2019
Watch an autonomous drone dodge a ball thrown at it at 10 m/s. An event-based camera is used to detect the ball with millisecond latency. Read the paper for further details.
December 13, 2018
We present the first foldable drone that can guarantee stable flight in any configuration. It can fold to squeeze through narrow gaps. Read the paper for further details.
December 6, 2018
We release the code of Perception-Aware Model Predictive Control (PAMPC). PAMPC allows drones to navigate along a trajectory while maintaining visibility of a point of interest (a gate, a gap, texture). Paper.
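To illustrate the kind of perception objective PAMPC adds on top of a standard tracking cost, here is a hedged sketch that penalizes how far the point of interest projects from the image center under a pinhole model. The released solver formulates this inside an optimal-control problem; the names and weights below are illustrative only.

```python
# Illustrative perception cost: squared pixel distance of a 3D point of
# interest from the image center, given the camera pose.
import numpy as np

def perception_cost(p_world, T_wc, K, w=1.0):
    R, t = T_wc[:3, :3], T_wc[:3, 3]
    p_cam = R.T @ (p_world - t)           # world -> camera frame
    if p_cam[2] <= 0:
        return float("inf")               # point behind the camera
    uv = (K @ (p_cam / p_cam[2]))[:2]     # pinhole projection
    center = np.array([K[0, 2], K[1, 2]])
    return w * float(np.sum((uv - center) ** 2))

# A point straight ahead projects to the image center: zero cost.
K = np.array([[300.0, 0.0, 160.0], [0.0, 300.0, 120.0], [0.0, 0.0, 1.0]])
print(perception_cost(np.array([0.0, 0.0, 2.0]), np.eye(4), K))  # 0.0
```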
October 16, 2018
We combined deep networks, local VIO, Kalman filtering, and optimal control to achieve ultimate speed for autonomous drone racing. Paper describing the approach.
October 3, 2018
Our performance at the IROS'18 Autonomous Drone Race Competition, where we won 1st place, passing all 8 gates in just 30 seconds and outracing the 2nd-place team by a factor of 2! We combined deep networks, local VIO, Kalman filtering, and optimal control. Paper describing the approach.
June 18, 2018
Event cameras allow predicting the steering angle of a car more robustly and accurately at night and in high-dynamic-range scenes than a standard camera. Paper.
March 23, 2018
Watch the first ever autonomous quadrotor flight with an event camera using our UltimateSLAM. RAL'18 paper.
March 22, 2018
UltimateSLAM combines images, events, and IMU to achieve the ultimate visual SLAM performance: up to 85% accuracy improvement over VIO with standard cameras! Paper. Project webpage.
January 23, 2018
DroNet is a Deep Neural Network architecture that makes drones able to fly autonomously and safely in the streets of a city, among other vehicles, by imitating the behavior of cars and bicycles! Video, Paper, Datasets.
May 19, 2017
Check out our latest work on active exposure control for robust visual odometry in high dynamic range environments: ICRA'17 paper.
May 5, 2017
We are awarded the 2017 Misha Mahowald Prize, which recognizes outstanding achievement in the field of neuromorphic engineering. Press release.
April 4, 2017
We release the first public, large-scale dataset recorded with a drone in an urban environment at low altitudes (5-15m). Dataset here.
December 20, 2016
Check out our latest work, EVO, on Event-based, 6-DOF Parallel Tracking and Mapping in Real-time: RA-L'16 paper.
December 20, 2016
Check out our latest work on rotational motion estimation with an Event Camera: RA-L'16 paper.
September 27, 2016
Check out our latest work on agile quadrotor flight through narrow gaps with onboard sensing and computing: More info here.
September 12, 2016
Check out our latest work on Event-based Multi-View Stereo, which uses a single, continuously moving event camera for accurate 3D reconstruction! BMVC'16 paper.
August 19, 2016
Our latest work on search and rescue robotics is a system for training a terrain classifier "on-the-spot" in only 60 seconds. Our flying robot can then use this classifier to guide a ground robot through a disaster area. Details are in our ISER'16 paper.
July 19, 2016
We designed an event-based 6-DOF pose tracking pipeline with a latency of 1 microsecond using the DVS sensor for very high-speed (>500 deg/s) and high-dynamic-range (>130 dB) applications, where all standard cameras fail. All the details are in our Arxiv paper.
July 19, 2016
We designed an event-based 6-DOF visual odometry pipeline with a latency of 1 microsecond using the DAVIS sensor. All the details are in our IROS'16 paper and EBCCSP'16 paper.
February 10, 2016
We used Deep Neural Networks to teach our drones to recognize and follow forest trails to search for missing people. Journal Paper. More info.
May 25, 2016
Our active volumetric reconstruction software framework is now released open source. More details in our ICRA'16 paper.
March 30, 2015
Our latest work on failure recovery from aggressive flight and how to launch a quadrotor by throwing it in the air! ICRA'15 paper.
March 30, 2015
Our latest work on autonomous landing-site detection and landing with onboard monocular vision! ICRA'15 paper.
February 1, 2015
To celebrate our lab's 3-year anniversary, we summarize in this clip our main achievements, projects, awards, exhibitions, and upcoming videos!
October 24, 2014
Our latest work on Aerial-guided Navigation of a Ground Robot among Movable Obstacles. More details in our SSRR'14 paper.
October 7, 2014
Our latest work on Appearance-based Active, Monocular, Dense Reconstruction for Micro Aerial Vehicles. More details in our RSS'14 paper.
September 15, 2014
Our latest work on event-based vision: 6-DOF Pose Tracking for High-Speed Maneuvers. More details in our IROS'14 paper.
September 6, 2014
Our quadrotor demo trailer: autonomous navigation, live dense 3D reconstruction, and collaborative grasping.
June 6, 2014
Our demo at the KUKA Innovation Award that shows the collaboration of flying and ground robots for search-and-rescue missions.
February 19, 2014
SVO - our new visual odometry pipeline for MAV state estimation. More details in our ICRA'14 paper.
February 19, 2014
Our latest work on probabilistic, monocular dense reconstruction in real time. More details in our ICRA'14 paper.
February 19, 2014
Our monocular pose estimation system that is released as open-source. More details in our ICRA'14 paper.
November 26, 2013
Torque Control of a KUKA youBot Arm (Master thesis of Benjamin Keiser)
November 20, 2013
RPG was featured on the German-French TV channel ARTE in their science programme X:enius. The French version is available here.
August 7, 2013
Watch the video for our new IROS'13 paper "Air-Ground Localization and Map Augmentation Using Monocular Dense Reconstruction".
August 7, 2013
Check out our new IROS'13 paper "MAV Urban Localization from Google Street View Data".
August 7, 2013
Watch the video for our new IROS'13 paper "Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles".
October 25, 2012
Autonomous Vision-Controlled Micro Flying Robots: Davide Scaramuzza at TEDxZurich.