Elias Mueggler

MSc ETH Zurich

Robotics and Perception Group

Department of Informatics

University of Zurich

Email: mueggler (at) ifi (dot) uzh (dot) ch

Office: Andreasstrasse 15, AND 2.16

I am a PhD student at the Robotics and Perception Group led by Prof. Davide Scaramuzza. Currently, I am working on event-based vision for high-speed robotics and air-ground robot collaboration. I received my Bachelor's and Master's degrees in Mechanical Engineering from ETH Zurich in 2010 and 2012, respectively. During my studies at ETH, I focused on robotics, dynamics, and computer vision. I wrote my Master thesis at MIT under the supervision of Prof. John J. Leonard on visual SLAM for space applications.

Link to my Google Scholar profile




Awards

  • Supervisor of Timo Horstschaefer, Winner of 2016 Fritz Kutter Award of ETH Zurich
  • Convergent Science Network of Biomimetics and Neurotechnology CapoCaccia Fellowship 2014
  • Supervisor of Basil Huber, Winner of 2014 Fritz Kutter Award of ETH Zurich
  • Supervisor of Benjamin Keiser, Winner of the 2013 KUKA Best Student Project
  • Hans und Wilma Stutz Foundation Scholarship 2012


Research Interests

Event-based Robot Vision


Unlike a standard CMOS camera, a Dynamic Vision Sensor (DVS) does not wastefully send full image frames at a fixed frame rate. Instead, similar to the human eye, it transmits only pixel-level brightness changes at the time they occur, with microsecond resolution, thus offering the possibility to create a perception pipeline whose latency is negligible compared to the dynamics of the robot. We exploit these characteristics to estimate the pose of a quadrotor with respect to a known pattern during high-speed maneuvers, such as flips with rotational speeds of up to 1,200 degrees per second. We presented this work at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in 2014, and it was also featured on IEEE Spectrum.
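The event-generation principle of a DVS pixel can be sketched in a few lines. Below is a minimal, illustrative Python model, not the sensor's actual circuitry: the `Event` tuple, the contrast threshold of 0.2, and the single-pixel simplification are my own assumptions. A pixel emits an event each time its log-intensity moves by more than a fixed contrast threshold away from the level at which it last fired.

```python
import math
from collections import namedtuple

# A single DVS event: microsecond timestamp, pixel coordinates, polarity (+1/-1).
Event = namedtuple("Event", ["t_us", "x", "y", "polarity"])

def events_from_log_intensity(samples, threshold=0.2):
    """Emit events whenever the log-intensity at one pixel has changed by
    more than `threshold` since the last emitted event.
    `samples` is a list of (t_us, intensity) pairs for a single pixel."""
    events = []
    ref = math.log(samples[0][1])  # log-intensity level at the last event
    for t_us, intensity in samples[1:]:
        log_i = math.log(intensity)
        # A large brightness step may cross the threshold several times.
        while abs(log_i - ref) >= threshold:
            pol = 1 if log_i > ref else -1
            ref += pol * threshold
            events.append(Event(t_us, 0, 0, pol))
    return events

# A brightening pixel (intensity doubles) yields a burst of positive events.
evs = events_from_log_intensity([(0, 1.0), (1000, 2.0)])
```

Note that no events at all are produced while the brightness stays constant, which is exactly why the data rate, and hence the latency, is driven by scene dynamics rather than by a fixed frame clock.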

Please find more details about our research on event-based vision here.

References

IJRR_Mueggler

E. Mueggler, H. Rebecq, G. Gallego, T. Delbruck, D. Scaramuzza

The Event-Camera Dataset and Simulator: Event-based Data for Pose Estimation, Visual Odometry, and SLAM
International Journal of Robotics Research (IJRR).

PDF (arXiv) YouTube Dataset



B. Kueng, E. Mueggler, G. Gallego, D. Scaramuzza

Low-Latency Visual Odometry using Event-based Feature Tracks

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, 2016.

PDF YouTube



D. Tedaldi, G. Gallego, E. Mueggler, D. Scaramuzza

Feature Detection and Tracking with the Dynamic and Active-pixel Vision Sensor (DAVIS)

International Conference on Event-Based Control, Communication and Signal Processing (EBCCSP), Krakow, 2016.

PDF


ECMR2015_Mueggler

E. Mueggler, N. Baumli, F. Fontana, D. Scaramuzza

Towards Evasive Maneuvers with Quadrotors using Dynamic Vision Sensors

European Conference on Mobile Robots (ECMR), Lincoln, 2015.

PDF


ISCAS15_Delbruck

T. Delbruck, M. Pfeiffer, R. Juston, G. Orchard, E. Müggler, A. Linares-Barranco, M. W. Tilden

Human vs. computer slot car racing using an event and frame-based DAVIS vision sensor

IEEE International Symposium on Circuits and Systems (ISCAS), Lisbon, 2015.

YouTube


RSS2015_Mueggler

E. Mueggler, G. Gallego, D. Scaramuzza

Continuous-Time Trajectory Estimation for Event-based Vision Sensors

Robotics: Science and Systems (RSS), Rome, 2015.

PDF


ICRA2015_Mueggler

E. Mueggler, C. Forster, N. Baumli, G. Gallego, D. Scaramuzza

Lifetime Estimation of Events from Dynamic Vision Sensors

IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

PDF


IROS2014_Mueggler

E. Mueggler, B. Huber, D. Scaramuzza

Event-based, 6-DOF Pose Tracking for High-Speed Maneuvers

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Chicago, 2014.

PDF YouTube


Air-Ground Collaboration




We develop strategies for aerial and ground robots to work together as a team. By doing so, the robots can profit from each other's capabilities. Our demonstration won the KUKA Innovation Award 2014 and was presented in a paper at the IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR) in 2014.




We released this video on Easter 2013. A quadrotor flies above a ground robot, looking for Easter eggs that lie on the ground. It then communicates the exact positions of these eggs to the ground robot so that all of them can be collected. This video was accepted for the video session at the International Joint Conference on Artificial Intelligence (IJCAI) 2013 in Beijing, China.


References

SSRR2014_Mueggler

E. Mueggler, M. Faessler, F. Fontana, D. Scaramuzza

Aerial-guided Navigation of a Ground Robot among Movable Obstacles

IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Toyako-cho, 2014.

PDF YouTube Presentation at AUTOMATICA


ICRA2014_Faessler

M. Faessler, E. Mueggler, K. Schwabe, D. Scaramuzza

A Monocular Pose Estimation System based on Infrared LEDs

IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.

PDF YouTube



Supervised Students

  • Timo Horstschaefer: Parallel Tracking, Depth Estimation, and Image Reconstruction with an Event Camera (Master Thesis 2016, Winner of the Fritz Kutter Award 2016)
  • Jonathan Huber: Ground Robot Localization in Aerial 3D Maps (Semester Thesis 2016)
  • Julia Nitsch: Terrain Classification in Search-and-Rescue Scenarios (NCCR Internship 2016)
  • Beat Kueng: Visual Odometry pipeline for the DAVIS camera (Master Thesis 2016) [ PDF ] [ Video ]
  • Mathis Kappeler: Exposure Control for Robust Visual Odometry (Master Project 2016)
  • Imanol Studer: Head Pose Tracking with Quadrotors (Master Project 2015)
  • Jon Lund: Towards SLAM for Dynamic Vision Sensors (Master Thesis 2015)
  • Micha Brunner: Flying Motion Capture System (Semester Thesis 2015)
  • Igor Bozic: High-Frequency Position Control of the KUKA youBot Arm (Master Project 2015)
  • Joachim Ott: Vision-Based Surface Classification for Micro Aerial Vehicles (Semester Thesis 2015)
  • David Tedaldi: Feature Tracking based on Frames and Events (Semester Thesis 2015) [ PDF ]
  • Nathan Baumli: Towards Evasive Maneuvers for Quadrotors using Stereo Dynamic Vision (Master Thesis 2015) [ PDF ]
  • Amos Zweig: Event-based Depth Estimation (Semester Thesis 2014)
  • Nathan Baumli: Event-Based Full-Frame Visualization (Semester Thesis 2014) [ PDF ]
  • Basil Huber: High-Speed Pose Estimation using a Dynamic Vision Sensor (Master Thesis 2014, Winner of the Fritz Kutter Award 2014) [ PDF ] [ Video ]
  • Karl Schwabe: A Monocular Pose Estimation System based on Infrared LEDs (Master Thesis 2013) [ PDF ] [ Video ] [ Code ]
  • Benjamin Keiser: Torque Control of a KUKA youBot Arm (Master Thesis 2013, Winner of the KUKA Best Student Project 2013) [ PDF ] [ Video ] [ Code ]


Previous Projects

Visual Mapping of Unknown Space Targets for Relative Navigation and Inspection (Master Thesis)

During my Master thesis at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT under the supervision of Prof. John Leonard, I implemented a visual mapping algorithm capable of creating a 3D model of an unknown and uncooperative space target, e.g., a satellite, using a stereo camera. The code was tested aboard the International Space Station (ISS). This algorithm will later be used for relative navigation, inspection, and docking maneuvers in space.
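The stereo geometry underlying such a mapping pipeline can be illustrated with the textbook pinhole relation Z = f·B/d: depth is focal length times baseline divided by disparity. This is a generic sketch, not the thesis code; the function name and parameters are placeholders.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo relation Z = f * B / d.

    disparity_px -- horizontal pixel offset of a feature between the two views
    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centers in meters
    Returns the depth of the feature in meters."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: a 10-pixel disparity with f = 500 px and a 10 cm baseline
# corresponds to a point about 5 m away.
z = depth_from_disparity(10.0, 500.0, 0.1)
```

The inverse dependence on disparity also explains why a short-baseline stereo rig, like one that fits a small free-flying satellite, is only accurate at close range.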
ISAIRAS12_Tweddle

B. Tweddle, E. Müggler, A. Saenz-Otero, D. Miller

The SPHERES VERTIGO goggles: vision based mapping and localization onboard the International Space Station

International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS), Turin, 2012.

PDF YouTube



Robotic calligraphy - A robot that learns how to write Chinese calligraphy (Semester Thesis)

I wrote my Semester thesis at the Institute for Dynamic Systems and Control at ETH Zurich on robotic calligraphy. During this project, I implemented a trajectory generator that enabled the robot to draw Chinese characters, used computer vision algorithms to compare the drawn characters with a reference from a textbook, and applied an iterative learning controller to improve the robot's next drawing.
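The iterative-learning idea can be sketched with a toy example (all names, the gains, and the scalar plant are hypothetical, for illustration only, not the controller from the thesis): after each drawing trial, the input trajectory is corrected by a fraction of the previous trial's tracking error, so the error shrinks geometrically from trial to trial.

```python
import math

def ilc_update(u, error, gain=0.5):
    """One iterative-learning-control step: correct each time step of the
    input trajectory with that step's tracking error from the last trial."""
    return [u_t + gain * e_t for u_t, e_t in zip(u, error)]

def run_trial(u, plant_gain=0.8):
    """Toy static plant: the output trajectory is a scaled input trajectory."""
    return [plant_gain * u_t for u_t in u]

# Reference stroke (a smooth curve) and an initially zero input trajectory.
reference = [math.sin(0.1 * t) for t in range(50)]
u = [0.0] * 50

for trial in range(30):
    y = run_trial(u)                                   # execute one drawing
    error = [r - y_t for r, y_t in zip(reference, y)]  # compare to reference
    u = ilc_update(u, error)                           # learn for next trial
```

With this plant and gain the per-step error contracts by the factor |1 − 0.5·0.8| = 0.6 each trial, so after 30 trials the drawing essentially matches the reference.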

IROS12_Huebel

N. Huebel, E. Mueggler, M. Waibel, R. D'Andrea

Towards Robotic Calligraphy

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vilamoura, 2012.

PDF YouTube