Student Projects


How to apply

To apply, please send your CV and your BSc and MSc transcripts by email to all the contacts indicated below the project description. Do not apply on SiROP. Since Prof. Davide Scaramuzza is affiliated with ETH, there is no organizational overhead for ETH students. Custom projects are occasionally available. If you would like to do a project with us but could not find an advertised project that suits you, please contact Prof. Davide Scaramuzza directly to ask for a tailored project (sdavide at ifi.uzh.ch).


Upon successful completion of a project in our lab, students may also have the opportunity to get an internship at one of our numerous industrial and academic partners worldwide (e.g., NASA/JPL, University of Pennsylvania, UCLA, MIT, Stanford, ...).



Flight Trajectory Modeling for Human-Piloted Drone Racing - Available

Description: In drone racing, human pilots navigate quadrotor drones as quickly as possible through a sequence of gates arranged in a 3D track. There is a large number of possible trajectories linking the gates, among which pilots have to choose. This project aims to identify the most common and most efficient flight trajectories used by human pilots. The student will collect flight trajectory data for various tracks from human pilots using a drone racing simulator, then analyze 3D trajectories and motion kinematics to identify trajectories that achieve the fastest lap times most consistently. Finally, the student will compare flight trajectories from human pilots to trajectories from a minimum-time planner used for autonomous navigation. Requirements: Experience in computer vision and machine learning; ability to code in Python/Matlab, Linux, C++, ROS; experience in 3D kinematic analysis is a plus.
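
To give a flavor of the kinematic analysis involved, here is a minimal Python sketch (illustrative, not project code) that computes lap time and speed statistics from a trajectory given as timestamped 3D position samples:

```python
import numpy as np

def lap_statistics(t, pos):
    """Compute speed profile and lap statistics from a sampled 3D trajectory.

    t   : (N,) array of timestamps in seconds
    pos : (N, 3) array of x, y, z positions in meters
    """
    dt = np.diff(t)
    vel = np.diff(pos, axis=0) / dt[:, None]      # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)
    return {
        "lap_time": t[-1] - t[0],
        "mean_speed": speed.mean(),
        "peak_speed": speed.max(),
        "path_length": np.linalg.norm(np.diff(pos, axis=0), axis=1).sum(),
    }

# Toy example: a drone moving at a constant 2 m/s along the x-axis for 5 s
t = np.linspace(0.0, 5.0, 51)
pos = np.stack([2.0 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
stats = lap_statistics(t, pos)
```

Real pilot data would of course be noisy, so smoothing before differentiation would be a sensible first extension.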

Goal: Extend knowledge about flight trajectory planning in human-piloted drone racing.

Contact Details: Please send your CV and transcripts (bachelor and master) to: Christian Pfeiffer (cpfeiffe (at) ifi (dot) uzh (dot) ch), Elia Kaufmann (ekaufmann (at) ifi (dot) uzh (dot) ch)

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Develop a Research-Grade Flight Simulator for Human-Piloted Drone Racing - Available

Description: The goal of this project is to develop a simulator for research on human-piloted drone racing. The student will integrate an existing drone racing simulator with custom software packages (ROS or Python) for logging drone state in 3D (i.e., position, rotation, velocity, acceleration), camera images, and control commands (i.e., thrust, yaw, pitch, roll). The student will create a custom GUI for changing quadrotor settings (i.e., weight, motor thrust, camera angle, rate profiles) and race track layouts (i.e., size, position, static vs. moving gates, number of gates, track type, and illumination). Finally, the features and performance of the integrated simulator will be compared to existing commercial and research-grade simulators. Requirements: Strong programming skills in Python, C++, C#; experience with Linux, ROS. Experience in Unity3D or Unreal Engine is a plus.
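
As a minimal illustration of the state-logging part, the hypothetical Python sketch below records timestamped samples and serializes them to CSV; the field layout is an illustrative assumption, and the real package would fill these fields from ROS topics or the simulator API:

```python
import csv
import io

# Hypothetical field layout for one logged sample; the actual simulator
# integration would populate these from ROS messages or the simulator API.
FIELDS = ["t", "px", "py", "pz", "qw", "qx", "qy", "qz",
          "thrust", "roll", "pitch", "yaw"]

class StateLogger:
    """Minimal in-memory logger that serializes samples to CSV."""
    def __init__(self):
        self.rows = []

    def log(self, **sample):
        # Missing fields default to 0.0 so partial messages still log.
        self.rows.append([sample.get(f, 0.0) for f in FIELDS])

    def to_csv(self):
        buf = io.StringIO()
        w = csv.writer(buf)
        w.writerow(FIELDS)
        w.writerows(self.rows)
        return buf.getvalue()

logger = StateLogger()
logger.log(t=0.0, px=1.0, py=2.0, pz=3.0, qw=1.0, thrust=0.5)
logger.log(t=0.01, px=1.1, py=2.0, pz=3.0, qw=1.0, thrust=0.6)
csv_text = logger.to_csv()
```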

Goal: The developed software package will be used for human-subjects research on first-person-view drone racing.

Contact Details: Please send your CV and transcripts (bachelor and master) to: Christian Pfeiffer (cpfeiffe (at) ifi (dot) uzh (dot) ch), Yunlong Song (song (at) ifi (dot) uzh (dot) ch)

Thesis Type: Semester Project

See project on SiROP

Visual Processing and Control in Human-Piloted Drone Racing - Available

Description: In drone racing, human pilots use visual information from a drone-mounted camera for selecting control commands they send to the drone via a remote controller. It is currently unknown how humans process visual information during fast and agile drone flight and how visual processing affects their choice of control commands. To answer these questions, this project will collect eye-tracking and control-command data from human pilots using a drone racing simulator. The student will use statistical modeling and machine learning to investigate the relationship between eye movements, control commands, and drone state. Requirements: Background in computer vision and machine learning, solid programming experience in Python; experience in eye-tracking and human subjects research is a plus (not mandatory).
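
One simple way to relate eye movements to control commands is to estimate the visuomotor lag by cross-correlation. The sketch below is a hypothetical illustration on synthetic one-dimensional signals, not part of the project pipeline:

```python
import numpy as np

def visuomotor_lag(gaze, command, dt):
    """Estimate the lag (in seconds) at which the control command best
    correlates with the gaze signal, via normalized cross-correlation.
    A positive lag means the command follows the gaze."""
    g = (gaze - gaze.mean()) / gaze.std()
    c = (command - command.mean()) / command.std()
    xcorr = np.correlate(c, g, mode="full")
    lag_samples = np.argmax(xcorr) - (len(g) - 1)
    return lag_samples * dt

# Synthetic example: the command is the gaze signal delayed by 0.2 s
dt = 0.01
rng = np.random.default_rng(0)
gaze = rng.standard_normal(1000)
command = np.roll(gaze, 20)          # 20 samples = 0.2 s delay
lag = visuomotor_lag(gaze, command, dt)
```

On real data one would work with 2D gaze coordinates and multivariate commands, but the lag-estimation idea carries over.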

Goal: Extend knowledge about visual processing and control in human-piloted drone racing.

Contact Details: Please send your CV and transcripts (bachelor and master) to: Christian Pfeiffer (cpfeiffe (at) ifi (dot) uzh (dot) ch), Antonio Loquercio (loquercio (at) ifi (dot) uzh (dot) ch)

Thesis Type: Semester Project / Master Thesis

See project on SiROP

3rd Person View Imitation Learning - Available

Description: Manually programming robots to carry out specific tasks is a difficult and time-consuming process. A possible solution to this problem is to use _imitation learning_, in which a robot aims to imitate a teacher, e.g., a human, that knows how to perform the task. Usually, the teacher and the learner share the same point of view on the problem. However, this assumption might not be necessary. As humans, for example, we learn to cook by watching others cook. During this project, we will explore the possibility of applying this kind of 3rd-person-view _imitation learning_ to flying robots on a navigation task.

Goal: The project aims to develop machine-learning-based techniques that will enable a drone to learn to fly by watching another robot fly.

Contact Details: **Antonio Loquercio**: loquercio@ifi.uzh.ch

Thesis Type: Semester Project / Bachelor Thesis / Master Thesis

See project on SiROP

Safe Simulation to Real World Transfer - Available

Description: Recent techniques based on machine learning have enabled robotic systems to perform many difficult tasks, such as manipulation or navigation. These techniques are usually very data-intensive and require simulators to generate enough training data. In this project, we will develop techniques to formalize the notion of simulation-to-reality transfer in a robotics setting. In particular, we are interested in finding performance guarantees to bound the performance drop usually encountered when deploying a policy trained in simulation on a physical platform.

Goal: The project aims to develop machine-learning-based techniques to maximize knowledge transfer between simulation and the real world on general robotic tasks.

Contact Details: **Antonio Loquercio** loquercio@ifi.uzh.ch

Thesis Type: Semester Project / Bachelor Thesis / Master Thesis

See project on SiROP

Safe Unsupervised Learning for Drone Perception and Control - Available

Description: Supervised learning is the gold-standard approach to computer vision tasks like classification, detection, or segmentation. However, for several interesting tasks (e.g., control of a drone), collecting large annotated datasets is a very tedious and costly process. In this project, we aim to build a system that lets a drone learn how to fly aggressively in a complex environment by interacting with its surroundings in a safe way. **Requirements**: Computer vision knowledge; programming experience with Python. Machine learning knowledge is a plus but is not required.

Goal: The goal of this project consists of building a system which can control a drone in complex environments.

Contact Details: Antonio Loquercio, _loquercio@ifi.uzh.ch_

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Target following on nano-scale UAV - Available

Description: Autonomous Unmanned Aerial Vehicles (UAVs) have numerous applications due to their agility and flexibility. However, navigation algorithms are computationally demanding, and it is challenging to run them on board nano-scale UAVs (i.e., a few centimeters in diameter). This project focuses on object tracking (i.e., target following) on such nano-UAVs. To do this, we will first train a Convolutional Neural Network (CNN) with data collected in simulation, and then run the network on a parallel ultra-low-power (PULP) processor, enabling flight with on-board sensing and computing only. **Requirements**: Knowledge of Python, C++, and embedded programming. Machine learning knowledge is a plus but is not strictly required.
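
As a toy illustration of closing the loop from detection to control (the actual network and controller are what the project develops), the sketch below maps a hypothetical bounding-box center from the CNN to a yaw-rate command that keeps the target centered; the gain and sign convention are illustrative assumptions:

```python
def follow_target_command(cx, image_width, k_yaw=1.5):
    """Map the horizontal pixel position of a detected target (e.g., the
    center of a CNN bounding box) to a yaw-rate command. cx is in pixels;
    the gain k_yaw and the sign convention are arbitrary here."""
    # Normalized horizontal offset in [-1, 1]: 0 when the target is centered
    offset = (cx - image_width / 2.0) / (image_width / 2.0)
    return -k_yaw * offset  # turn toward the target

cmd_centered = follow_target_command(cx=160, image_width=320)  # target centered
cmd_right = follow_target_command(cx=240, image_width=320)     # target right of center
```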

Contact Details: Antonio Loquercio, _loquercio@ifi.uzh.ch_

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Video Reconstruction from Events - Available

Description: Event cameras have a number of advantages over standard frame-based cameras, two of them being high dynamic range and high temporal resolution. In previous work, we have successfully reconstructed images from a stream of events (see: https://youtu.be/eomALySSGVU). Now, we want to take a step further and extend this pipeline to improve the overall quality of the reconstruction. Applications range from computational photography/videography to calibration of event cameras. This project requires previous experience in machine learning as well as at least one course in computer vision. During the project, you will have the opportunity to design novel deep learning architectures tailored to event-based vision and image reconstruction. Contact us for more details.
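
For intuition, the simplest (non-learned) baseline for recovering brightness from events is direct per-pixel integration of event polarities; the learned pipeline above goes far beyond this. A minimal sketch, assuming events are given as (timestamp, x, y, polarity) tuples:

```python
import numpy as np

def integrate_events(events, height, width, contrast=0.2):
    """Naive event integration: each event changes the log-brightness at
    its pixel by +/- contrast. events is an iterable of (t, x, y, polarity)
    tuples with polarity in {-1, +1}. Returns the accumulated
    log-brightness change since the start of the stream."""
    img = np.zeros((height, width), dtype=np.float64)
    for t, x, y, p in events:
        img[y, x] += contrast * p
    return img

# Two positive events and one negative event at the same pixel (x=3, y=2)
events = [(0.00, 3, 2, +1), (0.01, 3, 2, +1), (0.02, 3, 2, -1)]
frame = integrate_events(events, height=4, width=5)
```

Direct integration accumulates noise and drifts over time, which is precisely why learned reconstruction performs so much better.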

Goal: The goal of this project is to extract high-dynamic range, high-frame rate video from a stream of events.

Contact Details: Mathias Gehrig (mgehrig at ifi.uzh.ch)

Thesis Type: Master Thesis

See project on SiROP

Computational Photography and Videography - Available

Description: Computational photography is a hot topic in computer vision because it finds widespread application in mobile devices. Traditionally, the problem has been studied using frames from a single camera. Today, mobile devices feature multiple cameras and sensors that can be combined to push the frontier in computational photography and videography. In previous work (https://youtu.be/eomALySSGVU), we have successfully reconstructed high-speed, HDR video from events. In this project, we aim to combine information from a standard camera and an event camera to exploit their complementary nature. Applications range from high-speed, HDR video to deblurring and beyond. Contact us for more details.

Contact Details: Mathias Gehrig (mgehrig at ifi.uzh.ch); Daniel Gehrig (dgehrig at ifi.uzh.ch)

Thesis Type: Master Thesis

See project on SiROP

Optimization for Spiking Neural Networks - Available

Description: Spiking neural networks (SNNs) are closely inspired by the extremely efficient computation of brains. Unlike artificial neural networks, they process information using the precise timing of events/spikes. Together with event cameras, SNNs promise both lower latency and lower computational burden compared to artificial neural networks. In recent years, researchers have proposed several methods to estimate gradients of SNN parameters in a supervised learning context. In practice, many of these approaches rely on assumptions that lead to unknown consequences in the learning process. Requirements: Background in machine learning, especially deep learning; good programming skills; experience in CUDA is a plus.

Goal: In this project we aim to establish a principled framework for gradient-based optimization of spiking neural networks. As a first step, we evaluate recently proposed methods on real-world-relevant tasks. Next, we extend previous work to take into account previously ignored properties of spiking networks. Finally, the new approach will be compared to previous methods for validation. If progress allows, we will apply this approach to robotics and computer vision problems to demonstrate real-world applicability.
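
To make the gradient-estimation problem concrete: the spiking nonlinearity is a hard threshold whose derivative is zero almost everywhere, so a common workaround (one of the methods this project would scrutinize) is a "surrogate gradient" that keeps the step in the forward pass but substitutes a smooth derivative in the backward pass. A minimal NumPy sketch, with illustrative threshold and sharpness values:

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: a neuron spikes when its membrane potential crosses
    the threshold (a non-differentiable Heaviside step)."""
    return (v >= threshold).astype(np.float64)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    """Backward pass: replace the step's zero/undefined derivative with a
    smooth surrogate, here the fast-sigmoid form
    1 / (1 + beta * |v - threshold|)^2. beta controls sharpness."""
    return 1.0 / (1.0 + beta * np.abs(v - threshold)) ** 2

v = np.array([0.5, 1.0, 1.5])   # membrane potentials
spikes = spike_forward(v)        # -> [0., 1., 1.]
grads = spike_surrogate_grad(v)  # largest exactly at the threshold
```

The mismatch between the true (ill-defined) derivative and such surrogates is exactly the kind of assumption with unclear consequences mentioned above.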

Contact Details: Mathias Gehrig, mgehrig (at) ifi (dot) uzh (dot) ch; Daniel Gehrig, dgehrig (at) ifi (dot) uzh (dot) ch

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Perception Aware Model Predictive Control for Autonomous Power Line Tracking - Available

Description: Classical power line inspection and maintenance are dangerous, costly, and time-consuming. Drones could mitigate the risk for humans and minimize the cost, for the direct benefit of the power line infrastructure. Coupling perception and path planning with control has become increasingly popular in aerial vehicles. This project will investigate vision-based navigation approaches that tightly couple perception and control in order to satisfy the drone dynamics and compute feasible trajectories with respect to input saturation. This involves further developing research on perception-aware Model Predictive Control (MPC) for quadrotors and solving the challenging aspects of a power line inspection scenario. A perception-aware MPC approach would ideally improve the aerial vehicle's behavior during an approach maneuver.

Goal: Pursue research on a unified control and planning approach that integrates action and perception objectives. The approach must cope with challenging environmental conditions and with the morphology of the target, which imposes several points of interest on the optimizer.
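
To give a flavor of how a perception objective can enter the optimization, the sketch below adds to a standard tracking cost a term that penalizes the angle between the camera's optical axis and the bearing to a point of interest. The cost structure and weights are illustrative assumptions, not the project's actual MPC formulation:

```python
import numpy as np

def perception_aware_cost(pos, cam_dir, poi, ref, w_track=1.0, w_perc=0.5):
    """One stage cost of a hypothetical perception-aware MPC: tracking
    error plus a perception term measuring how far the point of interest
    (poi) lies from the camera's optical axis."""
    track = np.sum((pos - ref) ** 2)
    bearing = poi - pos
    bearing = bearing / np.linalg.norm(bearing)
    cam = cam_dir / np.linalg.norm(cam_dir)
    # 1 - cos(angle): 0 when the POI is centered, 2 when behind the camera
    perception = 1.0 - float(cam @ bearing)
    return w_track * track + w_perc * perception

# Camera looking straight at the power line point: perception term vanishes
cost_aligned = perception_aware_cost(
    pos=np.zeros(3), cam_dir=np.array([1.0, 0.0, 0.0]),
    poi=np.array([5.0, 0.0, 0.0]), ref=np.zeros(3))

# Same position, but the point of interest is 90 degrees off-axis
cost_side = perception_aware_cost(
    pos=np.zeros(3), cam_dir=np.array([1.0, 0.0, 0.0]),
    poi=np.array([0.0, 5.0, 0.0]), ref=np.zeros(3))
```

In a real MPC, this stage cost would be summed over the prediction horizon and minimized subject to the quadrotor dynamics and input constraints.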

Contact Details: Javier Hidalgo-Carrió (jhidalgocarrio@ifi.uzh.ch) and Yunlong Song (song@ifi.uzh.ch)

Thesis Type: Bachelor Thesis / Master Thesis

See project on SiROP

Power-Line Dataset for Autonomous Drone Inspection - Available

Description: Classical power line inspection and maintenance are dangerous, costly, and time-consuming. Drones could mitigate the risk for humans and minimize the cost, for the direct benefit of the infrastructure. Several sensing modalities have already been tested (e.g., RGB, LiDAR), giving drones the ability to operate in unstructured environments. Sensor fusion is a popular technique to get the best out of each sensor for autonomous navigation. Benchmarking perception strategies is a key part of solid and robust algorithm development before final deployment on the system. However, the lack of relevant and accurate multi-sensor data makes the Verification and Validation (V&V) process of perception algorithms difficult. The goal of this project is to deliver the first multi-sensor power-line inspection dataset for drones, with alternative sensory data and ground truth. Requirements: Background in robotics and autonomous systems (drone navigation preferable); excellent programming skills in C++ and Python; knowledge of ROS and robotic middleware; experience with Linux; passionate about robotics and engineering in general.

Goal: Release an open-access dataset for the evaluation of perception pipelines for autonomous drones, establishing a solid benchmark for autonomous drone inspection of power lines. The following sensors are considered for the dataset: absolute depth information, RGB images, an event-based camera, thermography, inertial sensory information, and ground-truth positioning.

Contact Details: Javier Hidalgo-Carrió (jhidalgocarrio@ifi.uzh.ch)

Thesis Type: Bachelor Thesis / Master Thesis

See project on SiROP

nanoSVO: Visual Inertial Odometry on a NanoPi - Available

Description: Classical VIO pipelines use geometric information to infer the ego-motion of the camera and couple this information with measurements from the IMU. VIO is an established technology used nowadays in many embedded applications, such as space exploration, drones, and VR/AR goggles. The next step is to run the VIO pipeline on a dedicated embedded computer in order to free the main computer for high-level perception and complex tasks. This project is about solving the challenging task of compiling, deploying, and testing the SVO algorithm on a NanoPi computer integrated on a quadrotor.

Goal: Generate a cross-compilation pipeline for cutting-edge visual-inertial odometry (SVO), with real field testing on drones. The target platform is an ARM Cortex-A quad-core family of processors. Continuous Integration (CI) of the cross-compilation pipeline is also desired.

Contact Details: Javier Hidalgo-Carrió (jhidalgocarrio@ifi.uzh.ch) and Thomas Laengle (tlaengle@ifi.uzh.ch)

Thesis Type: Semester Project / Bachelor Thesis

See project on SiROP

Data-Driven Quadrotor Simulation - Available

Description: Current quadrotor simulators used in research only model the simplified dynamics of quadrotors and typically do not account for aerodynamic effects encountered at high speeds. To push the limits of fast autonomous flight, access to a simulator that accurately models the platform in those regimes is very important. With access to the largest OptiTrack space in Zurich, the goal of this thesis is to record a dataset of very agile maneuvers and use it to accurately identify the quadrotor platform. After analyzing different identification methods, the best-performing candidate is used to build an accurate quadrotor dynamics model that is then integrated into a simulator. Applicants should have strong experience in C++ and Python programming and a robotics background.

Goal: Collect a dataset of high-speed maneuvers in the OptiTrack space and identify the quadrotor platform. Use this model to create a fast and accurate quadrotor simulation. Verify the simulator by comparing it to real-world data.
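
For intuition, a heavily reduced form of such an identification step is fitting a single drag coefficient to the residual between measured acceleration and the rigid-body model's prediction. The sketch below does this by least squares on synthetic data; the linear drag model and all constants are illustrative assumptions, not the model the thesis would end up with:

```python
import numpy as np

def fit_linear_drag(vel, acc_residual):
    """Fit a_res = -k * v by least squares across all axes, where a_res is
    the measured acceleration minus the rigid-body model prediction.
    Returns the scalar drag coefficient k."""
    v = vel.ravel()
    a = acc_residual.ravel()
    # Closed-form least squares for a = -k v:  k = -(v . a) / (v . v)
    return -float(v @ a) / float(v @ v)

# Synthetic "flight data" generated with k = 0.35 plus measurement noise
rng = np.random.default_rng(1)
vel = rng.uniform(-15.0, 15.0, size=(500, 3))        # velocities in m/s
acc = -0.35 * vel + 0.01 * rng.standard_normal((500, 3))
k_hat = fit_linear_drag(vel, acc)
```

Real identification would compare richer model classes (quadratic drag, rotor-induced effects, learned residuals), which is exactly the method-comparison step described above.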

Contact Details: Elia Kaufmann (ekaufmann@ifi.uzh.ch), Yunlong Song (song@ifi.uzh.ch)

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Reinforcement learning for 3D surgery planning of Femoral Head Reduction Osteotomy (in collaboration with Balgrist hospital) - Available

Description: Morbus Legg-Calvé-Perthes is a paediatric disorder of the lower extremities, causing deformities of the femoral head. Surgical treatment for this bone deformity can be achieved by a procedure known as femoral head reduction osteotomy (FHRO), which involves the resection of a wedge from the femoral head to restore the function of the joint. The preoperative planning of this procedure is a complex three-dimensional (3D) optimization problem involving more than 20 degrees of freedom (DoF), as it comprises the calculation of the surgical cuts and the repositioning of the resected fragment to the desired anatomical position. This process is currently done manually in collaboration between engineers and surgeons.

Goal: In the course of this master thesis, you will help us improve our current surgery planning methods by developing an approach to predict the repositioning of the fragment and the pose of the cutting planes defining the bone wedge. The objective of this master thesis is to apply deep (reinforcement) learning techniques to automatically find an optimal solution for the preoperative planning of FHRO. We will start by solving a simplified version of the optimization problem with a reduced DoF, involving only the calculation of the bone fragment repositioning, and we will gradually increase the DoF and the complexity of the task. This project is part of a bigger framework, which is currently under development in our clinic for optimal surgical outcomes. (The student will mainly work at the Balgrist CAMPUS) **Requirements:** Hands-on experience in reinforcement learning and deep learning. Strong coding skills in Python. Experience in mathematical optimization and spatial transformations is a plus.

Contact Details: Yunlong Song (song@ifi.uzh.ch), Joelle Ackermann (joelle.ackermann@balgrist.ch), Prof. Philipp Fuernstahl (philipp.fuernstahl@balgrist.ch)

Thesis Type: Master Thesis

See project on SiROP

Learning features for efficient deep reinforcement learning - Available

Description: The study of end-to-end deep learning in computer vision has mainly focused on developing useful object representations for image classification, object detection, or semantic segmentation. Recent work has shown that it is possible to learn temporally and geometrically aligned keypoints given only videos, and that object keypoints learned in an unsupervised manner can be useful for efficient control and reinforcement learning.

Goal: The goal of this project is to find out whether it is possible to learn useful features or intermediate representations for controlling mobile robots at high speed. For example, can we use the Transporter (a neural network architecture) to find useful features in an autonomous car racing environment? If so, can we use these features to discover an optimal control policy via deep reinforcement learning? **Required skills:** Python/C++, reinforcement learning, and deep learning skills.

Contact Details: Yunlong Song (song@ifi.uzh.ch) and Titus Cieslewski ( titus at ifi.uzh.ch )

Thesis Type: Semester Project / Master Thesis

See project on SiROP

Reinforcement Learning for Drone Racing - Available

Description: In drone racing, human pilots navigate quadrotor drones as quickly as possible through a sequence of gates arranged in a 3D track. Inspired by the impressive flight performance of human pilots, the goal of this project is to train a deep sensorimotor policy that can complete a given track as fast as possible. To this end, the policy directly predicts low-level control commands from noisy odometry data. Provided with an in-house drone simulator, the student investigates state-of-the-art reinforcement learning algorithms and reward designs for the task of drone racing. The ultimate goal is to outperform human pilots on a simulated track. Applicants should have strong experience in C++ and Python programming. A reinforcement learning and robotics background is required.

Goal: Find the fastest possible trajectory through a drone racing track using reinforcement learning. Investigate different reward formulations for the task of drone racing. Compare the resulting trajectory with other trajectory planning methods, e.g., model-based or optimization-based path planning algorithms.
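
One common reward-shaping idea for racing tasks, and a natural starting point for the formulations mentioned above, is to reward per-step progress toward the next gate. The sketch below is a hypothetical illustration; the weight and crash penalty are arbitrary values, not the project's actual reward:

```python
import numpy as np

def progress_reward(pos_prev, pos, gate_center, w_progress=1.0, crash=False):
    """Hypothetical per-step reward for drone racing: reward the reduction
    in distance to the next gate, with a large penalty for crashing."""
    if crash:
        return -10.0
    d_prev = np.linalg.norm(gate_center - pos_prev)
    d_now = np.linalg.norm(gate_center - pos)
    return w_progress * (d_prev - d_now)   # positive when closing in

gate = np.array([10.0, 0.0, 2.0])
# The drone moved 1 m straight toward the gate during this step
r = progress_reward(np.array([0.0, 0.0, 2.0]), np.array([1.0, 0.0, 2.0]), gate)
```

Pure progress rewards can encourage overly conservative flight, which is why comparing reward designs (e.g., adding time penalties or gate-passing bonuses) is part of the project.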

Contact Details: Yunlong Song (song (at) ifi (dot) uzh (dot) ch), Elia Kaufmann (ekaufmann (at) ifi (dot) uzh (dot) ch)

Thesis Type: Semester Project

See project on SiROP

Embedded systems development with NVIDIA Jetson TX2 for fast drone flying - Available

Description: The TX2 is a powerful computational unit with two Denver 64-bit CPUs plus a quad-core ARM Cortex-A57 complex and an NVIDIA Pascal™ architecture GPU. We use image processing and IMU data to deploy our machine learning algorithms in real-life experiments. This makes our robots fly autonomously without any help from external communication. In a first iteration, the objective is to have a fully functional connector board including power management, USB OTG, USB 3.0, two UARTs (one serial port), Ethernet, and a CSI camera connector. In a second iteration, we are redesigning the hardware, integrating our own Obvio board (time-synchronized IMU and RGB) and our own flight controller (integration of an ARM STM32 microcontroller).

Goal: Test and verify the existing prototype in collaboration with our lab engineers. Create a second iteration integrating a microcontroller for time synchronization of an IMU and a camera, and integrate our custom-built flight controller using D-Shot. Applicants should have a solid understanding of the Linux device tree for embedded ARM cores, some experience with PCB software (KiCad/Eagle), and solid knowledge of communication protocols (UART, USB, SPI, Ethernet).

Contact Details: Manuel Sutter (Systems Engineer BSc) msutter (at) ifi (dot) uzh (dot) ch

Thesis Type: Internship / Bachelor Thesis

See project on SiROP

3D Reconstruction using an Event Camera - Available

Description: Event cameras such as the Dynamic Vision Sensor (DVS) are recent sensors with large potential for high-speed and high-dynamic-range robotic applications. In particular, they have been used to generate high-speed video and for high-speed visual odometry. In this project we want to explore the possibility of using an event camera for asynchronous 3D reconstruction with very high temporal resolution. These properties are critical in applications such as fast obstacle avoidance and fast mapping. Applicants should have a background in C++ programming and low-level vision.

Goal: The goal of this project is to explore a 3D reconstruction method with an event camera.

Contact Details: Daniel Gehrig (dgehrig (at) ifi (dot) uzh (dot) ch), Mathias Gehrig (mgehrig (at) ifi (dot) uzh (dot) ch)

Thesis Type: Collaboration / Master Thesis

See project on SiROP

Learning an Event Camera - Available

Description: Event cameras such as the Dynamic Vision Sensor (DVS) are recent sensors with a lot of potential for high-speed and high-dynamic-range robotic applications. They have been successfully applied in many applications, such as high-speed video and high-speed visual odometry. In spite of this success, the exact operating principle of event cameras (that is, how events are generated from a given visual signal and how noise arises) is not well understood. In this work we want to explore new techniques for modelling the generation of events in an event camera, which would have wide implications for existing techniques. Applicants should have a background in C++ programming and low-level vision. In addition, familiarity with learning frameworks such as PyTorch or TensorFlow is required.

Goal: The goal of this project is to explore new techniques for modelling an event camera.

Contact Details: Daniel Gehrig (dgehrig (at) ifi (dot) uzh (dot) ch), Mathias Gehrig (mgehrig (at) ifi (dot) uzh (dot) ch)

Thesis Type: Semester Project / Internship / Master Thesis

See project on SiROP

Asynchronous Processing for Event-based Deep Learning - Available

Description: Event cameras such as the Dynamic Vision Sensor (DVS) are recent sensors with large potential for high-speed and high-dynamic-range robotic applications. Since their output is sparse, traditional algorithms, which are designed for dense inputs such as frames, are not well suited. The goal of this project is to explore ways to adapt existing deep learning algorithms to handle sparse, asynchronous event data. Applicants should have experience in C++ and Python deep learning frameworks (TensorFlow or PyTorch) and a strong background in computer vision.

Goal: The goal of this project is to explore ways to adapt existing deep learning algorithms to handle sparse, asynchronous event data.
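
A common starting point for bridging asynchronous events and dense networks (one of the baselines such a project might compare against) is discretizing the event stream into a voxel grid over time. The sketch below is a minimal illustration using nearest-bin assignment, without the temporal interpolation more refined variants use:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Discretize a stream of (t, x, y, polarity) events into a
    (num_bins, H, W) grid by binning time. This turns sparse asynchronous
    events into a dense tensor a standard network can consume."""
    grid = np.zeros((num_bins, height, width), dtype=np.float64)
    ts = np.array([e[0] for e in events], dtype=np.float64)
    t0, t1 = ts.min(), ts.max()
    scale = (num_bins - 1) / (t1 - t0) if t1 > t0 else 0.0
    for (t, x, y, p), tn in zip(events, (ts - t0) * scale):
        b = int(tn)  # nearest-lower-bin assignment (no interpolation)
        grid[b, y, x] += p
    return grid

# Three events spread over one second, binned into two temporal slices
events = [(0.0, 1, 1, +1), (0.5, 2, 2, -1), (1.0, 3, 3, +1)]
vox = events_to_voxel_grid(events, num_bins=2, height=4, width=4)
```

The density of this representation is also its drawback: it discards the asynchronous nature of the data, which is precisely what this project aims to preserve.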

Contact Details: Daniel Gehrig (dgehrig at ifi.uzh.ch)

Thesis Type: Semester Project / Master Thesis

See project on SiROP