Agile Drone Flight through Narrow Gaps with Onboard Sensing and Computing




In this work, we address one of the main challenges towards autonomous drone flight in complex environments: flight through narrow gaps. One day, micro drones may be used to search for and rescue people in the aftermath of an earthquake. In these situations, collapsed buildings cannot be accessed through conventional windows, so small gaps may be the only way to get inside. What makes this problem challenging is that a gap can be very small, so that precise trajectory following is required, and can have an arbitrary orientation, so that the quadrotor cannot fly through it in near-hover conditions. This makes it necessary to execute an agile trajectory (i.e., with high velocities and angular accelerations) in order to align the vehicle with the gap orientation.

Previous works on aggressive flight through narrow gaps have focused solely on the control and planning problem, relying on motion-capture systems for state estimation and on external computing. Conversely, we focus on using only onboard sensing and computing. More specifically, we address the case where state estimation is done via gap detection through a single, forward-facing camera and show that this raises an interesting problem of coupled perception and planning: for the robot to localize with respect to the gap, the trajectory must guarantee that the quadrotor always faces the gap (perception constraint), and it must be replanned multiple times during execution to cope with the varying uncertainty of the state estimate. Furthermore, during the traverse, the quadrotor should maximize its distance from the edges of the gap (geometric constraint) to avoid collisions and, at the same time, it must do so without relying on any visual feedback, since the gap leaves the camera's field of view when the robot is very close to it. Finally, the trajectory must be feasible with respect to the dynamic constraints of the vehicle. Our proposed trajectory-generation approach is independent of the gap-detection algorithm being used; thus, to simplify the perception task, we used a gap with a simple black-and-white rectangular pattern.
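To illustrate the coupling described above, the sketch below shows one possible structure for such a replanning loop: the yaw reference is chosen so that the forward-facing camera keeps pointing at the estimated gap centre, and a new trajectory segment is planned whenever the gap estimate moves by more than a small threshold. This is a minimal sketch under our own assumptions; all names (GapEstimate, plan_segment, the callback arguments) are hypothetical placeholders, not the interface used in the paper.

```python
# Minimal sketch of a perception-aware replanning loop
# (hypothetical names; not the implementation from the paper).
import math
import time
from dataclasses import dataclass

@dataclass
class GapEstimate:
    x: float  # estimated gap centre [m], world frame (assumed)
    y: float
    z: float

def yaw_facing_gap(px, py, gap):
    """Perception constraint: yaw angle that keeps a forward-facing
    camera pointed at the gap centre."""
    return math.atan2(gap.y - py, gap.x - px)

def plan_segment(start, goal, yaw_ref, duration):
    """Placeholder for any trajectory planner (e.g. a polynomial
    segment) that respects the vehicle's dynamic limits."""
    return {"start": start, "goal": goal, "yaw": yaw_ref, "T": duration}

def replanning_loop(get_state, get_gap_estimate, execute,
                    rate_hz=30.0, replan_threshold=0.05):
    """Replan while the gap is visible; once it leaves the field of
    view, fall back to executing the last plan open loop (IMU only)."""
    last_gap = None
    while True:
        px, py, pz = get_state()
        gap = get_gap_estimate()  # None once the gap is out of view
        if gap is None:
            break
        moved = (last_gap is None or
                 math.dist((gap.x, gap.y, gap.z),
                           (last_gap.x, last_gap.y, last_gap.z))
                 > replan_threshold)
        if moved:
            yaw = yaw_facing_gap(px, py, gap)
            execute(plan_segment((px, py, pz),
                                 (gap.x, gap.y, gap.z),
                                 yaw, duration=1.0))
            last_gap = gap
        time.sleep(1.0 / rate_hz)
```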

We successfully evaluated our approach with gap orientations of up to 45 degrees vertically and up to 30 degrees horizontally. Our vehicle weighs 830 grams and has a thrust-to-weight ratio of 2.5. Our trajectory-generation formulation handles gap orientations of up to 90 degrees, although the quadrotor used in these experiments is too heavy and its motors saturate for gap orientations beyond 45 degrees. The vehicle reaches speeds of up to 3 meters per second and angular velocities of up to 400 degrees per second, with accelerations of up to 1.5 g. We can pass through gaps only 1.5 times the size of the quadrotor, with just 10 centimeters of tolerance. Our method does not require any prior knowledge about the position and orientation of the gap, and no external infrastructure, such as a motion-capture system, is needed. This is the first time that such an aggressive maneuver through narrow gaps has been achieved by fusing gap detection from a single onboard camera with IMU measurements.

References


D. Falanga, E. Mueggler, M. Faessler, D. Scaramuzza

Aggressive Quadrotor Flight through Narrow Gaps with Onboard Sensing and Computing

Under review.

PDF arXiv YouTube





Automatic Re-Initialization and Failure Recovery




High-resolution photos can be found here.


With drones becoming more and more popular, safety is a big concern. A critical situation occurs when a drone temporarily loses its GPS position information, which might lead it to crash. This can happen, for instance, when flying close to buildings where GPS signal is lost. In such situations, it is desirable that the drone can rely on fall-back systems and regain stable flight as soon as possible.

We developed a new technology to automatically recover and stabilize a quadrotor from any initial condition. On the one hand, this technology allows a quadrotor to be launched by simply tossing it in the air, like a baseball. On the other hand, it allows a quadrotor to recover into stable flight after a system failure. Since this technology does not rely on any external infrastructure, such as GPS, it enables the safe use of drones in both indoor and outdoor environments. Thus, it can become relevant for commercial uses of drones, such as parcel delivery.

Our quadrotor is equipped with a single camera, an inertial measurement unit, and a distance sensor (TeraRanger One). The stabilization system of the quadrotor emulates the human visual system and sense of balance. As soon as a toss or a failure is detected, our computer-vision software analyses the images, looking for distinctive landmarks in the environment, which it uses to restore balance.
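As a rough illustration of that landmark-search step, the snippet below uses OpenCV's FAST corner detector as a stand-in for the onboard feature pipeline; the actual system runs a full visual-odometry front end, and the function and parameter names here (detect_landmarks, max_features) are illustrative only.

```python
# Illustrative only: FAST corners as a stand-in for the onboard
# feature pipeline used during re-initialization.
import cv2

def detect_landmarks(gray_image, max_features=200):
    """Return up to max_features distinctive corner locations (x, y)."""
    detector = cv2.FastFeatureDetector_create(threshold=20)
    keypoints = detector.detect(gray_image, None)
    # Keep only the strongest corners so tracking stays cheap on the
    # onboard smartphone processor.
    keypoints = sorted(keypoints, key=lambda k: k.response, reverse=True)
    return [k.pt for k in keypoints[:max_features]]

# Usage (assuming `gray_frame` is a grayscale camera image):
# landmarks = detect_landmarks(gray_frame)
```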

All the image processing and control runs on a smartphone processor onboard the drone. The onboard sensing and computation renders the drone safe and able to fly unaided. This allows the drone to fulfil its mission without any communication or interaction with the operator.

The recovery procedure consists of multiple stages: the quadrotor first stabilizes its attitude and altitude, then re-initializes its visual state-estimation pipeline, and finally stabilizes its position fully autonomously. To experimentally demonstrate the performance of our system, in the video we aggressively throw the quadrotor into the air by hand and have it recover and stabilize all by itself. We chose this example because it simulates conditions similar to failure recovery during aggressive flight. Our system recovered successfully in several hundred throws, in both indoor and outdoor environments.
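A minimal sketch of that staged logic is shown below, assuming a simple state machine in which each stage hands over to the next once its own estimator reports convergence; the stage names and the *_ok flags are illustrative assumptions, not the authors' interfaces.

```python
# Minimal sketch (assumed structure, not the authors' code) of the
# staged recovery: attitude first, then altitude, then visual
# re-initialization, then full position hold.
from enum import Enum, auto

class RecoveryStage(Enum):
    ATTITUDE = auto()       # level the vehicle using the IMU only
    ALTITUDE = auto()       # hold height using the distance sensor
    VISION_INIT = auto()    # re-initialize the visual pipeline
    POSITION_HOLD = auto()  # autonomous hover on vision + IMU

def recovery_step(stage, attitude_ok, altitude_ok, vision_ok):
    """Advance the state machine one step; each *_ok flag would come
    from the corresponding onboard estimator."""
    if stage is RecoveryStage.ATTITUDE and attitude_ok:
        return RecoveryStage.ALTITUDE
    if stage is RecoveryStage.ALTITUDE and altitude_ok:
        return RecoveryStage.VISION_INIT
    if stage is RecoveryStage.VISION_INIT and vision_ok:
        return RecoveryStage.POSITION_HOLD
    return stage  # stay in the current stage until its condition holds
```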


References

M. Faessler, F. Fontana, C. Forster, D. Scaramuzza

Automatic Re-Initialization and Failure Recovery for Aggressive Flight with a Monocular Vision-Based Quadrotor

IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

PDF YouTube


M. Faessler, F. Fontana, C. Forster, E. Mueggler, M. Pizzoli, D. Scaramuzza

Autonomous, Vision-based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle

Journal of Field Robotics, 2015.

PDF YouTube1 YouTube2 YouTube3 YouTube4 Software