SVO 2.0: Fast Semi-Direct Visual Odometry
for Monocular, Wide Angle, and Multi-camera Systems
Download
SVO 2.0 Binaries (x86_64/armhf) Available Here
This package also contains examples showing how to use SVO 2.0 with monocular, stereo, and stereo/monocular + IMU setups, as well as pinhole, fisheye, and catadioptric cameras.
GitHub repository of the example code
The example code is already included in the package above. However, if you encounter technical problems, you can open an issue in this GitHub repository.
Description
SVO 2.0 (IEEE TRO'17) extends the original SVO implementation (ICRA'14) with edgelets, an IMU prior, wide-angle cameras (fisheye and catadioptric), multi-camera configurations, and forward-looking camera motion.
What is SVO? SVO uses a semi-direct paradigm to estimate the 6-DOF motion of a camera system from both pixel intensities (direct methods) and features (feature-based methods): it avoids the time-consuming feature extraction and matching procedures of purely feature-based pipelines, while achieving higher accuracy by using the pixel intensities directly.
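To make the semi-direct idea more concrete, the sketch below illustrates the "direct" part: small patches around sparse feature locations are aligned by minimizing a photometric error with Gauss-Newton, instead of matching descriptors. This is a minimal toy example in Python/NumPy, not SVO's code or API; the function names (`sparse_image_alignment`, `bilinear`) and the restriction to a 2-D image translation are assumptions made for brevity, whereas SVO optimizes the full 6-DOF pose of the camera (or camera rig).

```python
# Toy sketch (assumed example, not SVO's actual code or API): sparse direct
# image alignment. Patches around sparse feature locations are aligned by
# minimizing the photometric error with Gauss-Newton. The motion here is a
# 2-D translation for brevity; SVO optimizes the full 6-DOF pose.
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at fractional coordinates (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    a, b = x - x0, y - y0
    return ((1 - a) * (1 - b) * img[y0, x0] + a * (1 - b) * img[y0, x0 + 1] +
            (1 - a) * b * img[y0 + 1, x0] + a * b * img[y0 + 1, x0 + 1])

def sparse_image_alignment(ref, cur, features, half_patch=2, iters=15):
    """Estimate the translation mapping ref onto cur by minimizing the
    photometric error of patches around the given feature locations."""
    t = np.zeros(2)  # unknown translation (tx, ty)
    offsets = [(dx, dy) for dx in range(-half_patch, half_patch + 1)
                        for dy in range(-half_patch, half_patch + 1)]
    for _ in range(iters):
        H, g = np.zeros((2, 2)), np.zeros(2)
        for fx, fy in features:
            for dx, dy in offsets:
                xr, yr = fx + dx, fy + dy        # pixel in the reference image
                xc, yc = xr + t[0], yr + t[1]    # warped pixel in the current image
                r = bilinear(cur, xc, yc) - ref[yr, xr]  # photometric residual
                # Jacobian of the residual = image gradient of the current image
                gx = 0.5 * (bilinear(cur, xc + 1, yc) - bilinear(cur, xc - 1, yc))
                gy = 0.5 * (bilinear(cur, xc, yc + 1) - bilinear(cur, xc, yc - 1))
                J = np.array([gx, gy])
                H += np.outer(J, J)
                g += J * r
        t -= np.linalg.solve(H, g)               # Gauss-Newton step
    return t

# Synthetic sanity check: shift a smooth image by a known amount and recover it.
def pattern(x, y):
    return np.sin(0.10 * x + 0.05 * y) + np.cos(0.08 * y - 0.03 * x)

ys, xs = np.mgrid[0:120, 0:160].astype(float)
true_t = np.array([3.2, -1.7])
ref = pattern(xs, ys)
cur = pattern(xs - true_t[0], ys - true_t[1])    # cur(x) = ref(x - true_t)
features = [(x, y) for x in range(20, 150, 20) for y in range(20, 110, 20)]
print("estimated:", sparse_image_alignment(ref, cur, features), "true:", true_t)
```

In SVO the same kind of photometric objective is used, but the warp projects 3-D map points through the estimated 6-DOF pose and the camera model (pinhole, fisheye, or catadioptric), optionally with an IMU prior on the motion.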
SVO is both versatile and efficient. First, it works with different types of cameras, from common projective cameras to catadioptric ones, and it supports stereo and multi-camera setups, so it can be tailored to different scenarios. Second, SVO requires very few computational resources compared to most existing algorithms: it reaches up to 400 frames per second on an i7 processor (while using less than two cores!) and up to 100 fps on a smartphone processor (e.g., an Odroid XU4).
Processing time in milliseconds on a laptop with an Intel Core i7 (2.80 GHz) processor.
(For more details about the performance of SVO 2.0, please refer to the SVO 2.0 paper: IEEE TRO'17).
Due to its flexibility and efficiency, SVO has been successfully used since 2014 in a variety of applications, including state estimation for micro aerial vehicles, automotive systems, and virtual reality. In all these applications, SVO ran fully onboard! A non-comprehensive list is given below.
Autonomous Drone Navigation
20 m/s flight on DARPA FLA drone
Fast autonomous flight
Autonomous landing
Automatic recovery after tossing drone in the air
Selfie drone in 3D
Aerial-ground collaboration
Search and rescue
On-the-spot training
Air-ground localization
Collaborative Visual SLAM (up to 3 drones)
Automotive
Car with four cameras
ZurichEye (lab spinoff)
3D Scanning
Dense reconstruction
Flying 3D scanner
Commercial Applications
Inside-out tracking with SVO + IMU on an iPhone (Dacuda)
Room-scale VR with a smartphone (Dacuda)
Automotive car navigation (lab spinoff ZurichEye, now Facebook-Oculus VR Zurich)
References
SVO: Semi-Direct Visual Odometry for Monocular and Multi-Camera Systems
IEEE Transactions on Robotics, Vol. 33, Issue 2, pages 249-265, Apr. 2017.
Includes comparison against ORB-SLAM, LSD-SLAM, and DSO and comparison among Dense, Semi-dense, and Sparse Direct Image Alignment.
SVO: Fast Semi-Direct Monocular Visual Odometry
IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.