University of Zurich quadcopter

quad
https://github.com/uzh-rpg/rpg_quadrotor_control/wiki/RPG-Quadrotor-Setup. Aggressive-maneuver quadrotor. We are releasing our full quadrotor control framework as open source. This framework has been used in my lab for the past six years and has served as the basis for over 500 public live demonstrations and over 50 scientific publications from my lab:

https://twitter.com/davsca1/status/992088369680207872

http://rpg.ifi.uzh.ch/rpg_quadrotor_control.html

Tethered drone
Tethered quadcopter

Navigate streets autonomously
https://github.com/uzh-rpg/rpg_public_dronet; see also UAV forest-trail navigation

Visual odometry
https://github.com/uzh-rpg/rpg_svo

https://www.youtube.com/watch?v=2YnIMfw6bJY We propose a semi-direct monocular visual odometry algorithm that is precise, robust, and faster than current state-of-the-art methods. The semi-direct approach eliminates the need for costly feature extraction and robust matching techniques for motion estimation. Our algorithm operates directly on pixel intensities, which results in subpixel precision at high frame rates. A probabilistic mapping method that explicitly models outlier measurements is used to estimate 3D points, which results in fewer outliers and more reliable points. Precise and high frame-rate motion estimation brings increased robustness in scenes of little, repetitive, and high-frequency texture. The algorithm is applied to micro-aerial-vehicle state estimation in GPS-denied environments and runs at 55 frames per second on the onboard embedded computer and at more than 300 frames per second on a consumer laptop. We call our approach SVO (Semi-direct Visual Odometry) and release our implementation as open-source software.
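As a toy illustration of the direct-alignment idea behind SVO (minimizing a photometric error directly over pixel intensities with Gauss-Newton, rather than matching extracted features), the sketch below estimates a pure 2D translation between two synthetic frames. This is my own minimal example, not SVO's implementation: the Gaussian-blob "scene", function names, and the restriction to a 2D shift are all assumptions for illustration; the real pipeline aligns sparse feature patches under full 3D camera motion.

```python
import numpy as np

# Toy "scene": a smooth Gaussian intensity blob, evaluated analytically so we
# can sample it at subpixel positions (a stand-in for bilinear interpolation).
def ref_intensity(x, y):
    return np.exp(-((x - 20.0) ** 2 + (y - 20.0) ** 2) / 50.0)

T_TRUE = np.array([1.5, -0.8])  # ground-truth 2D shift between the frames

def cur_intensity(x, y):
    # The current frame is the reference scene shifted by T_TRUE.
    return ref_intensity(x - T_TRUE[0], y - T_TRUE[1])

def estimate_translation(pts, iters=15):
    """Gauss-Newton on the photometric error r_i = I_cur(p_i + t) - I_ref(p_i)."""
    t = np.zeros(2)
    eps = 1e-4  # step size for numeric image gradients
    for _ in range(iters):
        x, y = pts[:, 0] + t[0], pts[:, 1] + t[1]
        r = cur_intensity(x, y) - ref_intensity(pts[:, 0], pts[:, 1])
        # Jacobian of the residual w.r.t. t is the image gradient at the warped points.
        Jx = (cur_intensity(x + eps, y) - cur_intensity(x - eps, y)) / (2 * eps)
        Jy = (cur_intensity(x, y + eps) - cur_intensity(x, y - eps)) / (2 * eps)
        J = np.stack([Jx, Jy], axis=1)
        dt, *_ = np.linalg.lstsq(J, -r, rcond=None)  # normal-equations solve
        t += dt
    return t

# Sparse sample points around the blob (SVO similarly aligns small patches
# around sparse features instead of using the whole image).
gx, gy = np.meshgrid(np.arange(15.0, 26.0), np.arange(15.0, 26.0))
pts = np.stack([gx.ravel(), gy.ravel()], axis=1)

t_hat = estimate_translation(pts)
print(t_hat)  # converges to the true shift (1.5, -0.8)
```

Because the update uses pixel intensities and their gradients directly, the estimate is naturally subpixel, which is the property the abstract highlights.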

https://www.youtube.com/watch?v=fXy4P3nvxHQ Autonomous vision-based flight in the university hall at 4 m/s. In this video, the drone follows a figure-eight and a circular trajectory, respectively. The quadrotor uses only a single down-looking camera and an IMU as sensors. Visual SLAM (SVO 2.0), visual-inertial sensor fusion, planning, and control run fully onboard on a smartphone-class computer (Odroid U3, ARM Cortex-A9). SVO 2.0 takes 10 ms per frame on the Odroid. More information about our system is in the papers below.

M. Faessler, F. Fontana, C. Forster, D. Scaramuzza, Automatic Re-Initialization and Failure Recovery for Aggressive Flight with a Monocular Vision-Based Quadrotor, IEEE International Conference on Robotics and Automation (ICRA), Seattle, 2015.

M. Faessler, F. Fontana, C. Forster, E. Mueggler, M. Pizzoli, D. Scaramuzza, Autonomous, Vision-Based Flight and Live Dense 3D Mapping with a Quadrotor Micro Aerial Vehicle, Journal of Field Robotics, 2015.

C. Forster, M. Pizzoli, D. Scaramuzza, SVO: Fast Semi-Direct Monocular Visual Odometry, IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.

Links
Quad frames

Quadrocopter

SLAM