Vision-Based Autonomous Navigation for Micro Aerial Vehicles : 소형 무인비행체의 영상기반 자동항법
- College of Engineering, Department of Mechanical and Aerospace Engineering
- Issue Date: 2015-02
- Graduate School, Seoul National University
- Visual Navigation ; Micro Aerial Vehicles ; Online 3D Reconstruction ; Structure from Motion ; Image-based Localization ; Simultaneous Localization and Mapping ; Vision-based Control
- Thesis (Ph.D.) -- Graduate School, Seoul National University: Department of Mechanical and Aerospace Engineering, February 2015. Advisor: H. Jin Kim.
- In this thesis, an extensible framework for visual navigation of micro aerial vehicles (MAVs) is presented. Throughout the thesis, the design of a scalable visual navigation system and its application to MAVs are discussed. Together, the components presented in this thesis form a MAV visual navigation system that allows fully autonomous flight without external positioning sensors such as GPS or a motion capture system.
The contributions of this thesis can be summarized in three parts.
First, the problem of localizing a monocular camera from input video, called image-based localization, is addressed. The method exploits prior knowledge of a 3D scene model constructed offline by the structure-from-motion (SfM) technique. From live input video, the proposed method continuously computes the 6-DoF camera pose by efficiently tracking natural features and matching them to the 3D points reconstructed by SfM.
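The geometric core of this step, recovering a camera pose from 2D-3D correspondences (the PnP problem), can be sketched as a minimal Direct Linear Transform solver. This is an illustrative NumPy sketch, not the thesis's implementation; a practical pipeline would pair such a solver with RANSAC to reject feature mismatches.

```python
import numpy as np

def pnp_dlt(pts3d, pts2d):
    """Recover camera pose (R, t) from n >= 6 2D-3D correspondences
    via the Direct Linear Transform. pts2d must be in normalized
    image coordinates (camera intrinsics already removed)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Each correspondence gives two linear constraints on the
        # 3x4 projection matrix P (12 unknowns, up to scale).
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)                # null-space solution, up to scale
    P /= np.cbrt(np.linalg.det(P[:, :3]))   # fix overall scale and sign
    U, _, Vr = np.linalg.svd(P[:, :3])      # project left 3x3 onto SO(3)
    return U @ Vr, P[:, 3]
```

With noise-free correspondences this recovers the exact pose; in a live system each frame's tracked features would supply the 2D points and the SfM model the 3D points.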
Second, visual simultaneous localization and mapping (SLAM), in which a camera is localized within an unknown scene while a map is built, is considered. The proposed method is applied to three types of visual sensors: monocular, stereo, and RGB-D cameras. It continuously computes the current 6-DoF camera pose and 3D landmark positions from input video, and successfully builds consistent maps from indoor and outdoor sequences using a camera as the only sensor. Large-scale loop closing is demonstrated based on a relative metric-topological representation of the pose graph, which effectively corrects pose drift. The proposed method runs in real time on an onboard computer and is used to control the position of a quadrotor MAV without external positioning sensors.
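The drift-correction idea behind loop closing can be illustrated with a deliberately simplified toy: when a loop closure reveals that the current pose should coincide with an earlier one, the accumulated error is redistributed along the trajectory. The sketch below spreads the drift linearly over a chain of planar (x, y, heading) poses; it is a hypothetical stand-in, not the thesis's relative metric-topological pose-graph optimization, which solves a full nonlinear least-squares problem over all constraints.

```python
import numpy as np

def distribute_loop_error(poses):
    """Toy loop closing for a chain of (x, y, heading) poses.

    The loop closure says the last pose should coincide with the first,
    so the drift revealed by the closure is spread linearly along the
    chain (valid only for small drift; heading is not wrapped)."""
    poses = np.asarray(poses, dtype=float)
    drift = poses[-1] - poses[0]                    # error exposed by the loop closure
    w = np.linspace(0.0, 1.0, len(poses))[:, None]  # 0 at the start, 1 at the end
    return poses - w * drift
```

After correction the trajectory closes exactly, while earlier poses are perturbed in proportion to how far along the loop they lie.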
Third, an optical-flow-based velocity estimation method is proposed. This method uses optical flow to estimate the translational velocity of a MAV in order to control its translational motion. Autonomous hovering flight control of a MAV using an optical flow sensor is implemented on a low-cost microprocessor without external positioning sensors. Experimental results from flight tests are validated against ground-truth data provided by a high-accuracy motion capture system. I believe this work brings MAVs a step closer to autonomous operation in many useful areas, such as indoor environments where global positioning is not available.
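For a downward-facing camera over flat ground, the essence of flow-based velocity estimation is that pixel flow mixes a translational term, scaled by height over focal length, with a rotational term predicted by the gyro. The sketch below assumes a particular axis convention for the rotation-compensation term; it is illustrative only and not the thesis's formulation, since signs must be calibrated to the actual camera and body frames.

```python
import numpy as np

def flow_to_velocity(flow_px, body_rates, height, focal_px, dt):
    """Estimate MAV translational velocity (m/s) from mean optical flow.

    flow_px    : mean image flow over one frame, in pixels (u, v)
    body_rates : (roll_rate, pitch_rate) from the gyro, rad/s
    height     : distance to the ground plane, m
    focal_px   : focal length in pixels
    dt         : frame interval, s
    Model: flow = -v * f * dt / h + rotational flow. The rotational
    term's axis mapping below is an assumed convention."""
    wx, wy = body_rates
    rot_flow = focal_px * dt * np.array([wy, -wx])   # gyro-predicted flow
    trans_flow = np.asarray(flow_px, dtype=float) - rot_flow
    return -trans_flow * height / (focal_px * dt)
```

Subtracting the gyro-predicted component before scaling is what lets the estimator distinguish true translation from flow induced by pitching and rolling.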