Vision-Based Autonomous Navigation for Micro Aerial Vehicles : 소형 무인비행체의 영상기반 자동항법

Authors

임현

Advisor
김현진
Major
College of Engineering, Department of Mechanical and Aerospace Engineering
Issue Date
2015-02
Publisher
Graduate School, Seoul National University
Keywords
Visual Navigation; Micro Aerial Vehicles; Online 3D Reconstruction; Structure from Motion; Image-based Localization; Simultaneous Localization and Mapping; Vision-based Control
Description
Thesis (Ph.D.) -- Graduate School, Seoul National University: Department of Mechanical and Aerospace Engineering, February 2015. Advisor: 김현진.
Abstract
In this thesis, an extensible framework for visual navigation of micro aerial vehicles (MAVs) is presented. Throughout the thesis, the design of a scalable visual navigation system and its application to MAVs are discussed. Together, the components presented in this thesis form a MAV visual navigation system that allows fully autonomous control without external positioning sensors such as GPS or a motion capture system.
The contributions of this thesis can be summarized in three parts.

First, the problem of localizing a monocular camera from input video, called image-based localization, is addressed. Prior knowledge of a 3D scene model, constructed offline by the structure-from-motion (SfM) technique, is exploited. From live input video, the proposed method continuously computes the 6-DoF camera pose by efficiently tracking natural features and matching them to the 3D points reconstructed by SfM.
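The matching-and-pose step described above can be illustrated with a minimal sketch: live 2D features are matched against descriptors stored with the offline SfM point cloud, and the 6-DoF pose is recovered robustly from the resulting 2D-3D correspondences. The ORB features, brute-force matcher, and PnP-RANSAC used here (via OpenCV) are illustrative stand-ins rather than the exact components of the thesis, and model_points_3d, model_descriptors, and the calibration matrix K are assumed inputs.

```python
import numpy as np
import cv2

def localize_frame(frame_gray, model_points_3d, model_descriptors, K, dist_coeffs):
    """Sketch of image-based localization against an offline SfM model.

    model_points_3d   : (N, 3) array of 3D points reconstructed offline by SfM
    model_descriptors : (N, 32) binary descriptors associated with those points
    K, dist_coeffs    : camera calibration (assumed known)
    """
    # Detect and describe natural features in the live frame.
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Match live 2D features against descriptors of the offline 3D model.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, model_descriptors)
    if len(matches) < 6:
        return None

    pts_2d = np.float32([keypoints[m.queryIdx].pt for m in matches])
    pts_3d = np.float32([model_points_3d[m.trainIdx] for m in matches])

    # Robustly estimate the 6-DoF camera pose from 2D-3D correspondences.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts_3d, pts_2d, K, dist_coeffs, reprojectionError=3.0)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)   # rotation: world -> camera
    return R, tvec               # camera extrinsics for this frame
```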

Secondly, visual simultaneous localization and mapping (SLAM), a method in which a camera is localized within an unknown scene while a map is being built, is considered. The proposed method is applied to three types of visual sensors: monocular, stereo, and RGB-D cameras. It continuously computes the current 6-DoF camera pose and 3D landmark positions from input video, and successfully builds consistent maps from indoor and outdoor sequences using a camera as the only sensor. Large-scale loop closing is demonstrated based on a relative metric-topological representation of the pose graph, which effectively corrects pose drift. The method runs in real time on an onboard computer and is used to control the position of a quadrotor MAV without external positioning sensors.
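A toy 2D example can illustrate the idea behind relative pose-graph loop closing: keyframe poses are composed from relative transforms between consecutive keyframes, a loop-closure constraint exposes the drift accumulated around the loop, and that drift is spread back over the chain. The thesis works in 6-DoF with a proper graph formulation; the naive proportional correction below is only a sketch of the concept, not the method used.

```python
import numpy as np

def se2(x, y, theta):
    """Build a 2D rigid transform as a 3x3 homogeneous matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def params(T):
    """Recover (x, y, theta) from an SE(2) matrix."""
    return np.array([T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])])

# Odometry edges: relative motion between consecutive keyframes.
# A square loop with a small rotational drift per edge (toy data).
edges = [se2(1.0, 0.0, np.pi / 2 + 0.02) for _ in range(4)]

# Compose edges into absolute poses; ideally the last pose equals the first.
poses = [np.eye(3)]
for T in edges:
    poses.append(poses[-1] @ T)

# Loop closure: the residual transform that would bring the last pose back
# onto the first one, i.e. the drift accumulated around the loop.
residual = params(np.linalg.inv(poses[-1]))

# Naive drift distribution: apply a growing fraction of the residual along
# the chain, so the final pose closes the loop exactly.
corrected = []
for i, T in enumerate(poses):
    frac = i / (len(poses) - 1)
    dx, dy, dth = frac * residual
    corrected.append(T @ se2(dx, dy, dth))
```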

Thirdly, an optical-flow-based velocity estimation method is proposed. This method utilizes optical flow to estimate the translational velocity of a MAV for controlling its translational motion. Autonomous hovering flight control of a MAV using an optical flow sensor is implemented on a low-cost microprocessor without external positioning sensors. Experimental results from flight tests are validated against ground-truth data provided by a high-accuracy motion capture system. I believe this work brings MAVs a step closer to autonomous operation in many useful areas, such as indoors, where global positioning is not available.
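The velocity estimate can be sketched with the usual pinhole relation: after removing the rotation-induced component of the flow using gyro rates, the remaining translational flow is scaled by height over focal length to obtain metric velocity. The function below is an illustrative sketch only; the sensor model, axis and sign conventions, and filtering in the thesis may differ, and all parameter names are assumptions.

```python
import numpy as np

def flow_to_velocity(flow_px, dt, height_m, gyro_rad_s, focal_px):
    """Convert raw pixel flow into metric translational velocity (sketch).

    flow_px    : (flow_x, flow_y) pixel displacement measured over dt seconds
    height_m   : height above ground, e.g. from a sonar or range sensor
    gyro_rad_s : (p, q) roll/pitch rates used to remove rotation-induced flow
    focal_px   : sensor focal length expressed in pixels
    """
    flow_rate = np.asarray(flow_px, dtype=float) / dt   # pixels per second

    # Rotation about the camera axes induces apparent flow of roughly
    # focal_length * angular_rate; subtract it so only translational flow
    # remains (sign convention assumed, depends on sensor mounting).
    rot_flow = focal_px * np.array([gyro_rad_s[1], -gyro_rad_s[0]])
    trans_flow = flow_rate - rot_flow

    # Pinhole model: metric velocity scales with height over focal length.
    v_xy = trans_flow * height_m / focal_px
    return v_xy                                          # (v_x, v_y) in m/s
```
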
Language
English
URI
https://hdl.handle.net/10371/118457