
Detailed Information

Vision-Based Autonomous Navigation for Micro Aerial Vehicles : 소형 무인비행체의 영상기반 자동항법

dc.contributor.advisor: 김현진
dc.contributor.author: 임현
dc.date.accessioned: 2017-07-13T06:19:57Z
dc.date.available: 2017-07-13T06:19:57Z
dc.date.issued: 2015-02
dc.identifier.other: 000000026613
dc.identifier.uri: https://hdl.handle.net/10371/118457
dc.description: Doctoral thesis -- Graduate School of Seoul National University: Department of Mechanical and Aerospace Engineering, February 2015. Advisor: 김현진.
dc.description.abstract: In this thesis, an extensible framework for visual navigation of micro aerial vehicles (MAVs) is presented. Throughout the thesis, the design of a scalable visual navigation system and its application to MAVs are discussed. The components presented in this thesis form a MAV visual navigation system that allows fully autonomous control without external positioning sensors such as GPS or a motion capture system.
The contributions of this thesis can be summarized in three parts.

First, the problem of localizing a monocular camera from input video, known as image-based localization, is addressed. Prior knowledge of a 3D scene model, constructed offline with the structure-from-motion (SfM) technique, is exploited. From live input video, the proposed method continuously computes the 6-DoF camera pose by efficiently tracking natural features and matching them to the 3D points reconstructed by SfM.
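To make the matching-to-pose step concrete, the sketch below recovers a 6-DoF camera pose from 2D-3D correspondences with PnP and RANSAC using OpenCV. It is a minimal illustration of the general technique rather than the localization pipeline described in the thesis; the function name, the input arrays of matched SfM landmarks and tracked pixels, and the threshold values are assumptions made for the example.

    import numpy as np
    import cv2

    def estimate_pose_from_matches(points_3d, points_2d, K):
        """Recover a 6-DoF camera pose from matched 3D map points and 2D keypoints.

        points_3d : (N, 3) array of SfM landmark positions (world frame)
        points_2d : (N, 2) array of the corresponding pixel locations
        K         : (3, 3) camera intrinsic matrix
        """
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            points_3d.astype(np.float32), points_2d.astype(np.float32),
            K.astype(np.float64), np.zeros(4),
            iterationsCount=100, reprojectionError=3.0)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)      # world-to-camera rotation matrix
        t = tvec.reshape(3)             # world-to-camera translation
        camera_position = -R.T @ t      # camera center expressed in the world frame
        return R, t, camera_position, inliers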

Secondly, visual simultaneous localization and mapping (SLAM), in which a camera is localized within an unknown scene while a map is built, is considered. The proposed method is applied to three types of visual sensors: monocular, stereo, and RGB-D cameras. It continuously computes the current 6-DoF camera pose and 3D landmark positions from input video, and it successfully builds consistent maps from indoor and outdoor sequences using a camera as the only sensor. Large-scale loop closing is demonstrated based on a relative metric-topological representation of the pose graph, which effectively corrects pose drift. The method runs on an onboard computer in real time and is used to control the position of a quadrotor MAV without an external positioning sensor.
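As a toy illustration of how a loop closure can correct accumulated drift over a chain of keyframes, the sketch below integrates relative 2D pose edges and then spreads an observed loop-closure residual linearly along the chain. This is only a simplified 2D stand-in for the idea; the thesis performs full optimization over a relative metric-topological pose graph, and the pose parameterization, function names, and linear error distribution here are assumptions made for the example.

    import numpy as np

    def integrate_chain(relative_poses):
        """Accumulate relative SE(2) edges (dx, dy, dtheta) into absolute poses."""
        poses = [np.zeros(3)]
        for dx, dy, dth in relative_poses:
            x, y, th = poses[-1]
            poses.append(np.array([x + np.cos(th) * dx - np.sin(th) * dy,
                                   y + np.sin(th) * dx + np.cos(th) * dy,
                                   th + dth]))
        return poses

    def distribute_loop_error(poses, loop_residual):
        """Spread a loop-closure residual linearly over the keyframe chain.

        poses         : absolute keyframe poses from integrate_chain()
        loop_residual : (ex, ey, etheta) difference between the loop-closure
                        measurement of the last keyframe and its drifted estimate
        """
        n = len(poses) - 1
        residual = np.asarray(loop_residual, dtype=float)
        # The first keyframe stays fixed; the last one absorbs the full correction.
        return [p + (i / n) * residual for i, p in enumerate(poses)]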

Thirdly, an optical flow-based velocity estimation method is proposed. This method utilizes optical flow to estimate the translational velocity of a MAV for controlling its translational motion. Autonomous hovering flight control of a MAV using an optical flow sensor is implemented on a low-cost microprocessor without external positioning sensors. Experimental results from flight tests are validated against ground-truth data provided by a high-accuracy motion capture system. I believe this work brings MAVs a step closer to autonomous operation in many useful areas, such as indoors, where global positioning is not available.
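The velocity estimation step amounts to removing the rotation-induced part of the measured flow and scaling the remainder by altitude. The sketch below shows this computation for a downward-looking camera over a flat floor; it is a simplified example rather than the thesis implementation, and the function name, argument layout, and sign conventions (which depend on how the camera and gyro axes are mounted) are assumptions.

    def metric_velocity_from_flow(flow_px, gyro_rate, focal_px, altitude_m):
        """Approximate metric velocity from optical flow for a downward-looking camera.

        flow_px    : (flow_x, flow_y) measured image flow in pixels per second
        gyro_rate  : (rate_x, rate_y) angular rates in rad/s about the matching image axes
        focal_px   : camera focal length in pixels
        altitude_m : height above the (assumed planar) ground in meters
        """
        # Flow induced purely by rotation is roughly focal length times the angular
        # rate (small-angle approximation); subtract it to keep the translational part.
        trans_flow_x = flow_px[0] - focal_px * gyro_rate[0]
        trans_flow_y = flow_px[1] - focal_px * gyro_rate[1]
        # Scale the remaining pixel flow to meters per second using the altitude.
        vx = trans_flow_x * altitude_m / focal_px
        vy = trans_flow_y * altitude_m / focal_px
        return vx, vy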
dc.description.tableofcontents:
Abstract i
Acknowledgements iii
Chapter 1 Introduction 1
1.1 Towards Visual Navigation of MAVs 3
1.1.1 Image-based Localization 4
1.1.2 Visual SLAM 5
1.1.3 Optical Flow-based Navigation 5
1.2 Overview 6
1.3 Contributions 8
Chapter 2 Related Work 13
2.1 Image-based Localization 13
2.2 Simultaneous Localization and Mapping (SLAM) 15
2.2.1 Loop Detection and Closing 17
2.2.2 Keyframe Selection 18
2.2.3 Scene Representation 18
2.3 Optical Flow 19
Chapter 3 Image-based Localization 21
3.1 Key Elements of the Proposed Approach 23
3.1.1 Scene Representation 24
3.1.2 Multi-scale Features 25
3.1.3 Place Recognition 26
3.1.4 Global Matching 27
3.1.5 Guided Matching 28
3.1.6 Offline Preprocessing 28
3.2 Real-time Localization 29
3.2.1 Keypoint Tracking 29
3.2.2 Distributing Matching Computation 30
3.2.3 Pose Estimation and Filtering 30
3.3 Application: Semantic Localization 32
3.4 Experimental Results 33
3.4.1 Experimental Setup 34
3.4.2 Experiments on Vicon-Lab Sequence 37
3.4.3 Experiments on Indoor Datasets 38
3.4.4 Experiments on Outdoor Datasets 39
3.4.5 Timings 40
3.4.6 Failure Cases 41
3.5 Discussion 41
Chapter 4 Visual SLAM 49
4.1 Introduction 49
4.2 Problem Description 51
4.3 Contributions Overview 52
4.4 Problem Formulation 54
4.4.1 Metric-topological Map Representation 55
4.4.2 Graph Representation of Topological Map 56
4.4.3 Metric Embedding of Keyframes and Landmarks 57
4.4.4 Optimization of Pose Graph 58
4.5 Proposed Method 59
4.5.1 Fisher Information Matrix for Uncertainty Measure 59
4.5.2 Keyframe Selection Scheme 61
4.5.3 Multi-level Loop Closing 63
4.6 Visual SLAM System 64
4.6.1 Monocular Visual SLAM Pipeline 64
4.6.2 RGB-D and Stereo Visual SLAM Pipeline 67
4.6.3 Keypoint Extraction and Tracking 68
4.6.4 Pose Estimation 69
4.6.5 Loop Closing 70
4.7 Experimental Result: Monocular Visual SLAM 71
4.7.1 Datasets Description 71
4.7.2 Comparison with Ground Truth Data 72
4.8 Experimental Result: RGB-D SLAM 75
4.8.1 Benchmarking Result 75
4.8.2 Keyframe Selection Scheme Comparison 80
4.9 Discussion 80
Chapter 5 Vision-based Control of MAVs 87
5.1 Experimental System 87
5.2 Experimental Results 89
Chapter 6 Optical Flow-based Visual Navigation 95
6.1 Optical Flow Formulation 96
6.1.1 General Optical Flow Model and Feasibility Analysis 96
6.1.2 Optical Flow on Planar Surface 98
6.1.3 Subtraction of Rotational Component of Optical Flow 99
6.1.4 Altitude Estimation 100
6.1.5 Conversion to Metric Unit 101
6.2 Controller Design 101
6.2.1 Dynamic Model of a Quadrotor 102
6.2.2 Controller 102
6.3 Experimental Setup 103
6.3.1 A Quadrotor Platform 104
6.3.2 Optical Flow Sensor 104
6.3.3 Flight Control Hardware 105
6.4 Experimental Results 105
6.4.1 Experiment Setup 106
6.4.2 Experiment Environment 106
6.4.3 Ground-truth Evaluation 106
6.5 Discussion 107
Chapter 7 Conclusion 111
Chapter A Publications 113
A.1 Journals 113
A.2 Conferences 114
A.3 Workshops 115
Chapter B Multimedia Extensions
초록 (Abstract in Korean)
dc.format: application/pdf
dc.format.extent: 86556115 bytes
dc.format.medium: application/pdf
dc.language.iso: en
dc.publisher: 서울대학교 대학원
dc.subject: Visual Navigation
dc.subject: Micro Aerial Vehicles
dc.subject: Online 3D reconstruction
dc.subject: Structure from Motion
dc.subject: Image-based Localization
dc.subject: Simultaneous Localization and Mapping
dc.subject: Vision-based Control
dc.subject.ddc: 621
dc.title: Vision-Based Autonomous Navigation for Micro Aerial Vehicles
dc.title.alternative: 소형 무인비행체의 영상기반 자동항법
dc.type: Thesis
dc.contributor.AlternativeAuthor: Hyon Lim
dc.description.degree: Doctor
dc.citation.pages: 143
dc.contributor.affiliation: 공과대학 기계항공공학부
dc.date.awarded: 2015-02
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
