Riemannian Distortion Measures for Non-Euclidean Data
비유클리드 데이터를 위한 리만기하학적 뒤틀림 측도

dc.contributor.advisor: 박종우
dc.contributor.author: 장청재
dc.date.accessioned: 2019-10-21T01:50:13Z
dc.date.available: 2019-10-21T01:50:13Z
dc.date.issued: 2019-08
dc.identifier.other: 000000157565
dc.identifier.uri: https://hdl.handle.net/10371/161910
dc.identifier.uri: http://dcollection.snu.ac.kr/common/orgView/000000157565
dc.description: Thesis (Ph.D.) -- Seoul National University Graduate School: College of Engineering, Department of Mechanical and Aerospace Engineering, August 2019. Advisor: 박종우.
dc.description.abstract: A growing number of problems in machine learning involve data that is non-Euclidean. A naive application of existing learning algorithms to such data often produces results that depend on the choice of local coordinates used to parametrize the data. At the same time, many problems in machine learning eventually reduce to an optimization problem, in which the objective is to find a mapping from one curved space into another that best preserves distances and angles. We show that these and other problems can be naturally formulated as the minimization of a coordinate-invariant functional that measures the proximity to an isometry of a mapping between two Riemannian manifolds. We first show how to construct general coordinate-invariant functionals of mappings between Riemannian manifolds, and propose a family of functionals that measures how close a mapping is to being an isometry. We then formulate coordinate-invariant distortion measures for manifold learning of non-Euclidean data, and derive gradient-based optimization algorithms that accompany these measures. We also address the problem of autoencoder training for non-Euclidean data using our Riemannian geometric perspective. Both manifold learning and autoencoder case studies involving non-Euclidean datasets illustrate both the underlying geometric intuition and performance advantages of our Riemannian distortion minimization framework.
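As a rough numerical illustration of the idea in the abstract (a sketch, not the thesis's actual functional), the snippet below measures how far the standard spherical-coordinate chart is from an isometry, in the spirit of the map-making example of Section 1.2.1: the chart's pullback of the flat plane metric is compared against the round-sphere metric, and the squared deviation is integrated with the Riemannian volume element. The functional form (squared Frobenius norm of the metric mismatch) is an assumed stand-in for the distortion measures developed in the thesis.

```python
import numpy as np

def isometry_defect(G, pullback):
    """Squared Frobenius deviation of the pullback metric from the metric G.

    Zero everywhere would mean the mapping is an isometry.
    """
    return float(np.sum((pullback - G) ** 2))

# Chart: (theta, phi) on the unit sphere mapped identically to the plane.
# Sphere metric in these coordinates: diag(1, sin(theta)^2); the plane is
# Euclidean, so the chart pulls its metric back to the identity matrix.
thetas = np.linspace(1e-3, np.pi - 1e-3, 400)   # avoid the coordinate poles
density = []
for th in thetas:
    G = np.diag([1.0, np.sin(th) ** 2])          # round-sphere metric at theta
    pullback = np.eye(2)                         # pullback of the flat metric
    # weight the local defect by the volume element sqrt(det G) = sin(theta)
    density.append(isometry_defect(G, pullback) * np.sin(th))

dtheta = thetas[1] - thetas[0]
total = 2 * np.pi * sum(density) * dtheta        # phi-integral is trivial
print(f"accumulated distortion: {total:.3f}")    # analytic value: 4*pi/5
```

The nonzero total recovers the familiar fact that no flat chart of the sphere preserves all distances and angles, which is what a distortion-minimizing embedding would trade off.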
dc.description.tableofcontents:
1 Introduction 1
1.1 Minimizing Distortion of Mappings in Machine Learning 1
1.2 A Differential Geometric View 11
1.2.1 Example: Map-Making 11
1.2.2 Non-Euclidean Data 15
1.3 Related Works 20
1.4 Contributions of This Thesis 23
1.4.1 Riemannian Distortion 23
1.4.2 Manifold Learning for Euclidean and Non-Euclidean Data 24
1.4.3 Autoencoders for Non-Euclidean Data 25
1.5 Organization 26
2 Riemannian Distortion 27
2.1 Mathematical Background 28
2.1.1 Calculus on the Sphere 28
2.1.2 Calculus on Riemannian Manifolds 31
2.1.3 Isometry 34
2.2 Constructing Coordinate-Invariant Functionals on Riemannian Manifolds 36
2.3 Riemannian Distortion and Isometry 38
2.4 Riemannian Distortion and Machine Learning 45
3 Manifold Learning for Euclidean and Non-Euclidean Data 47
3.1 Kernel-Based Estimation of JG^{−1}J^T 48
3.2 Gradient-Based Optimization 54
3.3 Harmonic Mapping Distortion 56
3.4 A Taxonomy of Manifold Learning Algorithms 62
3.4.1 Locally Linear Embedding 62
3.4.2 Laplacian Eigenmap 64
3.4.3 Diffusion Map 65
3.4.4 Riemannian Relaxation 67
3.4.5 A Taxonomy of Manifold Learning Algorithms 68
4 Manifold Learning Case Studies 73
4.1 Case Studies for Euclidean Data 73
4.1.1 Swiss Roll and Quarter Sphere 73
4.1.2 A Detailed Comparison Between LLE, LE, DM, and HM 82
4.1.3 Face Dataset 86
4.2 Case Studies for Non-Euclidean Data 89
4.2.1 Synthetic P(2) Data 89
4.2.2 Human Mass-Inertia Data 102
5 Autoencoders for Non-Euclidean Data 111
5.1 Autoencoders for Non-Euclidean Data 113
5.2 Case Study: Hyperspherical Data 115
5.2.1 Coordinate Invariance of Geometric Denoising Autoencoder 116
5.2.2 Estimation of the Derivative of the Log-Probability Function 120
5.3 Case Study: Diffusion Tensor Imaging (DTI) Data 124
5.3.1 DTI as Non-Euclidean Data 124
5.3.2 DTI Filtering Algorithms 127
5.3.3 DTI Filtering Experiments 134
6 Conclusion 139
A Appendix 141
A.1 Approximation of the Laplace-Beltrami Operator on a Submanifold Embedded in a Riemannian Ambient Space 141
A.2 Proof of Proposition 3.2 149
A.3 Approximations for the Laplacian Eigenmap and Diffusion Map Methods 152
A.3.1 Proof of Proposition 3.3 152
A.3.2 Proof of Proposition 3.4 154
A.4 Proof of Theorem 5.1 158
A.4.1 First-order Necessary Conditions for GRCAE 158
A.4.2 First-order Necessary Conditions for GDAE 160
A.5 Tangent Space Gaussians on S^n 162
A.6 Closed-Form Formulas of Matrix Exponential, Logarithm, and Their Jacobians for Symmetric Matrices 164
Bibliography 166
Abstract 179
dc.language.iso: eng
dc.publisher: Seoul National University Graduate School (서울대학교 대학원)
dc.subject: Manifold Learning
dc.subject: Non-Euclidean Data
dc.subject: Riemannian Geometry
dc.subject: Distortion
dc.subject: Harmonic Map
dc.subject: Autoencoder
dc.subject.ddc: 621
dc.title: Riemannian Distortion Measures for Non-Euclidean Data
dc.title.alternative: 비유클리드 데이터를 위한 리만기하학적 뒤틀림 측도
dc.type: Thesis
dc.type: Dissertation
dc.contributor.AlternativeAuthor: Jang, Cheongjae
dc.contributor.department: College of Engineering, Department of Mechanical and Aerospace Engineering (공과대학 기계항공공학부)
dc.description.degree: Doctor
dc.date.awarded: 2019-08
dc.contributor.major: Mechanical Engineering (기계전공)
dc.identifier.uci: I804:11032-000000157565
dc.identifier.holdings: 000000000040▲000000000041▲000000157565▲
Appears in Collections:
College of Engineering/Engineering Practice School (공과대학/대학원) > Dept. of Mechanical Aerospace Engineering (기계항공공학부) > Theses (Ph.D. / Sc.D., 기계항공공학부)

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.