Publications

Detailed Information

Robust Feature Learning on Sequential Data (Korean title: 순차 데이터에 대한 견고한 특징 학습)

Cited 0 times in Web of Science · Cited 0 times in Scopus
Authors

권선영 (Sunyoung Kwon)

Advisor
윤성로 (Sungroh Yoon)
Major
Department of Electrical and Computer Engineering, College of Engineering
Issue Date
2018-08
Publisher
Graduate School, Seoul National University
Description
Ph.D. dissertation -- Graduate School, Seoul National University: Department of Electrical and Computer Engineering, College of Engineering, August 2018. Advisor: Sungroh Yoon.
Abstract
A sequence can be defined as a series of related things: a series of words in natural language, symbols in biological sequences, or a time series representing a stock market or biological signal. Drugs can also be expressed as a series of symbols representing chemical properties. Values in a sequence are not assumed to be independent and identically distributed (i.i.d.), but rather have local or long-distance dependencies and hidden patterns of variable length.

This dissertation proposes three methodologies for learning the hidden dependencies and patterns in sequential data. First, motivated by the demand for accurate, fast, and scalable classification of nucleotide sequences against a group of sequences, we propose a new generative classification method based on the variable-order Markov model and universal probability. Second, to learn chemical features automatically, without manual curation, for chemical-chemical interaction (CCI) prediction, we propose an end-to-end convolutional neural network (CNN) based on a Siamese architecture. Finally, to achieve robust results in quantitative structure-activity relationship (QSAR) prediction, we propose comprehensive ensemble learning that combines individual classifiers with CNNs and recurrent neural networks (RNNs).
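To make the generative classification idea concrete, the following is a minimal sketch, not the dissertation's actual variable-order Markov model with universal probability: a toy fixed-order Markov classifier that learns k-mer transition counts per class and assigns a query sequence to the class with the highest smoothed log-likelihood. The class name, training sequences, and labels are all hypothetical.

```python
from collections import defaultdict
import math

class MarkovClassifier:
    """Toy fixed-order Markov model for generative sequence classification.

    Simplified stand-in for a variable-order Markov model: each class
    accumulates context-to-symbol transition counts, and a query is
    assigned to the class under which it is most likely.
    """

    def __init__(self, order=2, alphabet="ACGT"):
        self.order = order
        self.alphabet = alphabet
        # counts[label][context][symbol] -> transition count
        self.counts = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))

    def fit(self, sequences, label):
        for seq in sequences:
            for i in range(self.order, len(seq)):
                ctx, sym = seq[i - self.order:i], seq[i]
                self.counts[label][ctx][sym] += 1

    def log_likelihood(self, seq, label):
        ll = 0.0
        for i in range(self.order, len(seq)):
            ctx, sym = seq[i - self.order:i], seq[i]
            ctx_counts = self.counts[label][ctx]
            total = sum(ctx_counts.values())
            # Laplace smoothing keeps unseen k-mers from zeroing the likelihood
            ll += math.log((ctx_counts[sym] + 1) / (total + len(self.alphabet)))
        return ll

    def classify(self, seq):
        return max(self.counts, key=lambda lbl: self.log_likelihood(seq, lbl))

clf = MarkovClassifier(order=2)
clf.fit(["ACGTACGTACGT"], "repeat")   # hypothetical training data
clf.fit(["AAAAAAAAAAAA"], "polyA")
print(clf.classify("ACGTACGT"))       # → repeat
```

A variable-order model generalizes this by letting the context length adapt to the data, which is one of the properties the dissertation's method exploits.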

In summary, this dissertation describes methodologies for solving diverse sequential problems. Information-theoretic approaches and neural-network approaches based on CNNs and RNNs were applied with care, and we confirmed their robust performance.
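The ensemble idea mentioned in the abstract can be illustrated with a minimal soft-voting sketch: average the class-probability vectors produced by several classifiers and predict the class with the highest averaged probability. The per-model probabilities below are hypothetical, and this is a generic ensemble recipe, not the dissertation's specific comprehensive-ensemble design.

```python
def soft_vote(prob_lists):
    """Average class-probability vectors from several classifiers (soft voting)."""
    n = len(prob_lists)
    return [sum(p[i] for p in prob_lists) / n for i in range(len(prob_lists[0]))]

# Hypothetical probabilities for classes [inactive, active] from three models
cnn_probs = [0.3, 0.7]
rnn_probs = [0.4, 0.6]
svm_probs = [0.6, 0.4]

avg = soft_vote([cnn_probs, rnn_probs, svm_probs])
# Predicted class = index of the highest averaged probability
pred = max(range(len(avg)), key=avg.__getitem__)
print(pred)  # → 1 (the two neural models outvote the third)
```

Combining heterogeneous models this way tends to reduce the variance of any single classifier, which is the usual motivation for ensembling in QSAR-style prediction tasks.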
Language
English
URI
https://hdl.handle.net/10371/143254

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
