EFFICIENT TRANSFER LEARNING SCHEMES FOR PERSONALIZED LANGUAGE MODELING USING RECURRENT NEURAL NETWORK (Korean title: A Study on Training Methods for Personalized Language Models Using a Recurrent Neural Network)
- Authors
- Advisor
- 정교민
- Major
- College of Engineering, Department of Electrical and Computer Engineering
- Issue Date
- 2017-02
- Publisher
- Graduate School, Seoul National University
- Description
- Thesis (Master's) -- Graduate School, Seoul National University : Department of Electrical and Computer Engineering, February 2017. Advisor: 정교민.
- Abstract
- In this thesis, we propose efficient transfer learning methods for training a personalized language model using a recurrent neural network with a long short-term memory (LSTM) architecture. With the proposed fast transfer learning schemes, a general language model is adapted into a personalized language model using only a small amount of user data and limited computing resources. These methods are especially useful in mobile-device environments, where user data must remain on the device for privacy reasons. Experiments on dialogue data from a TV drama verify that the proposed transfer learning methods successfully produce personalized language models.
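- The core idea described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis code: a plain numpy RNN stands in for the LSTM, the "general" model weights are random placeholders, and the transfer scheme shown is one of the simplest possible (freeze the recurrent weights, fine-tune only the output layer on a tiny "user corpus"). All names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, H = 12, 16  # toy vocabulary size and hidden size (illustrative)

# "Pre-trained" general language model parameters (random here for illustration).
Wxh = rng.normal(0, 0.1, (H, V))  # input -> hidden (kept frozen)
Whh = rng.normal(0, 0.1, (H, H))  # hidden -> hidden (kept frozen)
Why = rng.normal(0, 0.1, (V, H))  # hidden -> output (the only fine-tuned part)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def run(seq, Why):
    """Forward pass over a token sequence; returns hidden states and
    next-token distributions at each step."""
    h = np.zeros(H)
    hs, ps = [], []
    for t in seq[:-1]:
        x = np.eye(V)[t]
        h = np.tanh(Wxh @ x + Whh @ h)  # frozen recurrent dynamics
        hs.append(h.copy())
        ps.append(softmax(Why @ h))
    return hs, ps

def loss(seq, Why):
    """Average next-token cross-entropy on the sequence."""
    _, ps = run(seq, Why)
    return -sum(np.log(p[t]) for p, t in zip(ps, seq[1:])) / len(ps)

# Tiny "user corpus": a short repetitive token pattern.
user_seq = [1, 2, 3, 1, 2, 3, 1, 2, 3]

# Fine-tune only the output layer with plain gradient descent.
lr = 0.5
before = loss(user_seq, Why)
for _ in range(100):
    hs, ps = run(user_seq, Why)
    grad = np.zeros_like(Why)
    for h, p, tgt in zip(hs, ps, user_seq[1:]):
        d = p.copy()
        d[tgt] -= 1.0            # dL/dlogits for softmax cross-entropy
        grad += np.outer(d, h)
    Why -= lr * grad / len(hs)
after = loss(user_seq, Why)
print(after < before)  # fine-tuning should reduce loss on the user data
```

- Because only the output layer is updated, the per-step cost and the number of trainable parameters are small, which mirrors the abstract's constraint of adapting on-device with limited data and compute; the user data never has to leave the device.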
- Language
- English
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.