
Deep Bayesian Neural Networks for Continual Learning

Authors
손성호
Advisor
장병탁
Major
College of Engineering, Department of Computer Science and Engineering
Issue Date
2018-02
Publisher
Graduate School, Seoul National University
Keywords
deep learning; Bayesian neural network; continual learning; catastrophic forgetting; uncertainty estimation
Description
Thesis (Master's) -- Graduate School, Seoul National University: College of Engineering, Department of Computer Science and Engineering, February 2018. Advisor: 장병탁.
Abstract
In continual learning, tasks arrive in sequence and the goal is to learn new tasks while retaining performance on previously learned ones. Although deep neural networks have become prevalent in machine learning, they may fail in a continual learning environment: as the model learns new tasks, parameters that contribute heavily to previous tasks' performance may change, degrading that performance. This phenomenon is called catastrophic forgetting, and researchers have tackled it with regularizers and structure optimization techniques.

We aim to show that applying Bayesian modelling to deep neural networks is beneficial for continual learning. Not only does the Bayesian framework provide a systematic way of performing online learning, it also provides uncertainty estimates that measure each parameter's contribution to previously learned tasks' performance. By rescaling the gradients applied to parameters with large performance contributions, the model can retain performance on previous tasks longer. This thesis shows how deep Bayesian neural networks utilize model uncertainty to alleviate catastrophic forgetting in a continual learning environment.
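The gradient-rescaling idea in the abstract can be sketched in a few lines. This is a minimal illustration, not the thesis's actual method: it assumes a hypothetical mean-field Gaussian posterior where each weight has a learned mean and variance, and simply scales each gradient by the parameter's posterior variance, so that parameters the model is certain about (which presumably mattered for earlier tasks) move the least.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean-field Gaussian posterior over 5 parameters:
# mu holds the posterior means, sigma2 the posterior variances
# (uncertainty estimates) obtained after training on earlier tasks.
mu = rng.normal(size=5)
sigma2 = np.array([0.01, 0.5, 0.02, 1.0, 0.3])

def rescaled_update(mu, sigma2, grad, lr=0.1):
    """Scale each parameter's gradient by its posterior variance.

    Low-variance (certain) parameters contributed strongly to previous
    tasks, so they receive proportionally smaller updates; high-variance
    (uncertain) parameters remain free to adapt to the new task.
    """
    return mu - lr * sigma2 * grad

grad = np.ones(5)  # dummy gradient from a new task's loss
new_mu = rescaled_update(mu, sigma2, grad)

# Step sizes are proportional to the variances: the most certain
# parameter (index 0) moves least, the most uncertain (index 3) most.
step = np.abs(new_mu - mu)
```

The same principle underlies several continual-learning methods: the per-parameter importance signal here is the posterior variance, whereas other approaches derive it from, e.g., the Fisher information.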
Language
English
URI
https://hdl.handle.net/10371/141547
Appears in Collections:
College of Engineering/Engineering Practice School (공과대학/대학원) > Dept. of Computer Science and Engineering (컴퓨터공학부) > Theses (Master's Degree_컴퓨터공학부)

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
