Publications


Paraphrasing Complex Network: Network Compression via Factor Transfer

Cited 161 times in Web of Science · Cited 298 times in Scopus
Authors

Kim, Jangho; Park, SeongUk; Kwak, Nojun

Issue Date
2018
Publisher
NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
Citation
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), Vol.31
Abstract
Many researchers have studied model compression to reduce the size of a deep neural network (DNN) with minimal performance degradation so that DNNs can be deployed in embedded systems. Among model compression methods, knowledge transfer trains a student network under the guidance of a stronger teacher network. In this paper, we propose a novel knowledge transfer method that uses convolutional operations to paraphrase the teacher's knowledge and to translate it for the student. This is done by two convolutional modules called a paraphraser and a translator. The paraphraser is trained in an unsupervised manner to extract teacher factors, which are defined as paraphrased information of the teacher network. The translator, located at the student network, extracts student factors and helps translate the teacher factors by mimicking them. We observed that a student network trained with the proposed factor transfer method outperforms those trained with conventional knowledge transfer methods.
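
The paraphraser–translator pipeline described in the abstract can be sketched compactly. Below is a minimal illustration assuming PyTorch; the module depths, channel widths, the paraphrase rate k, and the names Paraphraser, Translator, and factor_transfer_loss are illustrative assumptions, not the authors' released implementation.

    # Minimal sketch of factor transfer, assuming PyTorch.
    # All sizes and names below are illustrative, not the paper's exact code.
    import torch.nn as nn
    import torch.nn.functional as F

    class Paraphraser(nn.Module):
        # Compresses teacher feature maps into "teacher factors".
        # Trained unsupervised via the reconstruction it also returns.
        def __init__(self, in_ch, k=0.5):  # k: assumed paraphrase rate
            super().__init__()
            mid = int(in_ch * k)
            self.encoder = nn.Sequential(
                nn.Conv2d(in_ch, mid, 3, padding=1), nn.LeakyReLU(0.1),
                nn.Conv2d(mid, mid, 3, padding=1), nn.LeakyReLU(0.1))
            self.decoder = nn.Sequential(
                nn.Conv2d(mid, in_ch, 3, padding=1), nn.LeakyReLU(0.1))

        def forward(self, x):
            factor = self.encoder(x)
            return factor, self.decoder(factor)

    class Translator(nn.Module):
        # Maps student feature maps to "student factors" with the same
        # channel count as the teacher factors so they can be compared.
        def __init__(self, in_ch, factor_ch):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_ch, factor_ch, 3, padding=1), nn.LeakyReLU(0.1),
                nn.Conv2d(factor_ch, factor_ch, 3, padding=1), nn.LeakyReLU(0.1))

        def forward(self, x):
            return self.net(x)

    def factor_transfer_loss(student_factor, teacher_factor, p=1):
        # p-norm distance between L2-normalized, flattened factors.
        fs = F.normalize(student_factor.flatten(1), dim=1)
        ft = F.normalize(teacher_factor.flatten(1), dim=1)
        return (fs - ft).norm(p=p, dim=1).mean()

One plausible training loop, following the abstract: first fit the paraphraser unsupervised (e.g., with a reconstruction loss against the teacher's feature maps), then freeze the teacher and paraphraser and train the student and translator jointly with the task loss plus a weighted factor_transfer_loss, so the student factors learn to mimic the teacher factors.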
ISSN
1049-5258
URI
https://hdl.handle.net/10371/206569
Appears in Collections:

  • Graduate School of Convergence Science & Technology
  • Department of Intelligence and Information

Related Researcher

Research Area: Feature Selection and Extraction, Object Detection, Object Recognition

