
A Comprehensive Overhaul of Feature Distillation

Cited 221 times in Web of Science; cited 321 times in Scopus
Authors

Heo, Byeongho; Kim, Jeesoo; Yun, Sangdoo; Park, Hyojin; Kwak, Nojun; Choi, Jin Young

Issue Date
2019-02
Publisher
IEEE COMPUTER SOC
Citation
2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), pp. 1921-1930
Abstract
We investigate the design aspects of feature distillation methods for network compression and propose a novel feature distillation method whose distillation loss is designed to create a synergy among several design aspects: the teacher transform, the student transform, the distillation feature position, and the distance function. The proposed distillation loss includes a feature transform with a newly designed margin ReLU, a new distillation feature position, and a partial L2 distance function that skips redundant information which adversely affects the compression of the student. On ImageNet, the proposed method achieves 21.65% top-1 error with ResNet50, outperforming the teacher network, ResNet152. The method is evaluated on various tasks such as image classification, object detection, and semantic segmentation, and achieves significant performance improvements on all of them.
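
The abstract names two concrete loss components: a margin ReLU applied as the teacher transform, and a partial L2 distance that skips uninformative positions. Below is a minimal PyTorch sketch of how these two pieces can be realized; the names MarginReLU and partial_l2_loss are illustrative, and in the paper the per-channel margins are estimated from the teacher's batch-normalization statistics rather than supplied by hand as done here.

    import torch
    import torch.nn as nn

    class MarginReLU(nn.Module):
        # sigma_m(x) = max(x, m): a ReLU whose floor is a per-channel
        # negative margin m instead of zero, applied to the teacher feature.
        def __init__(self, margin):
            super().__init__()
            # margin: tensor of shape (C,) holding negative values.
            # Assumption: passed in directly; the paper derives these
            # from the teacher's batch-norm statistics.
            self.register_buffer("margin", margin.view(1, -1, 1, 1))

        def forward(self, x):
            return torch.max(x, self.margin)

    def partial_l2_loss(teacher, student):
        # Partial L2 distance: positions where the teacher response is
        # non-positive and the student is already below it carry no
        # useful signal, so their squared error is skipped.
        skip = (teacher <= 0) & (student <= teacher)
        diff = (teacher - student) ** 2
        return torch.where(skip, torch.zeros_like(diff), diff).sum()

    # Illustrative use on pre-ReLU features of matching shape; in the
    # paper the student feature first passes through a 1x1 regressor.
    t = torch.randn(8, 64, 32, 32)   # teacher feature map
    s = torch.randn(8, 64, 32, 32)   # student feature map
    m = -0.1 * torch.ones(64)        # hypothetical per-channel margins
    loss = partial_l2_loss(MarginReLU(m)(t), s)
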
ISSN
1550-5499
URI
https://hdl.handle.net/10371/186968
DOI
https://doi.org/10.1109/ICCV.2019.00201