A Comprehensive Overhaul of Feature Distillation
Cited 287 times in Web of Science
Cited 389 times in Scopus
- Authors
- Issue Date
- 2019-10
- Publisher
- IEEE Computer Society
- Citation
- 2019 IEEE/CVF International Conference on Computer Vision (ICCV 2019), Vol. 2019-October, pp. 1921-1930
- Abstract
- We investigate the design aspects of feature distillation methods for network compression and propose a novel feature distillation method whose loss is designed to create synergy among several design choices: the teacher transform, the student transform, the distillation feature position, and the distance function. Our proposed distillation loss includes a feature transform with a newly designed margin ReLU, a new distillation feature position, and a partial L2 distance function that skips redundant information which adversely affects the compression of the student. On ImageNet, our proposed method achieves a top-1 error of 21.65% with ResNet50, outperforming its teacher network, ResNet152. Our proposed method is evaluated on various tasks such as image classification, object detection, and semantic segmentation, and achieves significant performance improvements in all of them.
- ISSN
- 1550-5499
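
The abstract names two concrete components: a margin-ReLU teacher transform and a partial L2 distance that skips uninformative positions. As a rough illustration of how these could be combined, here is a minimal PyTorch sketch. The fixed scalar `margin`, the elementwise skip condition, and the `mean` reduction are simplifying assumptions for illustration, not the authors' released implementation (the paper's margins are derived per channel rather than fixed).

```python
import torch


def overhaul_distill_loss(t_feat, s_feat, margin=-1.0):
    """Sketch of a distillation loss combining a margin ReLU on the
    teacher feature with a partial L2 distance, per the abstract."""
    # Teacher transform: margin ReLU, max(x, m) with a negative margin m,
    # so positive responses pass through and negatives are bounded below.
    # (A single scalar margin is an assumption made here for simplicity.)
    t = torch.clamp(t_feat, min=margin)

    # Partial L2 distance: positions where the transformed teacher value
    # is non-positive and the student is already below it carry no useful
    # target, so their contribution to the loss is skipped.
    per_elem = torch.where(
        (s_feat <= t) & (t <= 0.0),
        torch.zeros_like(t),
        (t - s_feat) ** 2,
    )
    return per_elem.mean()


# Usage on hypothetical feature maps (shapes chosen for illustration):
t_feat = torch.randn(8, 256, 14, 14)  # teacher feature, pre-ReLU
s_feat = torch.randn(8, 256, 14, 14)  # student feature at the matched position
loss = overhaul_distill_loss(t_feat, s_feat)
```
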
- Appears in Collections:
- Graduate School of Convergence Science & Technology
- Department of Intelligence and Information