
Batch Normalization Tells You Which Filter is Important

Cited 0 times in Web of Science; cited 4 times in Scopus
Authors

Oh, Junghun; Kim, Heewon; Baik, Sungyong; Hong, Cheeun; Lee, Kyoung Mu

Issue Date
2022-01
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
Proceedings - 2022 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2022, pp.3351-3360
Abstract
© 2022 IEEE. The goal of filter pruning is to find and remove unimportant filters so that convolutional neural networks (CNNs) become more efficient without sacrificing performance. The challenge lies in finding information that indicates how important each filter is to the final output of the network. In this work, we share our observation that the batch normalization (BN) parameters of pre-trained CNNs can be used to estimate the feature distribution of activation outputs, without processing any training data. Based on this observation, we propose a simple yet effective filter pruning method that evaluates the importance of each filter from the BN parameters of a pre-trained CNN. Experimental results on CIFAR-10 and ImageNet demonstrate that the proposed method achieves outstanding performance, both with and without fine-tuning, in terms of the trade-off between accuracy drop and the reduction in computational complexity and parameter count of the pruned networks.
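The abstract's key idea — estimating a filter's output distribution from BN parameters alone — can be sketched as follows. Assuming a channel's post-BN activations are roughly Gaussian with mean β and standard deviation |γ|, and that a ReLU follows, the expected post-ReLU magnitude has a closed form that needs no training data. This is a minimal illustrative sketch under those assumptions; the function names and the exact scoring criterion are hypothetical and may differ from the paper's method.

```python
import math

def expected_relu(beta: float, gamma: float) -> float:
    """Expected ReLU output for x ~ N(beta, gamma^2).

    Closed form: sigma * phi(z) + beta * Phi(z), where z = beta / sigma,
    phi is the standard normal pdf and Phi its cdf.
    """
    sigma = abs(gamma) + 1e-12  # guard against gamma == 0
    z = beta / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma * pdf + beta * cdf

def bn_filter_importance(gammas, betas):
    """Score each filter using only its BN scale (gamma) and shift (beta).

    Channels whose expected post-ReLU activation is near zero contribute
    little to the next layer and are candidates for pruning.
    (Illustrative criterion, not necessarily the paper's exact one.)
    """
    return [expected_relu(b, g) for g, b in zip(gammas, betas)]

# Rank four hypothetical channels of a pre-trained layer; a channel with
# a large negative beta and tiny gamma almost never fires, so it ranks lowest.
scores = bn_filter_importance(gammas=[0.9, 0.05, 0.7, 0.01],
                              betas=[0.2, -0.5, 0.0, -1.0])
prune_order = sorted(range(len(scores)), key=scores.__getitem__)  # least important first
```

Because the score depends only on the stored BN parameters, the whole ranking can be computed from a checkpoint without a single forward pass, which is what makes the data-free setting in the abstract possible.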
URI
https://hdl.handle.net/10371/184016
DOI
https://doi.org/10.1109/WACV51458.2022.00341
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.