DAQ: Channel-Wise Distribution-Aware Quantization for Deep Image Super-Resolution Networks

Cited 0 times in Web of Science; cited 17 times in Scopus
Authors

Hong, Cheeun; Kim, Heewon; Baik, Sungyong; Oh, Junghun; Lee, Kyoung Mu

Issue Date
2022-01
Publisher
Institute of Electrical and Electronics Engineers Inc.
Citation
Proceedings - 2022 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2022, pp.913-922
Abstract
© 2022 IEEE. Since the resurgence of deep neural networks (DNNs), image super-resolution (SR) has recently seen huge progress in improving the quality of low-resolution images, albeit at a great cost in computation and resources. Recently, there have been several efforts to make DNNs more efficient via quantization. However, since SR demands pixel-level accuracy, it is more difficult to perform quantization without significantly sacrificing SR performance. To this end, we introduce a new ultra-low-precision yet effective quantization approach specifically designed for SR. In particular, we observe that in recent SR networks, each channel has different distribution characteristics. Thus we propose a channel-wise distribution-aware quantization scheme. Experimental results demonstrate that our proposed quantization, dubbed Distribution-Aware Quantization (DAQ), manages to greatly reduce the computational and resource costs without a significant sacrifice in SR performance, compared to other quantization methods.
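The core idea in the abstract, that each channel is quantized relative to its own distribution, can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the paper's actual algorithm: it standardizes each channel by its own mean and standard deviation, applies uniform low-bit quantization on the shared standardized grid, then maps the values back. The function name, clipping range, and bit-width are assumptions for illustration only.

```python
import numpy as np

def channelwise_quantize_sketch(x, bits=2, eps=1e-8):
    """Hypothetical sketch of channel-wise distribution-aware quantization.

    x: feature map of shape (C, H, W). Each channel is normalized by its
    own mean and standard deviation before uniform quantization, so that
    channels with very different distributions can share one low-bit grid.
    """
    c = x.shape[0]
    flat = x.reshape(c, -1)
    mu = flat.mean(axis=1, keepdims=True)
    sigma = flat.std(axis=1, keepdims=True) + eps
    norm = (flat - mu) / sigma                     # per-channel standardization
    levels = 2 ** bits - 1
    clipped = np.clip(norm, -1.0, 1.0)             # assumed fixed clipping range
    codes = np.round((clipped + 1.0) / 2.0 * levels)  # integer codes 0..levels
    deq = codes / levels * 2.0 - 1.0               # dequantize to [-1, 1]
    return (deq * sigma + mu).reshape(x.shape)     # undo per-channel normalization
```

Because the normalization statistics are computed per channel, a channel with a wide value range and a channel with a narrow one both use the full 2^bits quantization levels, which is the motivation the abstract gives for the channel-wise scheme.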
URI
https://hdl.handle.net/10371/184017
DOI
https://doi.org/10.1109/WACV51458.2022.00099
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
