Publications

MUTE: Inter-class Ambiguity Driven Multi-hot Target Encoding for Deep Neural Network Design

Cited 0 times in Web of Science · Cited 2 times in Scopus
Authors

Jaiswal, Mayoore S.; Kang, Bumsoo; Lee, Jinho; Cho, Minsik

Issue Date
2020
Publisher
IEEE COMPUTER SOC
Citation
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), pp.3254-3263
Abstract
Target encoding is an effective technique for boosting the performance of both classical and deep neural network-based classification models. However, existing target encoding approaches require a significant increase in learning capacity, and thus demand more computation power and more training data. In this paper, we present a novel and efficient target encoding method, Inter-class Ambiguity Driven Multi-hot Target Encoding (MUTE), which improves both the generalizability and the robustness of a classification model by exploiting the inter-class characteristics of a target dataset. By evaluating the ambiguity between the target classes in a dataset, MUTE strategically optimizes the Hamming distances among the target encodings. Such optimized target encodings offer higher classification strength for neural network models with negligible computation overhead and without increasing the model size. When MUTE is applied to popular image classification networks and datasets, our experimental results show that it offers better generalization and better defense against noise and adversarial attacks than existing solutions.
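The core idea of separating class codewords by Hamming distance can be sketched in a few lines. Note that this is an illustrative simplification, not the paper's algorithm: MUTE drives the pairwise distances from measured inter-class ambiguity, whereas the sketch below simply assigns fixed-weight multi-hot codewords greedily so that the minimum pairwise Hamming distance stays large. All function names and parameters here are hypothetical.

```python
import itertools

def hamming(a, b):
    """Hamming distance between two equal-length binary tuples."""
    return sum(x != y for x, y in zip(a, b))

def greedy_multihot_codes(num_classes, code_len, num_hot):
    """Pick one multi-hot codeword per class, greedily maximizing the
    minimum Hamming distance to the codewords chosen so far."""
    # Enumerate all length-`code_len` binary codes with exactly `num_hot` ones.
    candidates = []
    for ones in itertools.combinations(range(code_len), num_hot):
        code = [0] * code_len
        for i in ones:
            code[i] = 1
        candidates.append(tuple(code))
    chosen = [candidates[0]]  # arbitrary seed codeword
    while len(chosen) < num_classes:
        best = max(
            (c for c in candidates if c not in chosen),
            key=lambda c: min(hamming(c, s) for s in chosen),
        )
        chosen.append(best)
    return chosen

# Example: 4 classes, 6-bit codewords, 3 active bits each.
codes = greedy_multihot_codes(num_classes=4, code_len=6, num_hot=3)
```

Training a classifier against such codewords (e.g., with a per-bit sigmoid loss instead of softmax over one-hot targets) is what gives multi-hot encodings their error-correcting slack: a few flipped output bits still decode to the nearest codeword.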
ISSN
2160-7508
URI
https://hdl.handle.net/10371/200509
DOI
https://doi.org/10.1109/CVPRW50498.2020.00385
Related Researcher

  • College of Engineering
  • Department of Electrical and Computer Engineering
Research Area: AI Accelerators, Distributed Deep Learning, Neural Architecture Search

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.