Fast Adversarial Training with Dynamic Batch-level Attack Control

Authors

Jung, Jaewon; Song, Jaeyong; Jang, Hongsun; Lee, Hyeyoon; Choi, Kanghyun; Park, Noseong; Lee, Jinho

Issue Date
2023-07
Publisher
Proceedings - Design Automation Conference
Citation
Proceedings - Design Automation Conference
Abstract
Although adversarial training provides effective protection against adversarial attacks, it incurs a large computational overhead. To mitigate this overhead, we propose DBAC, a fast adversarial training method with dynamic batch-level attack control. Building on a prior finding that attack strength should grow gradually throughout training, we control the number of samples attacked per batch to improve throughput. Additionally, we collect samples from multiple batches into a pseudo-batch and attack them simultaneously for higher GPU utilization. We implement DBAC in PyTorch and show that it achieves superior throughput with robust accuracy similar to the prior art.
ISSN
0146-7123
URI
https://hdl.handle.net/10371/196106
DOI
https://doi.org/10.1109/DAC56929.2023.10247930
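Illustrative sketch
To make the idea in the abstract concrete, the following is a minimal PyTorch sketch of dynamic batch-level attack control as described there: only a fraction of each batch is attacked, that fraction grows with training progress, and the samples selected for attack are pooled across batches into a pseudo-batch before a single PGD attack runs on them. The linear schedule, the PGD hyper-parameters, and the helper names (attack_ratio, pgd_attack, train_epoch) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


def attack_ratio(epoch: int, total_epochs: int) -> float:
    # Assumed linear schedule: attack few samples early, all samples late.
    return min(1.0, (epoch + 1) / total_epochs)


def pgd_attack(model, x, y, eps=8 / 255, alpha=2 / 255, steps=7):
    # Standard L-infinity PGD, applied once to the pooled pseudo-batch.
    x_adv = (x + torch.empty_like(x).uniform_(-eps, eps)).clamp(0, 1).detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = torch.min(torch.max(x_adv, x - eps), x + eps).clamp(0, 1).detach()
    return x_adv


def train_epoch(model, loader, optimizer, epoch, total_epochs, pseudo_batch_size=256):
    device = next(model.parameters()).device
    ratio = attack_ratio(epoch, total_epochs)
    pool_x, pool_y = [], []  # samples waiting to be attacked as one pseudo-batch
    model.train()
    for x, y in loader:  # assumes a shuffled loader, so any slice is a random subset
        x, y = x.to(device), y.to(device)
        n_attack = int(ratio * x.size(0))
        # Train directly on the clean (unattacked) part of the batch.
        if n_attack < x.size(0):
            loss = F.cross_entropy(model(x[n_attack:]), y[n_attack:])
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        # Pool the samples selected for attack across batches.
        pool_x.append(x[:n_attack])
        pool_y.append(y[:n_attack])
        if sum(t.size(0) for t in pool_x) >= pseudo_batch_size:
            px, py = torch.cat(pool_x), torch.cat(pool_y)
            pool_x, pool_y = [], []
            x_adv = pgd_attack(model, px, py)
            loss = F.cross_entropy(model(x_adv), py)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    # Leftover pooled samples at epoch end are dropped here for brevity.

Pooling the to-be-attacked samples into a larger pseudo-batch keeps the GPU saturated during the multi-step attack even when only a few samples per batch are attacked early in training.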
Related Researcher

  • College of Engineering
  • Department of Electrical and Computer Engineering

Research Area: AI Accelerators, Distributed Deep Learning, Neural Architecture Search
