
An approximate memory architecture for a reduction of refresh power consumption in deep learning applications

Cited 50 times in Web of Science; cited 50 times in Scopus
Authors

Nguyen, Duy Thanh; Kim, Hyun; Lee, Hyuk-Jae; Chang, Ik-joon

Issue Date
2018-05
Publisher
IEEE
Citation
2018 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), p. 18228810
Abstract
A DRAM device requires periodic refresh operations to preserve data integrity, and these refreshes incur significant power consumption. This paper proposes a new memory architecture that reduces refresh power consumption by slowing down the refresh rate. A slow refresh may cause a loss of data stored in a DRAM cell, which affects the correctness of any computation that uses the lost data. The proposed memory architecture avoids this problem by exploiting the error-tolerant property of deep learning applications, which can withstand a small number of errors. For data storage in deep learning applications, the approximate DRAM architecture stores data in a transposed manner so that bits are sorted according to their significance, and the DRAM organization is modified to support control of the refresh period according to the significance of the stored data. Simulation results with GoogLeNet and VGG-16 show that power consumption is reduced by 69.68% with a negligible drop in classification accuracy for both networks.
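The transposed storage idea in the abstract can be illustrated with a short sketch. This is a hypothetical NumPy illustration, not the paper's implementation: 8-bit weights are split into bit planes so that all bits of equal significance are stored together, letting high-significance planes be refreshed normally while low-significance planes tolerate a slow refresh. The function names and the error model (flipping only the least-significant plane) are assumptions for illustration.

```python
import numpy as np

def to_bit_planes(weights: np.ndarray) -> np.ndarray:
    """Split an array of uint8 weights into 8 bit planes (index 7 = MSB).

    In the transposed layout, each plane could be mapped to a DRAM region
    with its own refresh period.
    """
    return np.stack([(weights >> b) & 1 for b in range(8)])

def from_bit_planes(planes: np.ndarray) -> np.ndarray:
    """Reassemble uint8 weights from their bit planes."""
    return sum(planes[b].astype(np.uint8) << b for b in range(8))

weights = np.array([200, 37, 129], dtype=np.uint8)
planes = to_bit_planes(weights)

# Simulate retention errors from slow refresh: corrupt only the
# least-significant plane. The reconstructed weights then deviate
# from the originals by at most 1, which a deep learning model can
# typically tolerate.
corrupted = planes.copy()
corrupted[0] ^= 1  # flip every bit in the LSB plane
restored = from_bit_planes(corrupted)
max_error = np.max(np.abs(restored.astype(int) - weights.astype(int)))
```

Corrupting the MSB plane instead would change weights by up to 128, which is why the architecture keeps a normal refresh rate for high-significance bits.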
ISSN
0271-4302
URI
https://hdl.handle.net/10371/186858
DOI
https://doi.org/10.1109/ISCAS.2018.8351021
Files in This Item:
There are no files associated with this item.

Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
