Energy-Efficient Inference Accelerator for Memory-Augmented Neural Networks on an FPGA

Cited 7 times in Web of Science; cited 8 times in Scopus
Authors

Park, Seongsik; Jang, Jaehee; Kim, Seijoon; Yoon, Sungroh

Issue Date
2019-05
Publisher
IEEE
Citation
2019 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE), pp.1587-1590
Abstract
Memory-augmented neural networks (MANNs) are designed for question-answering tasks. It is difficult to run a MANN effectively on accelerators designed for other neural networks (NNs), in particular on mobile devices, because MANNs require recurrent data paths and various types of operations related to external memory access. We implement an accelerator for MANNs on a field-programmable gate array (FPGA) based on a data flow architecture. Inference times are also reduced by inference thresholding, which is a data-based maximum inner-product search specialized for natural language tasks. Measurements on the bAbI data show that the energy efficiency of the accelerator (FLOPS/kJ) was higher than that of an NVIDIA TITAN V GPU by a factor of about 125, increasing to 140 with inference thresholding.
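The inference thresholding described in the abstract is a data-based maximum inner-product search (MIPS) that cuts inference time by stopping the search over memory slots early. The sketch below is a hypothetical illustration of that idea, not the paper's implementation: the function name, the sequential scan order, and the fixed `threshold` parameter are all assumptions introduced here for clarity.

```python
import numpy as np

def mips_with_threshold(query, memory, threshold):
    """Hypothetical sketch of thresholded maximum inner-product search.

    Scans memory slots in order, tracking the best inner product seen so
    far, and exits early once a score reaches `threshold`, skipping the
    remaining inner products. (Illustrative only; the paper's method is
    a data-based variant specialized for natural language tasks.)
    """
    best_idx, best_score = -1, float("-inf")
    for i, slot in enumerate(memory):
        score = float(np.dot(query, slot))  # inner product with one memory slot
        if score > best_score:
            best_idx, best_score = i, score
        if best_score >= threshold:
            break  # confident enough: skip remaining slots
    return best_idx, best_score
```

With a low threshold the scan terminates early and saves work; with `threshold = float("inf")` it degenerates to an exhaustive argmax over all slots, so the threshold trades a small accuracy risk for fewer memory-access operations.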
ISSN
1530-1591
URI
https://hdl.handle.net/10371/186441
DOI
https://doi.org/10.23919/DATE.2019.8715013
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.
