Publications

Detailed Information

Improving Speed of MUX-FSM-based Stochastic Computing for On-device Neural Networks

Cited 0 times in Web of Science; cited 0 times in Scopus
Authors

Kang, Jongsung; Kim, Taewhan

Issue Date
2022-06
Publisher
Korean Institute of Information Scientists and Engineers
Citation
Journal of Computing Science and Engineering, Vol.16 No.2, pp.79-87
Abstract
© 2022. The Korean Institute of Information Scientists and Engineers.

We propose an acceleration technique for processing multiplication operations using stochastic computing (SC) in on-device neural networks. Recently, multiplexer-driven finite state machine (MUX-FSM)-based SC, which employs a MUX controlled by an FSM to generate a (repeated but short) bit sequence of a binary number to count up for a multiplication operation, has considerably reduced the processing time of MAC operations over the traditional stochastic number generator (SNG)-based SC. Nevertheless, even though it offers a very economical hardware implementation, the existing MUX-FSM-based SC still does not meet the multiplication processing time required for the wide practical adoption of on-device neural networks. In this respect, this work proposes a solution that speeds up the conventional MUX-FSM-based SC. Precisely, we analyze the bit-counting pattern produced by the MUX-FSM and replace the counting redundancy with shift operations, significantly shortening the length of the required bit sequence, and we analytically formulate the number of computation cycles. Experiments show that the enhanced SC technique reduces the processing time by 44.1% on average over the conventional MUX-FSM-based SCs.
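The core idea in the abstract (a MUX-FSM schedule counts each bit of a binary number a weighted number of times, and the repeated counts can be collapsed into shifts) can be illustrated with a small software sketch. This is an assumption-laden illustration, not the paper's hardware design: the schedule below (bit i of an n-bit number selected 2^i times) and the function names are hypothetical simplifications of the counting behavior the abstract describes.

```python
def mux_fsm_count(bits):
    """Conventional MUX-FSM-style counting (simplified model).

    bits: list of n bits, LSB first. The FSM schedule is assumed to
    select bit i for 2**i cycles, so counting the selected bit over
    all 2**n - 1 cycles reproduces the binary value of the number.
    Returns (accumulated count, number of cycles spent).
    """
    total, cycles = 0, 0
    for i, b in enumerate(bits):
        for _ in range(2 ** i):  # bit i is counted 2**i times
            total += b
            cycles += 1
    return total, cycles


def shift_count(bits):
    """Shift-based shortening (the speedup idea, sketched).

    The 2**i redundant counts of bit i are replaced by one shifted
    add (b << i), cutting the cycle count from 2**n - 1 to n while
    producing the same accumulated value.
    """
    total, cycles = 0, 0
    for i, b in enumerate(bits):
        total += b << i  # one cycle instead of 2**i repeats
        cycles += 1
    return total, cycles
```

Both routines accumulate the same value (e.g., bits `[1, 0, 1, 1]` accumulate to 13), but the shift-based version spends n cycles instead of 2^n - 1, which is the kind of bit-sequence shortening the abstract's speedup rests on.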
ISSN
1976-4677
URI
https://hdl.handle.net/10371/185295
DOI
https://doi.org/10.5626/JCSE.2022.16.2.79
Files in This Item:
There are no files associated with this item.
Items in S-Space are protected by copyright, with all rights reserved, unless otherwise indicated.