Journal Article Details
ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS
Low-Cost Stochastic Hybrid Multiplier for Quantized Neural Networks
Article; Proceedings Paper
Li, Bingzhe [1]; Najafi, M. Hassan [2]; Lilja, David J. [1]
[1] Univ Minnesota, Minneapolis, MN 55455 USA.
[2] Univ Louisiana Lafayette, Lafayette, LA 70503 USA.
Keywords: stochastic computing; quantized neural network; multiplier; low-power design; COMPUTATION; IMPLEMENTATION; FPGA
DOI: 10.1145/3309882
Source: SCIE
Abstract

With the increased interest in neural networks, hardware implementations of neural networks have been widely investigated. Researchers pursue low hardware cost using techniques such as stochastic computing (SC) and quantization. Quantization reduces the number of distinct trained weight values, resulting in lower hardware cost, while SC lowers hardware cost by replacing complex arithmetic units with simple logic gates. However, the combined advantages of quantization and SC in neural networks have not been well investigated. In this article, we propose a new stochastic multiplier built from simple CMOS transistors, called the stochastic hybrid multiplier, for quantized neural networks. The new design exploits the characteristics of quantized weights and substantially reduces the hardware cost of neural networks. Experimental results indicate that our stochastic design achieves about 7.7x energy reduction compared to its binary counterpart, at the cost of slightly higher recognition error rates. Compared to previous stochastic neural network implementations, our work achieves at least 4x, 9x, and 10x reductions in area, power, and energy, respectively.

License
Free
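The abstract's claim that SC replaces complex arithmetic with simple gates can be illustrated with standard unipolar SC multiplication, where a single AND gate multiplies two values encoded as random bitstreams. This is a minimal sketch of that baseline technique, not the paper's hybrid multiplier; all function names, the seed, and the stream length are illustrative choices of my own:

```python
import random

def to_bitstream(p, length, rng):
    # Unipolar SC encoding: each bit is 1 with probability p,
    # so the value p is carried by the stream's mean.
    return [1 if rng.random() < p else 0 for _ in range(length)]

def sc_multiply(x_bits, y_bits):
    # A single AND gate multiplies two independent unipolar streams:
    # P(out = 1) = P(x = 1) * P(y = 1)
    return [a & b for a, b in zip(x_bits, y_bits)]

def decode(bits):
    # Recover the encoded value as the fraction of 1s in the stream
    return sum(bits) / len(bits)

rng = random.Random(42)   # fixed seed for repeatability
N = 1 << 16               # longer streams -> lower stochastic noise
x = to_bitstream(0.50, N, rng)
y = to_bitstream(0.25, N, rng)
prod = decode(sc_multiply(x, y))
print(prod)  # close to 0.50 * 0.25 = 0.125, within stochastic noise
```

The accuracy-versus-latency trade-off is visible here: halving the error of the estimate requires roughly quadrupling the stream length, which is why low-cost SC designs such as the one in this paper focus on reducing the per-bit hardware and energy cost.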