Journal Article Details
Frontiers in Neuroscience
Bitstream-Based Neural Network for Scalable, Efficient, and Accurate Deep Learning Hardware
Jongeun Lee [1], Hyeonuk Sim [1]
[1] Neural Processing Research Center, Seoul National University, Seoul, South Korea; School of Electrical and Computer Engineering, Ulsan National Institute of Science and Technology, Ulsan, South Korea
Keywords: bitstream-based neural network; neuromorphic computing; stochastic computing; deep learning hardware; dynamic precision scaling; SC-CNN
DOI: 10.3389/fnins.2020.543472
Source: DOAJ
【 Abstract 】

While convolutional neural networks (CNNs) continue to set state-of-the-art performance across many fields of machine learning, their hardware implementations tend to be very costly and inflexible. Neuromorphic hardware, on the other hand, targets higher efficiency, but its inference accuracy lags far behind that of CNNs. To bridge the gap between deep learning and neuromorphic computing, we present a bitstream-based neural network, which is efficient and accurate as well as flexible in terms of arithmetic precision and hardware size. Our bitstream-based neural network (called SC-CNN) is built on top of the CNN but inspired by stochastic computing (SC), which uses bitstreams to represent numbers. Being based on the CNN, our SC-CNN can be trained with backpropagation, ensuring very high inference accuracy. At the same time, our SC-CNN is deterministic, hence repeatable, and is highly accurate and scalable even to large networks. Our experimental results demonstrate that our SC-CNN remains highly accurate up to ImageNet-targeting CNNs and improves efficiency over conventional digital designs by 50–100% in operations-per-area, depending on the CNN and the application scenario, while losing less than 1% in recognition accuracy. In addition, our SC-CNN implementations can be much more fault-tolerant than conventional digital implementations.
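The abstract refers to stochastic computing (SC), in which numbers are represented as bitstreams. As a rough illustration of that general idea only (not the paper's deterministic SC-CNN design), the following Python sketch shows conventional unipolar SC multiplication, where the product of two values in [0, 1] is approximated by the bitwise AND of two independently generated bitstreams; the function names and parameters here are hypothetical, chosen just for this sketch.

```python
import random

def to_bitstream(x, length, rng):
    """Encode x in [0, 1] as a unipolar bitstream: each bit is 1 with probability x."""
    return [1 if rng.random() < x else 0 for _ in range(length)]

def sc_multiply(bs_a, bs_b):
    """Unipolar SC multiplication: bitwise AND of two independent bitstreams."""
    return [a & b for a, b in zip(bs_a, bs_b)]

def from_bitstream(bs):
    """Decode a unipolar bitstream back to a value: the fraction of 1s."""
    return sum(bs) / len(bs)

if __name__ == "__main__":
    rng = random.Random(0)
    length = 1024                      # longer streams -> lower error, higher latency
    x, w = 0.75, 0.5                   # e.g., an activation and a weight
    bs_x = to_bitstream(x, length, rng)
    bs_w = to_bitstream(w, length, rng)
    product = from_bitstream(sc_multiply(bs_x, bs_w))
    print(f"exact = {x * w:.4f}, SC estimate = {product:.4f}")
```

Longer bitstreams reduce the estimation error of this randomized scheme at the cost of latency; by contrast, the SC-CNN described in the abstract is deterministic, which makes its results repeatable and keeps it accurate and scalable even for large networks.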

【 License 】

Unknown   
