NEUROCOMPUTING | Volume: 237
An approximate backpropagation learning rule for memristor based neural networks using synaptic plasticity
Article
Negrov, D. [1]; Karandashev, I. [1,2]; Shakirov, V. [1,2]; Matveyev, Yu. [1,3]; Dunin-Barkowski, W. [1,2]; Zenkevich, A. [1,3]
[1] Moscow Inst Phys & Technol, Lab Funct Mat & Devices Nanoelect, Dolgoprudnyi, Russia
[2] Russian Acad Sci, Sci Res Inst Syst Anal, Moscow, Russia
[3] Natl Res Nucl Univ MEPhI, Moscow, Russia
Keywords: Deep learning; Memristor; Neural networks; Hardware design; Backpropagation algorithm
DOI: 10.1016/j.neucom.2016.10.061
Source: Elsevier
【 Abstract 】
We describe an approximation to the backpropagation algorithm for training deep neural networks, designed to work with synapses implemented as memristors. The key idea is to represent both the input signal and the backpropagated delta as series of pulses that trigger multiple positive or negative updates of the synaptic weight, and to use the min operation in place of the product of the two signals. In computational simulations, we show that the proposed approximation to backpropagation converges well and may be suitable for memristor implementations of multilayer neural networks.
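A minimal NumPy sketch of the kind of update the abstract describes, shown next to the conventional outer-product rule. This is not the authors' implementation: the function names, the sign convention (sign taken from the product, magnitude from the min), and the omission of the pulse-train encoding are all assumptions made for illustration.

```python
import numpy as np

def standard_update(x, delta, lr=0.1):
    """Conventional backprop weight update: dW[i, j] = lr * delta[i] * x[j]."""
    return lr * np.outer(delta, x)

def min_approx_update(x, delta, lr=0.1):
    """Approximate update: magnitude = min(|delta[i]|, |x[j]|),
    sign = sign(delta[i]) * sign(x[j])."""
    mag = np.minimum(np.abs(delta)[:, None], np.abs(x)[None, :])
    sign = np.sign(delta)[:, None] * np.sign(x)[None, :]
    return lr * sign * mag

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=4)      # presynaptic activity (layer input)
    delta = rng.uniform(-1, 1, size=3)  # backpropagated error at the layer output
    print("outer-product update:\n", standard_update(x, delta))
    print("min-based update:\n", min_approx_update(x, delta))
```

In a memristive realization, the min of the two magnitudes corresponds to the number of overlapping programming pulses applied to the device, which is what makes this approximation attractive for hardware.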
【 License 】
Free
【 Preview 】
Files | Size | Format
---|---|---
10_1016_j_neucom_2016_10_061.pdf | 827 KB | PDF