Thesis Details
A hardware acceleration technique for gradient descent and conjugate gradient
Kesler, David R.; Kumar, Rakesh
Keywords: Gradient Descent; Conjugate Gradient; Hardware Acceleration; Matrix Multiplication
Others: https://www.ideals.illinois.edu/bitstream/handle/2142/24241/Kesler_David.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
PDF
【 Abstract 】

Gradient descent, conjugate gradient, and other iterative algorithms are a powerful class of algorithms; however, they can take a long time to converge. Baseline accelerator designs feature insufficient coverage of operations and do not work well on the problems we target. In this thesis we present a novel hardware architecture for accelerating gradient descent and other similar algorithms. To support this architecture, we also present a sparse matrix-vector storage format, and software support for utilizing the format, so that it can be efficiently mapped onto hardware which is also well suited for dense operations. We show that the accelerator design outperforms similar designs which target only the most dominant operation of a given algorithm, providing substantial energy and performance benefits. We further show that the accelerator can be reasonably implemented on a general purpose CPU with small area overhead.
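
For context on the iteration the abstract refers to, the sketch below is a minimal plain-NumPy rendering of the textbook conjugate gradient method, whose per-iteration cost is dominated by a (possibly sparse) matrix-vector product. It is for illustration only and is not the thesis's accelerator design or its storage format; the function name, parameters, and the small example system are illustrative assumptions.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
        # Solve A x = b for a symmetric positive-definite matrix A.
        # The dominant per-iteration cost is the product A @ p, which is
        # the kind of operation a matrix-vector accelerator would target.
        x = np.zeros_like(b)
        r = b - A @ x            # initial residual
        p = r.copy()             # initial search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p                     # dominant operation: matrix-vector product
            alpha = rs_old / (p @ Ap)      # step length along the search direction
            x = x + alpha * p
            r = r - alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:      # stop once the residual is small
                break
            p = r + (rs_new / rs_old) * p  # next A-conjugate search direction
            rs_old = rs_new
        return x

    # Illustrative usage: solve a small SPD system.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = conjugate_gradient(A, b)   # x is approximately [0.0909, 0.6364]

Because every iteration repeats the same matrix-vector product, an accelerator that handles this operation well (and, per the abstract, the surrounding vector updates too) directly reduces the time to convergence.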

【 Preview 】
Attachment List
File: A hardware acceleration technique for gradient descent and conjugate gradient
Size: 734KB | Format: PDF | View: download
Document Metrics
Downloads: 9    Views: 11