Thesis details
Universal and succinct source coding of deep neural networks
Basu, Sourya; Varshney, Lav R.
Keywords: Deep neural networks; Universal Source Coding
URL: https://www.ideals.illinois.edu/bitstream/handle/2142/107959/BASU-THESIS-2020.pdf?sequence=1&isAllowed=y
United States | English
Source: The Illinois Digital Environment for Access to Learning and Scholarship
【 Abstract 】

Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS) and use for on-device intelligence. This work is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets, and directly performing inference without full decompression. The basic insight that allows a lower rate than naive approaches is recognizing that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner. We also provide experimental results of our approach on several standard datasets.
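The permutation invariance the abstract relies on can be checked in a few lines: relabeling the hidden nodes of a layer (permuting the rows of the incoming weight matrix and bias, together with the matching columns of the outgoing weight matrix) leaves the network's input-output map unchanged. A minimal NumPy sketch, with illustrative sizes and discrete weights chosen here for the example (not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer feedforward network; weights drawn from a
# small discrete set, as in the setting the abstract describes.
n_in, n_hid, n_out = 4, 5, 3
W1 = rng.integers(-2, 3, size=(n_hid, n_in)).astype(float)
b1 = rng.integers(-2, 3, size=n_hid).astype(float)
W2 = rng.integers(-2, 3, size=(n_out, n_hid)).astype(float)

def forward(x, W1, b1, W2):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h

x = rng.standard_normal(n_in)

# Relabel the hidden nodes: permute rows of W1 and b1, and the
# corresponding columns of W2. Inference output is unchanged.
perm = rng.permutation(n_hid)
y_orig = forward(x, W1, b1, W2)
y_perm = forward(x, W1[perm], b1[perm], W2[:, perm])

assert np.allclose(y_orig, y_perm)
```

Because all n_hid! relabelings of a hidden layer implement the same function, a compressor need not spend bits distinguishing them; the thesis's algorithms remove exactly this irrelevant uncertainty before entropy coding.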

【 Preview 】
Attachment list
Files | Size | Format | View
Universal and succinct source coding of deep neural networks | 358KB | PDF | download
Document metrics
Downloads: 4 | Views: 23