Dissertation Details
Network compression via network memory: fundamental performance limits
Beirami, Ahmad; Fekri, Faramarz; Bloch, Matthieu; Sivakumar, Raghupathy; Barry, John; Weiss, Howard; McLaughlin, Steven W.
University: Georgia Institute of Technology
Department: Electrical and Computer Engineering
Keywords: Universal compression; Source coding; Redundancy elimination; Memory-assisted compression; Information theory; Distributed source coding; Side information; Redundancy-capacity theorem
Full text: https://smartech.gatech.edu/bitstream/1853/53448/1/BEIRAMI-DISSERTATION-2014.pdf
United States | English
Source: SMARTech Repository
【 Abstract 】
The amount of information churned out daily around the world is staggering, and future technological advances are therefore contingent on the development of scalable acquisition, inference, and communication mechanisms for this massive data. This Ph.D. dissertation draws on mathematical tools from information theory and statistics to understand the fundamental performance limits of universal compression of this data at the packet level, applied just above layer 3 of the network, when the intermediate network nodes are enabled with the capability of memorizing previous traffic. The universality of compression imposes an inevitable redundancy (overhead) on the performance of universal codes, owing to the need to learn the unknown source statistics.

In this work, previous asymptotic results on the redundancy of universal compression are generalized to characterize performance in the finite-length regime, the regime applicable to small network packets. Further, network compression via memory is proposed as a compression-based solution for relatively small network packets whenever the network nodes (i.e., the encoder and the decoder) are equipped with memory and have access to massive amounts of previous communication. In a nutshell, network compression via memory learns the patterns and statistics of the packet payloads and uses them to compress and reduce the traffic. At the cost of increased computational overhead at the network nodes, network compression via memory significantly reduces the transmission cost in the network. This yields a substantial performance improvement, as the cost of transmitting one bit is far greater than the cost of processing it.
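For context, the learning overhead the abstract refers to is captured by a standard asymptotic that the dissertation's finite-length analysis refines. The notation below is illustrative rather than taken from the thesis: for a family of sources with d unknown parameters, the average minimax redundancy of a universal code on a length-n sequence scales as (Rissanen; Clarke and Barron):

```latex
% Average minimax redundancy for a d-parameter source family.
% R_n is the extra codelength, in bits, paid by a universal code
% on a length-n sequence; R_n / n is the per-symbol overhead.
\[
  R_n = \frac{d}{2}\log_2 n + O(1),
  \qquad
  \frac{R_n}{n} = \frac{d}{2n}\log_2 n + O\!\left(\frac{1}{n}\right).
\]
% For a small packet (n on the order of a kilobyte) from a source
% with many parameters, R_n / n is non-negligible. Memorizing
% previous traffic amortizes this learning cost over a long shared
% history instead of paying it anew inside every packet.
```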
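The memory-assisted mechanism can also be illustrated with an off-the-shelf codec: zlib's preset-dictionary feature lets an encoder and decoder that share previous traffic prime their models with it. This is only a minimal sketch of the idea (the dissertation analyzes fundamental limits, not a particular codec), and the function names and sample traffic below are hypothetical:

```python
import zlib

def compress_packet(payload: bytes, memory: bytes | None = None) -> bytes:
    """Compress one packet, optionally priming the encoder with shared memory."""
    if memory is not None:
        # zlib's window is 32 KiB, so only the tail of the memory is useful.
        c = zlib.compressobj(level=9, zdict=memory[-32768:])
    else:
        c = zlib.compressobj(level=9)
    return c.compress(payload) + c.flush()

def decompress_packet(blob: bytes, memory: bytes | None = None) -> bytes:
    """Decompress one packet; the decoder must hold the same memory."""
    if memory is not None:
        d = zlib.decompressobj(zdict=memory[-32768:])
    else:
        d = zlib.decompressobj()
    return d.decompress(blob) + d.flush()

# Hypothetical traffic: previous packets already seen by both nodes.
history = (b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n"
           b"Accept: text/html\r\n\r\n") * 50
packet = (b"GET /images/logo.png HTTP/1.1\r\nHost: example.com\r\n"
          b"Accept: image/png\r\n\r\n")

plain  = compress_packet(packet)                  # universal: learns from scratch
primed = compress_packet(packet, memory=history)  # memory-assisted
assert decompress_packet(primed, memory=history) == packet

print(f"packet: {len(packet)} B, no memory: {len(plain)} B, "
      f"with memory: {len(primed)} B")
```

On repetitive protocol traffic like this, the primed compressor emits far fewer bits for a short packet, because the statistics have already been learned from the shared history rather than within the packet itself.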
【 Preview 】
Attachments
File | Size | Format
Network compression via network memory: fundamental performance limits | 966KB | PDF
Document Metrics
Downloads: 6 | Views: 13