Journal Article Details
Entropy
Entropy Approximation in Lossy Source Coding Problem
Marek Śmieja 1, Jacek Tabor 2
[1] Department of Mathematics and Computer Science, Jagiellonian University, Lojasiewicza 6, 30-348 Kraków, Poland
Keywords: Shannon entropy; entropy approximation; minimum entropy set cover; lossy compression; source coding
DOI: 10.3390/e17053400
Source: MDPI
【 Abstract 】

In this paper, we investigate a lossy source coding problem in which an upper limit on the permitted distortion is defined for every element of the dataset. It can be seen as an alternative to rate distortion theory, where a bound on the allowed average error is specified instead. In order to find the entropy, which gives the statistical length of a source code compatible with a fixed distortion bound, a corresponding optimization problem has to be solved. First, we show how to simplify this general optimization by discarding coding partitions that are irrelevant to the entropy calculation. In our main result, we present a fast greedy algorithm, feasible to implement, which approximates the entropy within an additive error term of log₂ e. The proof is based on the minimum entropy set cover problem, for which a similar bound was obtained.
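To make the greedy approach concrete, the sketch below illustrates a minimum-entropy-set-cover-style heuristic of the kind the bound refers to: each candidate coding cell collects the data points lying within the permitted distortion of a codeword, and the algorithm repeatedly assigns as many still-unassigned points as possible to a single cell. The function name `greedy_entropy_cover`, the data layout, and the toy example are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming a finite universe of data points and a family of
# candidate coding cells (e.g. distortion balls around codewords). This is an
# illustration of the greedy minimum-entropy-set-cover heuristic, not the
# paper's code.

from math import log2

def greedy_entropy_cover(universe, cells):
    """Greedily assign every element to a cell, always choosing the cell that
    covers the most not-yet-assigned elements, and return the empirical
    entropy (in bits) of the resulting partition."""
    uncovered = set(universe)
    cell_sizes = []
    while uncovered:
        # Pick the candidate cell with the largest intersection with the
        # uncovered elements (ties broken arbitrarily).
        best = max(cells.values(), key=lambda s: len(s & uncovered))
        gain = best & uncovered
        if not gain:
            raise ValueError("remaining elements cannot be covered")
        cell_sizes.append(len(gain))
        uncovered -= gain

    n = sum(cell_sizes)
    # Entropy of the partition induced by the greedy assignment; a greedy
    # strategy of this kind stays within log2(e) bits of the minimal entropy.
    return -sum((k / n) * log2(k / n) for k in cell_sizes)


# Toy usage: four points, each of which may be encoded by any codeword whose
# distortion ball contains it.
points = {1, 2, 3, 4}
balls = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4}}
print(greedy_entropy_cover(points, balls))
```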

【 License 】

CC BY   
© 2015 by the authors; licensee MDPI, Basel, Switzerland

【 Preview 】
Attachment: RO202003190012533ZK.pdf (359 KB, PDF)