Journal Article Details
Entropy
Algorithmic Relative Complexity
Daniele Cerra [1]
[1] German Aerospace Centre (DLR), Münchnerstr. 20, 82234 Wessling, Germany
Keywords: Kolmogorov complexity; compression; relative entropy; Kullback-Leibler divergence; similarity measure; compression-based distance
DOI: 10.3390/e13040902
Source: MDPI
【 Abstract 】

Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object x with respect to another object y as the amount of computational resources needed to specify x in terms of y, and the complexity of x related to y as the compression power which is lost when adopting such a description for x, compared to the shortest representation of x. Properties of analogous quantities in classical information theory hold for these new concepts. As these notions are incomputable, a suitable approximation based upon data compression is derived to enable the application to real data, yielding a divergence measure applicable to any pair of strings. Example applications are outlined, involving authorship attribution and satellite image classification, as well as a comparison to similar established techniques.
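
The compression-based approximation mentioned in the abstract can be sketched in a few lines. The paper derives its own approximation; what follows is only an illustrative, assumption-laden Python sketch that uses DEFLATE's preset-dictionary mechanism (zlib's zdict parameter) as a crude stand-in for "specifying x in terms of y". The names compressed_size and relative_complexity are hypothetical, not taken from the paper.

```python
import zlib

def compressed_size(x: bytes, model: bytes = b"") -> int:
    """Bytes needed to DEFLATE-compress x, optionally priming the
    compressor with a preset dictionary built from another string.
    (zlib only consults the last 32 KB of the dictionary.)"""
    if model:
        c = zlib.compressobj(level=9, zdict=model[-32768:])
    else:
        c = zlib.compressobj(level=9)
    return len(c.compress(x) + c.flush())

def relative_complexity(x: bytes, y: bytes) -> float:
    """Crude divergence of x from y: the extra code length paid when
    x is encoded using y as a model, versus the shortest stand-alone
    compressed representation of x, normalized by the length of x."""
    cross = compressed_size(x, model=y)  # stand-in for cross-complexity of x w.r.t. y
    plain = compressed_size(x)           # stand-in for the complexity of x
    return max(cross - plain, 0) / max(len(x), 1)

if __name__ == "__main__":
    a = b"the quick brown fox jumps over the lazy dog " * 40
    b = b"lorem ipsum dolor sit amet consectetur adipiscing elit " * 40
    print(relative_complexity(a, a))  # near 0: y is a perfect model for x
    print(relative_complexity(a, b))  # larger: y models x poorly
```

Subtracting the stand-alone compressed size mirrors the definition given above: the divergence is the compression power lost by describing x through y instead of through the shortest available representation of x.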

【 License 】

CC BY   
© 2011 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment list:
File: RO202003190049654ZK.pdf   Size: 120 KB   Format: PDF