Entropy
Algorithmic Relative Complexity
Daniele Cerra [1]
[1] German Aerospace Centre (DLR), Münchnerstr. 20, 82234 Wessling, Germany; E-Mail:
Keywords: Kolmogorov complexity; compression; relative entropy; Kullback-Leibler divergence; similarity measure; compression-based distance
DOI: 10.3390/e13040902
Source: MDPI
【 Abstract 】
Information content and compression are tightly related concepts that can be addressed through both classical and algorithmic information theories, on the basis of Shannon entropy and Kolmogorov complexity, respectively. The definition of several entities in Kolmogorov’s framework relies upon ideas from classical information theory, and these two approaches share many common traits. In this work, we expand the relations between these two frameworks by introducing algorithmic cross-complexity and relative complexity, counterparts of the cross-entropy and relative entropy (or Kullback-Leibler divergence) found in Shannon’s framework. We define the cross-complexity of an object
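The compression-based distances mentioned in the keywords approximate these uncomputable algorithmic quantities with a real compressor. As a minimal sketch of that idea (using zlib and the well-known Normalized Compression Distance of Cilibrasi and Vitányi, not the paper's own relative-complexity measure), similarity can be estimated from how well two objects compress together versus separately:

```python
import zlib


def c(data: bytes) -> int:
    """Compressed size in bytes; zlib stands in for an ideal compressor."""
    return len(zlib.compress(data, 9))


def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: a practical, compression-based
    proxy for algorithmic (Kolmogorov) similarity between two strings."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)


if __name__ == "__main__":
    a = b"the quick brown fox jumps over the lazy dog " * 50
    b = b"the quick brown fox jumps over a sleepy dog " * 50
    d = bytes(range(256)) * 10  # unrelated, high-entropy content
    print(ncd(a, b))  # small: b compresses well given a
    print(ncd(a, d))  # larger: d shares no structure with a
```

Similar objects yield a distance near 0 because concatenating them adds little to the compressed size; unrelated objects yield a larger value. The paper's relative complexity plays an analogous role to the Kullback-Leibler divergence in this algorithmic setting.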
【 License 】
CC BY
© 2011 by the authors; licensee MDPI, Basel, Switzerland.
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202003190049654ZK.pdf | 120KB | PDF | download |