| Entropy | |
| Entropy Measures vs. Kolmogorov Complexity | |
| Keywords: Kolmogorov complexity; Shannon entropy; Rényi entropy; Tsallis entropy | |
| DOI: 10.3390/e13030595 | |
| Source: DOAJ | |
【 Abstract 】
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution m^t(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
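For context, a minimal sketch of the standard definitions behind the abstract (these are the usual textbook forms and are not quoted from the record itself): the Rényi and Tsallis entropies of order α both reduce to Shannon entropy in the limit α → 1, which is the case in which the expected-complexity identity mentioned above holds.

% Sketch of the standard definitions assumed above (not quoted from the paper).
\begin{align*}
  H(P)          &= -\sum_{x} P(x)\,\log P(x)                              && \text{(Shannon entropy)} \\
  H_{\alpha}(P) &= \frac{1}{1-\alpha}\,\log \sum_{x} P(x)^{\alpha}        && \text{(R\'enyi entropy, } \alpha>0,\ \alpha\neq 1\text{)} \\
  S_{\alpha}(P) &= \frac{1}{\alpha-1}\Bigl(1-\sum_{x} P(x)^{\alpha}\Bigr) && \text{(Tsallis entropy, } \alpha\neq 1\text{)} \\
  \sum_{x} P(x)\,K(x) &= H(P) + O(1)                                      && \text{(for recursive } P\text{; the constant depends on } P\text{)}
\end{align*}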
【 License 】
Unknown