Journal Article Details
Entropy
A Characterization of Entropy in Terms of Information Loss
John C. Baez [2]  Tobias Fritz [1]
[1] Institut de Ciències Fotòniques, Mediterranean Technology Park, 08860 Castelldefels (Barcelona), Spain
[2] Department of Mathematics, University of California, Riverside, CA 92521, USA
Keywords: Shannon entropy; Tsallis entropy; information theory; measure-preserving function
DOI: 10.3390/e13111945
Source: MDPI
【 Abstract 】

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the “information loss”, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
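The abstract's central notion can be made concrete with a small sketch (my own illustration, not code from the paper): a measure-preserving function f pushes a distribution p forward to a distribution q, and the information loss is the drop in Shannon entropy, H(p) − H(q). The distributions and the map below are hypothetical examples chosen for clarity.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), with the convention 0 * ln 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def pushforward(p, f, m):
    """Push a distribution p on {0, ..., n-1} forward along f: {0, ..., n-1} -> {0, ..., m-1}."""
    q = [0.0] * m
    for i, pi in enumerate(p):
        q[f(i)] += pi
    return q

def information_loss(p, f, m):
    """Entropy lost by coarse-graining p along f: H(p) - H(f_* p)."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f, m))

# Example: a uniform distribution on 4 outcomes, with f merging outcomes 2 and 3.
p = [0.25, 0.25, 0.25, 0.25]
f = lambda i: min(i, 2)            # 0 -> 0, 1 -> 1, 2 -> 2, 3 -> 2
loss = information_loss(p, f, 3)   # equals 0.5 * ln(2): merging two equal quarters
```

Functoriality, one of the three characterizing properties, then says that composing two such coarse-grainings adds their losses: the loss of g ∘ f equals the loss of f plus the loss of g applied to the pushed-forward distribution.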

【 License 】

CC BY   
© 2011 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment list
Files | Size | Format | View
RO202003190047055ZK.pdf 245KB PDF download
Document metrics
Downloads: 24; Views: 23