Journal Article Details
Entropy
How to Read Probability Distributions as Statements about Process
Keywords: measurement; maximum entropy; information theory; statistical mechanics; extreme value distributions; neutral theories in biology
DOI  :  10.3390/e16116059
Source: MDPI
【 Abstract 】

Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken over the measurement scale that relates changes in observed values to changes in information, and the transformation from the underlying scale on which information dissipates to alternative scales on which probability pattern may be expressed. Information invariances set the commonly observed measurement scales and the relations between them. In particular, a measurement scale for information is defined by its invariance to specific transformations of underlying values into measurable outputs. Essentially all common distributions can be understood within this simple framework of information invariance and measurement scale.
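The abstract describes distributions as the result of maximizing entropy (dissipating information) while preserving only an average value taken over a chosen measurement scale. Below is a minimal numerical sketch of that standard maximum-entropy construction, not code from the paper itself: with a single average-value constraint on a scale T(y), the resulting density has the form p(y) ∝ exp(−λ T(y)). The grid, target means, and function names are illustrative assumptions; they only show how the choice of scale (linear vs. logarithmic) selects the exponential vs. power-law family.

```python
# Minimal sketch (assumed setup, not the paper's code): maximize entropy
# subject to a fixed mean of T(y), giving p(y) proportional to exp(-lambda*T(y)).
import numpy as np
from scipy.optimize import brentq

y = np.linspace(1e-3, 50, 20000)   # discretized support (illustrative choice)
dy = y[1] - y[0]

def maxent_density(T, target_mean):
    """Solve for lambda so that E_p[T(y)] = target_mean under p(y) ~ exp(-lambda*T(y))."""
    Ty = T(y)
    def constraint_gap(lam):
        logw = -lam * Ty
        logw -= logw.max()              # stabilize the exponentials
        w = np.exp(logw)
        p = w / (w.sum() * dy)          # normalize to a density on the grid
        return (p * Ty).sum() * dy - target_mean
    lam = brentq(constraint_gap, 1e-6, 50.0)   # bracket chosen for these examples
    logw = -lam * Ty
    logw -= logw.max()
    w = np.exp(logw)
    return w / (w.sum() * dy), lam

# Linear measurement scale T(y) = y  ->  exponential form.
p_lin, lam_lin = maxent_density(lambda v: v, target_mean=2.0)

# Logarithmic measurement scale T(y) = log y  ->  power-law (Pareto-like) form.
p_log, lam_log = maxent_density(np.log, target_mean=1.0)

print(f"linear scale T(y)=y:     lambda = {lam_lin:.3f}  (exponential form)")
print(f"log scale   T(y)=log y:  lambda = {lam_log:.3f}  (power-law form)")
```

Changing only the measurement scale T while keeping the same "preserve an average, dissipate everything else" rule switches the distribution family, which is the sense in which the abstract reads probability patterns as statements about measurement and information invariance.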

【 License 】

CC BY   
© 2014 by the authors; licensee MDPI, Basel, Switzerland

【 Preview 】
Attachment list
File                      Size    Format   View
RO202003190019411ZK.pdf   321KB   PDF      download
Document metrics
Downloads: 10    Views: 25