Journal Article Details
Entropy
Estimating Functions of Distributions Defined over Spaces of Unknown Size
David H. Wolpert [1]
[1] Santa Fe Institute, 1399 Hyde Park Rd., Santa Fe, NM 87501, USA
Keywords: Bayesian analysis; entropy; mutual information; variable number of bins; hidden variables; Dirichlet prior
DOI: 10.3390/e15114668
Source: MDPI
【 Abstract 】

We consider Bayesian estimation of information-theoretic quantities from data, using a Dirichlet prior. Acknowledging the uncertainty of the event space size m and the Dirichlet prior's concentration parameter c, we treat both as random variables set by a hyperprior. We show that the associated hyperprior, P(c, m), satisfies a simple "Irrelevance of Unseen Variables" (IUV) desideratum iff it factors as P(c, m) = P(c)P(m). Thus, requiring IUV greatly reduces the number of degrees of freedom of the hyperprior. Some information-theoretic quantities can be expressed in multiple ways, in terms of different event spaces, e.g., mutual information. With all hyperpriors (implicitly) used in earlier work, different choices of this event space lead to different posterior expected values of these information-theoretic quantities. We show that there is no such dependence on the choice of event space for a hyperprior that obeys IUV. We also derive a result that allows us to exploit IUV to greatly simplify calculations, like the posterior expected mutual information or posterior expected multi-information. We also use computer experiments to favorably compare an IUV-based estimator of entropy to three alternative methods in common use. We end by discussing how seemingly innocuous changes to the formalization of an estimation problem can substantially affect the resultant estimates of posterior expectations.
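As a rough illustration of the kind of estimator the abstract describes, the sketch below computes the posterior expected entropy under a symmetric Dirichlet(c) prior over m bins and then averages it over (c, m) with a factorized hyperprior, as IUV requires. The grids for c and m, the uniform hyperprior weights, and the helper names (expected_entropy_fixed, log_evidence, expected_entropy_marginal) are illustrative assumptions, not the paper's construction; only the closed form for the fixed-(c, m) posterior expected entropy is a standard result.

# Hedged sketch: posterior expected entropy with a Dirichlet prior,
# marginalizing over the concentration c and event-space size m with a
# factorized hyperprior P(c, m) = P(c)P(m), as IUV requires.  Grids and
# uniform hyperprior weights below are illustrative assumptions only.
import numpy as np
from scipy.special import digamma, gammaln

def expected_entropy_fixed(counts, c, m):
    # E[H | counts, c, m] for a symmetric Dirichlet(c) prior over m bins.
    # Observed counts fill the first bins; the remaining m - len(counts)
    # bins have zero counts.  Standard closed form:
    #   E[H] = psi(N + c m + 1) - sum_i (n_i + c)/(N + c m) psi(n_i + c + 1)
    n = np.zeros(m)
    n[:len(counts)] = counts
    total = n.sum() + c * m
    return digamma(total + 1) - np.sum((n + c) / total * digamma(n + c + 1))

def log_evidence(counts, c, m):
    # log P(counts | c, m) under the Dirichlet-multinomial model, up to a
    # multinomial coefficient that does not depend on (c, m).  Bins with
    # zero counts contribute gammaln(c) - gammaln(c) = 0, so only the
    # observed counts need to be summed.
    n = np.asarray(counts, dtype=float)
    N = n.sum()
    return (gammaln(c * m) - gammaln(N + c * m)
            + np.sum(gammaln(n + c) - gammaln(c)))

def expected_entropy_marginal(counts, c_grid, m_values):
    # Average E[H | c, m] over (c, m), weighted by the evidence and a
    # uniform, factorized hyperprior on the supplied grids.
    log_w, est = [], []
    for m in m_values:
        for c in c_grid:
            log_w.append(log_evidence(counts, c, m))
            est.append(expected_entropy_fixed(counts, c, m))
    log_w = np.array(log_w)
    w = np.exp(log_w - log_w.max())   # subtract max to avoid underflow
    w /= w.sum()
    return float(np.dot(w, est))

if __name__ == "__main__":
    counts = [12, 7, 3, 1, 1]              # observed bin counts (example data)
    c_grid = np.logspace(-2, 1, 30)        # illustrative grid for c
    m_values = range(5, 51)                # illustrative range for m
    print(expected_entropy_marginal(counts, c_grid, m_values))

In this sketch the factorized hyperprior enters only through the uniform weights on the c and m grids; any proper P(c) and P(m) could be substituted by multiplying the evidence weights accordingly.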

【 License 】

CC BY   
© 2013 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachments
File: RO202003190032190ZK.pdf | Size: 458 KB | Format: PDF
Article metrics
Downloads: 6    Views: 10