Journal Article Details
Entropy
A Utility-Based Approach to Some Information Measures
Craig Friedman1  Jinggang Huang1 
[1] Standard & Poor’s, 55 Water Street, 46th Floor, New York, NY 10041, USA
Keywords: Generalized Entropy; Generalized Kullback-Leibler Relative Entropy; Decision Theory; Expected Utility; Horse Race; Tsallis Entropy; Statistical Learning; Probability Estimation; Risk Neutral Pricing Measure
DOI  :  10.3390/e9010001
Source: MDPI
【 Abstract 】

We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem – probability estimation – in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
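Among the generalized quantities named in the keywords is Tsallis entropy, which reduces to Shannon entropy in a limiting case. As a minimal illustration of the standard quantities involved (not the paper's specific utility-based construction), the following sketch computes Shannon entropy, Kullback-Leibler relative entropy, and Tsallis entropy for discrete distributions:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p||q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def tsallis_entropy(p, q_param):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Recovers the Shannon entropy in the limit q -> 1.
    """
    if abs(q_param - 1.0) < 1e-12:
        return shannon_entropy(p)
    return (1.0 - sum(pi ** q_param for pi in p)) / (q_param - 1.0)

uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))        # log 4, the maximum for 4 outcomes
print(kl_divergence(skewed, uniform))  # positive, since skewed != uniform
print(tsallis_entropy(uniform, 2.0))   # (1 - 4 * 0.25^2) / (2 - 1) = 0.75
```

The data processing inequality and nonnegativity of D(p||q) mentioned in the abstract can be checked numerically with these helpers; the paper's contribution is that analogous properties carry over to the utility-based generalizations.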

【 License 】

CC BY   
This is an open access article distributed under the Creative Commons Attribution License (CC BY) which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

【 Preview 】

Attachments
Files Size Format View
RO202003190059173ZK.pdf 196KB PDF download
Document metrics
Downloads: 21   Views: 64