Journal Article Details
Entropy
Scale-Invariant Divergences for Density Functions
Keywords: divergence; scale invariance; composite score; Hölder inequality; reverse Hölder inequality
DOI: 10.3390/e16052611
Source: MDPI
【 Abstract 】

Divergence is a discrepancy measure between two objects, such as functions, vectors, or matrices. In particular, divergences defined on probability distributions are widely employed in probabilistic forecasting. As a dissimilarity measure, a divergence should satisfy certain conditions. In this paper, we consider two: the first is the scale-invariance property, and the second is that the divergence can be approximated by the sample mean of a loss function. The first requirement is an important feature for dissimilarity measures, since a divergence generally depends on the system of measurements used to measure the objects; a scale-invariant divergence transforms in a consistent way when one system of measurements is replaced by another. The second requirement is formalized by expressing the divergence through a so-called composite score. We study the relation between composite scores and scale-invariant divergences, and we propose a new class of divergences, called Hölder divergences, that satisfies both conditions above. We present some theoretical properties of Hölder divergences and show that they unify existing divergences from the viewpoint of scale invariance.
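The scale-invariance property described above can be illustrated numerically. The sketch below uses the Cauchy-Schwarz divergence, a known scale-invariant discrepancy related by Hölder's inequality to the family studied in the paper (it is not necessarily the paper's own definition of Hölder divergence); its value is unchanged when either density function is multiplied by a positive constant, i.e., when the measurement scale changes.

```python
import numpy as np

def cs_divergence(p, q, dx):
    """Cauchy-Schwarz divergence on a grid with spacing dx:
    D(p, q) = -log( <p, q> / (||p|| * ||q||) ).
    Invariant under p -> a*p, q -> b*q for any a, b > 0,
    since numerator and denominator scale by the same factor."""
    inner = np.sum(p * q) * dx
    norms = np.sqrt((np.sum(p ** 2) * dx) * (np.sum(q ** 2) * dx))
    return -np.log(inner / norms)

# Two unnormalized Gaussian-shaped functions on a common grid.
x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
p = np.exp(-x ** 2 / 2.0)
q = np.exp(-(x - 1.0) ** 2 / 2.0)

d0 = cs_divergence(p, q, dx)              # original scale
d1 = cs_divergence(3.0 * p, 0.5 * q, dx)  # rescaled inputs
print(d0, d1)  # the two values agree up to floating-point error
```

Rescaling multiplies the inner product and the product of norms by the same constant, so the ratio inside the logarithm, and hence the divergence, is unchanged; this is the behavior the scale-invariance condition demands of a divergence on density functions.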

【 License 】

CC BY   
© 2014 by the authors; licensee MDPI, Basel, Switzerland

【 Preview 】
Attachment list
File | Size | Format | View
RO202003190026062ZK.pdf | 213KB | PDF | download
Document metrics
Downloads: 11; Views: 21