Journal Article Details
Entropy
A Novel Nonparametric Distance Estimator for Densities with Error Bounds
Alexandre R.F. Carvalho1  João Manuel R. S. Tavares1 
[1] Instituto de Engenharia Mecânica e Gestão Industrial, Faculdade de Engenharia, Universidade do Porto; Rua Dr. Roberto Frias, s/n 4200-465 Porto, Portugal
Keywords: generalized differential entropies; generalized differential divergences; Tsallis entropy; Hellinger metric; nonparametric estimators; heteroscedastic data
DOI: 10.3390/e15051609
Source: MDPI
【 Abstract 】

The use of a metric to assess the distance between probability densities is an important practical problem. In this work, a particular metric induced by an α-divergence is studied. The Hellinger metric can be interpreted as a particular case within the framework of generalized Tsallis divergences and entropies. The nonparametric Parzen density estimator emerges as a natural candidate to estimate the underlying probability density function, since it may account for data from different groups, or from experiments with distinct instrumental precisions, i.e., non-independent and identically distributed (non-i.i.d.) data. However, the information-theoretic metric derived from the nonparametric Parzen density estimator displays infinite variance, limiting the direct use of resampling estimators. Based on measure theory, we present a change of measure to build a finite-variance density, allowing the use of resampling estimators. To counteract the poor scaling with dimension, we propose a new nonparametric two-stage robust resampling estimator of error bounds on the Hellinger metric for heteroscedastic data. The approach yields very promising results and allows different covariances for different clusters, with a direct impact on the distance evaluation.
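The core quantities in the abstract can be illustrated with a minimal sketch: fit Parzen (kernel) density estimates to two samples and estimate the Hellinger distance H = sqrt(1 − BC), where BC = ∫ sqrt(pq) dx is the Bhattacharyya coefficient, via Monte Carlo resampling from the fitted density. This is a plain single-stage resampling estimate under assumed illustrative data, not the paper's variance-stabilised two-stage estimator or its change of measure.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Illustrative 1-D samples from two different Gaussians (not data from the paper)
x = rng.normal(0.0, 1.0, size=2000)
y = rng.normal(1.0, 1.5, size=2000)

# Parzen (kernel) density estimates of the two underlying densities
p = gaussian_kde(x)
q = gaussian_kde(y)

# Monte Carlo estimate of the Bhattacharyya coefficient
#   BC = ∫ sqrt(p q) dx = E_p[ sqrt(q(X) / p(X)) ],  X ~ p
# using fresh draws resampled from the fitted density p.
z = p.resample(5000, seed=1)
bc = np.mean(np.sqrt(q(z) / p(z)))

# Hellinger distance, bounded in [0, 1]; 0 = identical densities
h = float(np.sqrt(max(0.0, 1.0 - bc)))
print(h)
```

Because the ratio q(X)/p(X) enters under a square root, this naive estimator inherits the heavy-tail/variance problems the paper addresses: where p is small but q is not, individual terms can be large, which motivates the change of measure discussed in the abstract.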

【 License 】

CC BY   
© 2013 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】

Attachments
Files Size Format View
RO202003190036537ZK.pdf 620KB PDF download