Journal Article Details
Entropy
k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions
Shengqiao Li [1], Robert M. Mnatsakanov [1]
[1] Health Effects Laboratory Division, National Institute for Occupational Safety and Health, Morgantown, WV 26505, USA
Keywords: hyperspherical distribution; directional data; differential entropy; cross entropy; Kullback-Leibler divergence; k-nearest neighbor
DOI: 10.3390/e13030650
Source: MDPI
【 Abstract 】

A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including the uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with its moment-based counterpart via simulations. The results show that the two methods are comparable.
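
As a concrete illustration of the knn approach summarized above, the sketch below implements a generic Kozachenko-Leonenko style entropy estimator adapted to points on the unit hypersphere: the Euclidean ball volumes of the usual estimator are replaced by geodesic cap areas. The function names cap_area and knn_entropy_sphere, the use of great-circle distance, and the log(n-1) - digamma(k) normalization are illustrative assumptions drawn from the general knn estimation literature, not the exact construction or notation of the paper.

```python
import numpy as np
from scipy.special import digamma, gamma
from scipy.integrate import quad

def cap_area(theta, d):
    """Surface area of a geodesic cap of angular radius theta
    on the unit sphere S^{d-1} embedded in R^d."""
    # Surface area of the unit sphere S^{d-2} in R^{d-1}
    s = 2.0 * np.pi ** ((d - 1) / 2.0) / gamma((d - 1) / 2.0)
    integral, _ = quad(lambda t: np.sin(t) ** (d - 2), 0.0, theta)
    return s * integral

def knn_entropy_sphere(X, k=1):
    """Kozachenko-Leonenko style knn entropy estimate for an (n x d)
    array X of points on the unit sphere S^{d-1}, using geodesic
    (great-circle) distances; assumes no duplicated points."""
    n, d = X.shape
    # Pairwise geodesic distances via arccos of inner products.
    cosines = np.clip(X @ X.T, -1.0, 1.0)
    theta = np.arccos(cosines)
    np.fill_diagonal(theta, np.inf)              # exclude the point itself
    theta_k = np.sort(theta, axis=1)[:, k - 1]   # distance to k-th neighbor
    log_caps = np.array([np.log(cap_area(t, d)) for t in theta_k])
    return log_caps.mean() + np.log(n - 1) - digamma(k)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Uniform sample on S^2: normalize standard Gaussian vectors.
    X = rng.standard_normal((2000, 3))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    print("knn estimate:", knn_entropy_sphere(X, k=5))
    print("true uniform entropy log(4*pi):", np.log(4 * np.pi))
```

For the uniform distribution on S^2 the true differential entropy is log(4*pi) ≈ 2.53, so the printed estimate should approach that value as the sample size grows. A cross entropy (and hence KL divergence) estimator can be sketched along the same lines by measuring each point's k-th neighbor distance within a second sample of size m, with log(m) replacing log(n - 1) in the normalization.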

【 License 】

CC BY   
© 2011 by the authors; licensee MDPI, Basel, Switzerland.

【 Preview 】
Attachment List
Files Size Format
RO202003190050351ZK.pdf 301KB PDF