Journal Article Details
| Journal | Entropy |
| Title | k-Nearest Neighbor Based Consistent Entropy Estimation for Hyperspherical Distributions |
| Keywords | hyperspherical distribution; directional data; differential entropy; cross entropy; Kullback-Leibler divergence; k-nearest neighbor |
| DOI | 10.3390/e13030650 |
| Source | DOAJ |
【 Abstract 】
A consistent entropy estimator for hyperspherical data is proposed based on the k-nearest neighbor (knn) approach. The asymptotic unbiasedness and consistency of the estimator are proved. Moreover, cross-entropy and Kullback-Leibler (KL) divergence estimators are also discussed. Simulation studies are conducted to assess the performance of the estimators for models including the uniform and von Mises-Fisher distributions. The proposed knn entropy estimator is compared with its moment-based counterpart via simulations. The results show that the two methods are comparable.
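To make the setup concrete, the sketch below shows a Kozachenko-Leonenko style knn entropy estimator adapted to data on the unit hypersphere: the Euclidean ball volume used in the classical estimator is replaced by the area of the spherical cap whose angular radius is the geodesic distance to the k-th nearest neighbor. This is a minimal illustration under that assumption, not the exact estimator or bias-correction constants of the paper; the function names `spherical_cap_area` and `knn_entropy_sphere` are hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import digamma, gamma


def spherical_cap_area(theta, p):
    """Surface area of a cap of angular (geodesic) radius theta on the
    unit sphere S^{p-1} embedded in R^p."""
    # Area element: S_{p-2} * sin^{p-2}(t) dt, where S_{p-2} is the
    # surface area of the unit sphere in R^{p-1}.
    s_pm2 = 2.0 * np.pi ** ((p - 1) / 2.0) / gamma((p - 1) / 2.0)
    integral, _ = quad(lambda t: np.sin(t) ** (p - 2), 0.0, theta)
    return s_pm2 * integral


def knn_entropy_sphere(x, k=1):
    """Knn (Kozachenko-Leonenko style) entropy estimate for unit vectors
    x of shape (n, p) lying on S^{p-1}, using geodesic distances and
    spherical cap areas as the reference measure (illustrative sketch)."""
    n, p = x.shape
    # Pairwise geodesic (great-circle) distances.
    cosines = np.clip(x @ x.T, -1.0, 1.0)
    theta = np.arccos(cosines)
    np.fill_diagonal(theta, np.inf)          # exclude each point itself
    theta_k = np.sort(theta, axis=1)[:, k - 1]  # angle to k-th neighbor
    # H_hat = psi(n) - psi(k) + (1/n) * sum_i log A_cap(theta_{i,k})
    log_cap = np.log([spherical_cap_area(t, p) for t in theta_k])
    return digamma(n) - digamma(k) + log_cap.mean()


# Example: uniform samples on S^2, where the true entropy is
# log(4*pi) ~= 2.531; the estimate should be close for large n.
rng = np.random.default_rng(0)
z = rng.standard_normal((2000, 3))
z /= np.linalg.norm(z, axis=1, keepdims=True)
print(knn_entropy_sphere(z, k=5))
```

For uniform samples on the sphere the estimate should converge to the logarithm of the total surface area, which corresponds to the uniform model used in the simulation study; von Mises-Fisher samples can be plugged in the same way once generated.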
【 License 】
Unknown