Journal Article Details
Proceedings
On the Estimation of Mutual Information
Jesse Ernst [1]; Nicholas Carrara [1]
[1] Physics Department, University at Albany, 1400 Washington Ave, Albany, NY 12222, USA
Keywords: mutual information; non-parametric entropy estimation; dimension reduction; machine learning
DOI: 10.3390/proceedings2019033031
Source: DOAJ
Abstract
In this paper we focus on the estimation of mutual information from finite samples of (X × Y). The main concern with estimates of mutual information (MI) is their robustness under the class of transformations for which MI remains invariant: i.e., type I (coordinate transformations), type III (marginalizations), and special cases of type IV (embeddings, products). Estimators that fail to meet these standards are not robust in their general applicability. Since most machine learning tasks employ transformations belonging to the classes referenced in part I, the mutual information can tell us which transformations are optimal. Several classes of estimation methods exist in the literature, such as the non-parametric estimator developed by Kraskov et al. and its improved versions. These estimators are extremely useful, since they rely only on the geometry of the underlying sample and circumvent estimating the probability distribution itself. We explore the robustness of this family of estimators in the context of our design criteria.
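The estimator family the abstract refers to is the Kraskov et al. (KSG) k-nearest-neighbour method, which estimates MI directly from inter-sample distances rather than from a fitted density. As an illustration only, here is a minimal Python sketch of KSG algorithm 1 (the function name ksg_mutual_information and the use of scipy's cKDTree are our own choices, not taken from the paper):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """Sketch of the KSG (Kraskov et al.) k-NN estimator of I(X;Y).

    x, y : paired samples; reshaped to (N, d_x) and (N, d_y).
    k    : number of nearest neighbours in the joint space.
    """
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])

    # Distance to the k-th nearest neighbour in the joint space,
    # under the max (Chebyshev) norm used by Kraskov et al.
    # k + 1 because each point is its own nearest neighbour.
    joint_tree = cKDTree(joint)
    eps = joint_tree.query(joint, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbours strictly within eps of each point
    # (shrink the radius slightly to enforce the strict inequality,
    # and subtract 1 to exclude the point itself).
    x_tree, y_tree = cKDTree(x), cKDTree(y)
    nx = np.array([len(x_tree.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(y_tree.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])

    # KSG algorithm 1:
    # I(X;Y) ~ psi(k) + psi(N) - < psi(n_x + 1) + psi(n_y + 1) >
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

A quick check of the robustness property discussed above: for a correlated bivariate Gaussian the true MI is −½ ln(1 − ρ²), and a type I (smooth, invertible coordinate) transformation of each marginal should leave the estimate approximately unchanged:

```python
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=2000)
x, y = xy[:, 0], xy[:, 1]

print(ksg_mutual_information(x, y))          # close to -0.5 * ln(1 - 0.8**2) ~ 0.511
print(ksg_mutual_information(np.exp(x), y**3))  # type I transform: estimate should barely move
```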
License

Unknown   
