Journal Article Details
Entropy
On Accuracy of PDF Divergence Estimators and Their Applicability to Representative Data Sampling
Marcin Budka [1]; Bogdan Gabrys [1]
[1] Smart Technology Research Group, Bournemouth University, School of Design, Engineering and Computing, Poole House, Talbot Campus, Fern Barrow, Poole BH12 5BB, UK
Keywords: cross-validation; divergence estimation; generalisation error estimation; Kullback-Leibler divergence; sampling
DOI: 10.3390/e13071229
Source: MDPI
【 Abstract 】

Generalisation error estimation is an important issue in machine learning. Cross-validation, traditionally used for this purpose, requires building multiple models and repeating the whole procedure many times in order to produce reliable error estimates. It is, however, possible to accurately estimate the error using only a single model, if the training and test data are chosen appropriately. This paper investigates the possibility of using various probability density function (PDF) divergence measures for the purpose of representative data sampling. As it turns out, the first difficulty one needs to deal with is the estimation of the divergence itself. In contrast to other publications on this subject, the experimental results provided in this study show that in many cases accurate estimation is not possible unless samples consisting of thousands of instances are used. Exhaustive experiments on divergence-guided representative data sampling have been performed using 26 publicly available benchmark datasets and 70 PDF divergence estimators, and their results are analysed and discussed.
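The sample-size sensitivity the abstract describes can be illustrated with one well-known member of this estimator family: the nearest-neighbour Kullback-Leibler divergence estimator of Wang, Kulkarni and Verdú (2009). The sketch below is purely illustrative (it is not claimed to be one of the 70 estimators benchmarked in the paper) and is restricted to one-dimensional samples for simplicity; the estimate of KL(P || Q) is built from each point's distance to its nearest neighbour in its own sample versus in the other sample.

```python
import bisect
import math
import random

def knn_kl_divergence(x, y):
    """1-nearest-neighbour estimate of KL(P || Q) from 1-D samples x ~ P, y ~ Q,
    following Wang, Kulkarni & Verdu (2009). Uses sorted arrays for O(n log n)."""
    n, m = len(x), len(y)
    xs, ys = sorted(x), sorted(y)
    total = 0.0
    for i, xi in enumerate(xs):
        # rho: distance to the nearest *other* point in the P-sample
        # (adjacent elements in the sorted array are the only candidates)
        cand = []
        if i > 0:
            cand.append(xi - xs[i - 1])
        if i < n - 1:
            cand.append(xs[i + 1] - xi)
        rho = max(min(cand), 1e-12)  # clamp to avoid log(0) on duplicates
        # nu: distance to the nearest point in the Q-sample
        j = bisect.bisect_left(ys, xi)
        cand = []
        if j > 0:
            cand.append(xi - ys[j - 1])
        if j < m:
            cand.append(ys[j] - xi)
        nu = max(min(cand), 1e-12)
        total += math.log(nu / rho)
    return total / n + math.log(m / (n - 1))

# For P = N(0, 1) and Q = N(1, 1) the true divergence is 0.5; the estimate
# typically stabilises near that value only once samples reach the thousands.
random.seed(0)
for n in (100, 1000, 5000):
    p = [random.gauss(0.0, 1.0) for _ in range(n)]
    q = [random.gauss(1.0, 1.0) for _ in range(n)]
    print(n, round(knn_kl_divergence(p, q), 3))
```

Running the loop with increasing sample sizes gives a quick feel for why the paper finds small-sample divergence estimation unreliable: the variance of the per-point log-ratio terms only averages out once n is large.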

【 License 】

CC BY
This is an open access article distributed under the Creative Commons Attribution License (CC BY) which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

【 Preview 】
Attachments
Files Size Format View
RO202003190048960ZK.pdf 4211 KB PDF download