Journal Article Details
PATTERN RECOGNITION, Vol. 77
Training neural network classifiers through Bayes risk minimization applying unidimensional Parzen windows
Article
Lazaro, Marcelino1  Hayes, Monson H.2  Figueiras-Vidal, Anibal R.1 
[1] Univ Carlos III Madrid, Signal Theory & Commun Dept, Getafe, Spain
[2] George Mason Univ, Dept Elect & Comp Engn, Fairfax, VA 22030 USA
Keywords: Bayes risk; Parzen windows; Binary classification
DOI: 10.1016/j.patcog.2017.12.018
Source: Elsevier
【 Abstract 】

A new training algorithm for neural networks in binary classification problems is presented. It is based on the minimization of an estimate of the Bayes risk by using Parzen windows applied to the final one-dimensional nonlinear transformation of the samples to estimate the probability of classification error. This leads to a very general approach to error minimization and training, where the risk that is to be minimized is defined in terms of integrated one-dimensional Parzen windows, and the gradient descent algorithm used to minimize this risk is a function of the window that is used. By relaxing the constraints that are typically applied to Parzen windows when used for probability density function estimation, for example by allowing them to be non-symmetric or possibly infinite in duration, an entirely new set of training algorithms emerge. In particular, different Parzen windows lead to different cost functions, and some interesting relationships with classical training methods are discovered. Experiments with synthetic and real benchmark datasets show that with the appropriate choice of window, fitted to the specific problem, it is possible to improve the performance of neural network classifiers over those that are trained using classical methods. (C) 2017 Elsevier Ltd. All rights reserved.
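The abstract's core idea can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: a linear "network" with a one-dimensional output is trained by gradient descent on a smoothed estimate of the classification error, where each sample's contribution is the integrated (Gaussian) Parzen window evaluated at its signed output. The dataset, window width `h`, learning rate, and the choice of a Gaussian window are all assumptions for the sketch; the paper's point is precisely that other window shapes yield other cost functions.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: two Gaussian clouds in 2-D.
n = 200
X0 = rng.normal(loc=(-2.0, 0.0), scale=1.0, size=(n, 2))   # class 0
X1 = rng.normal(loc=(+2.0, 0.0), scale=1.0, size=(n, 2))   # class 1
X = np.vstack([X0, X1])
s = np.hstack([-np.ones(n), np.ones(n)])                    # signed labels: -1 / +1

# Integrated Gaussian Parzen window (standard normal CDF) and its derivative (pdf).
Phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
phi = lambda t: np.exp(-0.5 * t ** 2) / math.sqrt(2.0 * math.pi)

h = 1.0          # window width: a free design parameter in this framework
lr = 0.5
w = np.array([0.1, -0.1])
b = 0.0

for _ in range(300):
    y = X @ w + b            # final one-dimensional output of the (linear) model
    z = -s * y / h           # a sample is misclassified when s * y < 0
    g = phi(z) * (-s / h)    # derivative of the per-sample smoothed error w.r.t. y
    w -= lr * (g[:, None] * X).mean(axis=0)
    b -= lr * g.mean()

# Smoothed risk estimate and hard training accuracy after training.
risk = Phi(-s * (X @ w + b) / h).mean()
acc = ((X @ w + b > 0) == (s > 0)).mean()
print(f"estimated risk: {risk:.3f}, training accuracy: {acc:.3f}")
```

As `h` shrinks, `Phi(-s*y/h)` approaches the hard 0/1 error count; a wider window smooths the cost and changes which samples drive the gradient, which is the lever the paper exploits when fitting the window to the problem.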

【 License 】

Free

【 Preview 】
Attachments
File: 10_1016_j_patcog_2017_12_018.pdf (841 KB, PDF)
Document metrics
Downloads: 5; Views: 0