Journal Article Details
NEUROCOMPUTING, Volume 313
Combining multiple algorithms in classifier ensembles using generalized mixture functions
Article
Costa, Valdigleis S. [1]; Farias, Antonio Diego S. [1,2]; Bedregal, Benjamin [1]; Santiago, Regivan H. N. [1]; Canuto, Anne Magaly de P. [1]
[1] Fed Univ Rio Grande Norte UFRN, Dept Informat & Appl Math, BR-59072970 Natal, RN, Brazil
[2] Fed Rural Univ Semiarid UFERSA, Dept Exacts & Nat Sci, BR-59900000 Pau Dos Ferros, RN, Brazil
Keywords: Classifier ensembles; Aggregation functions; Pre-aggregation functions; Generalized mixture functions
DOI: 10.1016/j.neucom.2018.06.021
Source: Elsevier
【 Abstract 】

Classifier ensembles are pattern recognition structures composed of a set of classification algorithms (members), organized in parallel, and a combination method, with the aim of increasing the classification accuracy of a classification system. In this study, we investigate the application of generalized mixture (GM) functions as a new approach for providing an efficient combination procedure for these systems through the use of dynamic weights in the combination process. We present three GM functions to be applied as combination methods. The main advantage of these functions is that they can define dynamic weights over the member outputs, making the combination process more efficient. In order to evaluate the feasibility of the proposed approach, an empirical analysis is conducted, applying classifier ensembles to 25 different classification data sets. In this analysis, we compare the proposed approaches to ensembles using traditional combination methods as well as state-of-the-art ensemble methods. Our findings indicate gains in performance over the traditional combination methods and results comparable to the state-of-the-art methods. (C) 2018 Elsevier B.V. All rights reserved.
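The abstract does not specify the three GM functions proposed in the paper, but the general idea can be illustrated: a generalized mixture function combines the member outputs with weights that are themselves computed from those outputs, so the weighting adapts to each input sample rather than being fixed in advance. The sketch below is a minimal, hypothetical illustration; the weighting rule used here (weights proportional to each member's agreement with the per-class median) is an assumption for demonstration and is not taken from the paper.

```python
# Minimal sketch of combining classifier-ensemble outputs with a
# generalized mixture (GM) style function: the weights depend on the
# inputs themselves (dynamic weights), so each sample can weight the
# ensemble members differently.  The weighting rule `h` below is a
# hypothetical choice for illustration, not one of the paper's three
# GM functions.
import numpy as np

def gm_combine(scores: np.ndarray) -> np.ndarray:
    """Combine member outputs with dynamic, input-dependent weights.

    scores : array of shape (n_members, n_classes); each row holds one
             classifier's posterior estimates for a single sample.
    Returns the combined score vector of shape (n_classes,).
    """
    # h: the closer a member's score is to the per-class median of all
    # members, the larger its weight (assumed rule for illustration).
    median = np.median(scores, axis=0)                 # (n_classes,)
    h = 1.0 / (1.0 + np.abs(scores - median))          # (n_members, n_classes)
    weights = h / h.sum(axis=0, keepdims=True)         # weights sum to 1 per class
    return (weights * scores).sum(axis=0)              # weighted combination

# Usage: three members, two classes
scores = np.array([[0.9, 0.1],
                   [0.6, 0.4],
                   [0.2, 0.8]])
combined = gm_combine(scores)
predicted_class = int(np.argmax(combined))
```

In contrast to a fixed weighted average, the weights here are recomputed for every sample from the member outputs, which is the property the abstract highlights as the main advantage of GM-based combination.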

【 License 】

Free   

【 Preview 】
Attachments
Files	Size	Format
10_1016_j_neucom_2018_06_021.pdf	1011KB	PDF