Journal Article Details
Entropy
Mixture of Experts with Entropic Regularization for Data Classification
Alvaro Soto [1];  Ariel Saavedra [2];  Luis Caro [2];  Billy Peralta [3]
[1] Department of Computer Sciences, Pontifical Catholic University of Chile, Santiago 7820436, Chile
[2] Department of Engineering Informatics, Catholic University of Temuco, Temuco 4781312, Chile
[3] Department of Engineering Science, Andres Bello University, Santiago 7500971, Chile
Keywords: mixture-of-experts; regularization; entropy; classification
DOI: 10.3390/e21020190
Source: DOAJ
【 Abstract 】

Today, there is growing interest in the automatic classification of a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. “Mixture-of-experts” is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gating network, typically based on softmax functions, that allows complex patterns in the data to be learned. In this scheme, each data point is influenced by only one expert; as a result, the training process can be misguided on real datasets in which complex data need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network, in order to avoid a “winner-takes-all” output of the gating network. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3–6% on some datasets. In future work, we plan to embed feature selection into this model.
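
The regularization described in the abstract amounts to subtracting the Shannon entropy of the gating distribution from the classification loss. Below is a minimal sketch of that idea, not the authors' implementation: it assumes PyTorch, linear experts and gate, and an illustrative regularization weight `lambda_entropy`; all names and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntropicMoE(nn.Module):
    """Mixture-of-experts classifier with a softmax gating network (sketch)."""
    def __init__(self, in_dim, num_classes, num_experts=4):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Linear(in_dim, num_classes) for _ in range(num_experts)]
        )
        self.gate = nn.Linear(in_dim, num_experts)

    def forward(self, x):
        gate_probs = F.softmax(self.gate(x), dim=-1)                      # (B, E)
        expert_probs = torch.stack(
            [F.softmax(e(x), dim=-1) for e in self.experts], dim=1)       # (B, E, C)
        # Mixture output: gate-weighted combination of expert class probabilities.
        mix_probs = (gate_probs.unsqueeze(-1) * expert_probs).sum(dim=1)  # (B, C)
        return mix_probs, gate_probs

def entropic_moe_loss(mix_probs, gate_probs, targets, lambda_entropy=0.1):
    # Classification term: negative log-likelihood of the mixture output.
    nll = F.nll_loss(torch.log(mix_probs + 1e-12), targets)
    # Shannon entropy of the gate distribution, averaged over the batch.
    gate_entropy = -(gate_probs * torch.log(gate_probs + 1e-12)).sum(dim=-1).mean()
    # Subtracting the entropy penalizes near one-hot ("winner-takes-all") gates.
    return nll - lambda_entropy * gate_entropy

# Illustrative usage on random data.
model = EntropicMoE(in_dim=20, num_classes=3)
x, y = torch.randn(8, 20), torch.randint(0, 3, (8,))
mix_probs, gate_probs = model(x)
loss = entropic_moe_loss(mix_probs, gate_probs, y)
loss.backward()
```

Because the entropy term is subtracted, minimizing the loss rewards gate distributions that spread responsibility across several experts, which is the behavior the abstract contrasts with the winner-takes-all tendency of the standard model.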

【 License 】

Unknown   
