Journal Article Details
NEUROCOMPUTING, Volume 162
DropELM: Fast neural network regularization with Dropout and DropConnect
Article
Iosifidis, Alexandros [1]; Tefas, Anastasios [1]; Pitas, Ioannis [1]
[1] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
Keywords: Single Hidden Layer Feedforward Networks; Extreme Learning Machine; Regularization; Dropout; DropConnect
DOI: 10.1016/j.neucom.2015.04.006
Source: Elsevier
【 Abstract 】

In this paper, we propose an extension of the Extreme Learning Machine (ELM) algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, which is adopted by the proposed DropELM network. The proposed algorithm is thus able to exploit Dropout and DropConnect regularization without computationally intensive iterative weight tuning. We show that the adoption of such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach in several recently proposed ELM algorithms and show that their performance can be enhanced without much additional computational cost.
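To make the idea concrete, the sketch below shows how a dropout-style penalty can be folded into the closed-form ELM output-weight solution, so that no iterative mask sampling or weight tuning is needed. This is an illustrative assumption, not the paper's exact formulation: the function name, parameters, and the specific penalty (a marginalized-dropout term weighted by the diagonal of H^T H, in the style of Wager et al., 2013) are hypothetical choices for the example; the precise DropELM regularizer is derived in the paper itself.

```python
import numpy as np

def drop_elm_sketch(X, T, n_hidden=100, lam=1e-2, p=0.5, seed=None):
    """Closed-form ELM output weights with a dropout-style penalty.

    Illustrative sketch only: the exact DropELM regularizer in the
    paper differs in its derivation and form.
    """
    rng = np.random.default_rng(seed)
    # Random, untrained input weights and biases (standard ELM).
    W_in = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)  # hidden-layer outputs, shape (N, n_hidden)

    # Ridge term plus a marginalized-dropout term: dropping hidden units
    # with probability p contributes a penalty weighted by diag(H^T H),
    # so the dropout effect enters the solution analytically.
    HtH = H.T @ H
    reg = lam * np.eye(n_hidden) + (p / (1.0 - p)) * np.diag(np.diag(HtH))

    # Output weights in a single linear solve; no iterative tuning.
    beta = np.linalg.solve(HtH + reg, H.T @ T)
    return W_in, b, beta
```

Because the regularization effect is absorbed into the linear system, training remains a single matrix solve, which is the computational advantage over mask-sampling Dropout/DropConnect that the abstract highlights.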

【 License 】

Free   

【 Preview 】
Attachment list
Files  Size  Format
10_1016_j_neucom_2015_04_006.pdf  852KB  PDF