Journal article details
NEUROCOMPUTING, Volume 422
DK-CNNs: Dynamic kernel convolutional neural networks
Article
Liu, Jialin1  Chao, Fei1,3  Lin, Chih-Min2  Zhou, Changle1  Shang, Changjing3 
[1] Xiamen Univ, Sch Informat, Dept Artificial Intelligence, Xiamen, Peoples R China
[2] Yuan Ze Univ, Dept Elect Engn, Taoyuan 320, Taiwan
[3] Aberystwyth Univ, Inst Math Phys & Comp Sci, Aberystwyth, Dyfed, Wales
Keywords: Deep neural networks; Convolutional neural networks; Convolution kernel
DOI: 10.1016/j.neucom.2020.09.005
Source: Elsevier
【 Abstract 】

This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN, which perform line-by-line scanning regular convolution to generate a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution to the DK weights, which depend on a latent variable, and discretizes the space of the latent variable to extend a new dimension; this process is named DK convolution. DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters, by searching for useful patterns within the newly extended dimension. In contrast to conventional convolution, which applies a fixed kernel to analyse changing features, DK convolution employs a dynamic kernel to analyse fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN architecture. The proposed DK-CNNs were compared with different network structures, with and without a latent dimension, on the CIFAR and FashionMNIST data sets. The experimental results show that DK-CNNs achieve better performance than regular CNNs. (c) 2020 Elsevier B.V. All rights reserved.
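The abstract describes the idea only at a high level. As a rough illustration of the mechanism, the sketch below makes the kernel a function of a latent variable t, discretises the latent space, and max-pools over the resulting extra response dimension. The linear blend of two base kernels used here is an illustrative parameterisation chosen for this sketch, not the paper's exact formulation:

```python
import numpy as np

def dk_conv1d(x, w0, w1, latent_vals):
    """Sketch of a dynamic-kernel 1-D convolution (valid padding).

    The kernel is a function of a latent variable t, here a simple
    linear blend w(t) = (1 - t) * w0 + t * w1. The latent space is
    discretised into `latent_vals`, giving one response map per
    latent value (the "extended dimension"), which is then searched
    by max-pooling.
    """
    responses = []
    for t in latent_vals:
        w = (1 - t) * w0 + t * w1              # dynamic kernel w(t)
        responses.append(np.correlate(x, w, mode="valid"))
    # Search the extended latent dimension for the strongest pattern.
    return np.stack(responses).max(axis=0)

x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
w0 = np.array([1.0, 0.0, -1.0])   # edge-like base kernel
w1 = np.array([0.0, 1.0, 0.0])    # identity-like base kernel
y = dk_conv1d(x, w0, w1, latent_vals=np.linspace(0.0, 1.0, 5))
print(y)  # one response per valid position, maximised over t
```

Because each response map is linear in t here, the maximum at each position is attained at an endpoint of the latent interval; a richer (e.g. periodic) parameterisation would make the intermediate discretisation points informative as well.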

【 License 】

Free   

【 Preview 】
Attachment list
Files | Size | Format | View
10_1016_j_neucom_2020_09_005.pdf | 1623KB | PDF | download
  Document metrics
  Downloads: 7   Views: 1