Journal Article Details
NEUROCOMPUTING, Vol. 412
Learning longer-term dependencies via grouped distributor unit
Article
Luo, Wei1  Yu, Feng1 
[1] Zhejiang Univ, Coll Biomed Engn & Instrument Sci, Yuquan Campus,38 Zheda Rd, Hangzhou 310027, Peoples R China
Keywords: Recurrent neural network;    Sequence learning;    Long-term memory;
DOI: 10.1016/j.neucom.2020.06.105
Source: Elsevier
【 Abstract 】

Learning long-term dependencies remains difficult for recurrent neural networks (RNNs) despite their recent success in sequence modeling. In this paper, we propose a novel gated RNN structure that contains only one gate. Hidden states in the proposed grouped distributor unit (GDU) are partitioned into groups. For each group, the proportion of memory to be overwritten in each state transition is limited to a constant and is adaptively distributed to each group member. In other words, every separate group has a fixed overall update rate, yet all units are allowed to have different paces. Information is therefore forced to be latched in a flexible way, which helps the model capture long-term dependencies in data. Besides having a simpler structure, the GDU is demonstrated experimentally to outperform other models such as LSTM and GRU on both pathological problems and tasks on natural datasets. (c) 2020 Elsevier B.V. All rights reserved.
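The abstract's mechanism, one gate whose within-group softmax distributes a fixed per-group update budget among the group's units, can be sketched roughly as follows. This is a minimal illustration assuming a GRU-like convex-combination update; the function and weight names (`gdu_step`, `Wg`, `Wc`, etc.) and the budget constant `alpha` are hypothetical, not taken from the paper's code.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def gdu_step(h, x, Wg, Ug, bg, Wc, Uc, bc, group_size=4, alpha=1.0):
    """One GDU-style state transition (illustrative sketch, not the authors' code).

    h: hidden state, shape (hidden_dim,); x: input, shape (input_dim,).
    Hidden units are partitioned into consecutive groups of `group_size`.
    Within each group, a softmax adaptively splits a fixed total update
    budget `alpha` among the members, so every group's overall update
    rate is constant while individual units may update at different paces.
    """
    # The single gate: pre-activation scores for distributing the budget.
    s = Wg @ x + Ug @ h + bg
    z = np.empty_like(h)
    for g in range(0, h.size, group_size):
        # Each group's per-unit update proportions sum to alpha.
        z[g:g + group_size] = alpha * softmax(s[g:g + group_size])
    # Candidate activation, as in GRU-style cells.
    c = np.tanh(Wc @ x + Uc @ h + bc)
    # Convex combination: only a bounded fraction of memory is overwritten.
    return (1.0 - z) * h + z * c
```

Because each unit's update rate `z` lies in (0, alpha], old state is never fully erased in a single step, which is how the fixed budget forces information to be latched over long horizons.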

【 License 】

Free   

【 Preview 】
Attachment list
Files Size Format View
10_1016_j_neucom_2020_06_105.pdf 1228KB PDF download
  Article metrics
  Downloads: 0    Views: 0