Journal Article Details
NEUROCOMPUTING Volume: 298
Local Lyapunov exponents of deep echo state networks
Article
Gallicchio, Claudio1  Micheli, Alessio1  Silvestri, Luca1 
[1] Univ Pisa, Dept Comp Sci, Largo Bruno Pontecorvo 3, I-56127 Pisa, Italy
Keywords: Reservoir Computing; Echo state network; Deep learning; Deep recurrent neural networks; Deep echo state network; Stability analysis; Lyapunov exponents
DOI  :  10.1016/j.neucom.2017.11.073
Source: Elsevier
【 Abstract 】

The analysis of deep Recurrent Neural Network (RNN) models represents a research area of increasing interest. In this context, the recent introduction of Deep Echo State Networks (DeepESNs) within the Reservoir Computing paradigm has enabled the study of the intrinsic properties of hierarchically organized RNN architectures. In this paper we investigate the DeepESN model from a dynamical systems perspective, aiming to characterize the important aspect of stability of layered recurrent dynamics excited by external input signals. To this end, we develop a framework based on the study of the local Lyapunov exponents of stacked recurrent models, enabling the analysis and control of the resulting dynamical regimes. The introduced framework is demonstrated on artificial as well as real-world datasets. The results of our analysis of DeepESNs provide interesting insights into the real effect of layering in RNNs. In particular, they show that when recurrent units are organized in layers, the resulting network intrinsically develops richer dynamical behavior that is naturally driven closer to the edge of criticality. As confirmed by experiments on the short-term Memory Capacity task, this characterization makes the layered design effective, with respect to a shallow counterpart with the same number of units, especially in tasks with demanding memory requirements. (C) 2018 Elsevier B.V. All rights reserved.
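The abstract describes estimating local Lyapunov exponents along input-driven reservoir trajectories. The following is a minimal sketch of that general idea, not the paper's exact framework: it assumes a standard single-layer tanh echo state network, x(t+1) = tanh(W_in u(t+1) + W x(t)), and estimates the maximum local Lyapunov exponent by evolving a tangent vector through the state-dependent Jacobian diag(1 − x²) W and averaging its log growth rate. All variable names and the chosen spectral radius (0.9) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_local_lyapunov(W_in, W, inputs, washout=100):
    """Estimate the maximum local Lyapunov exponent of a tanh ESN by
    evolving a unit tangent vector along the input-driven trajectory
    and averaging the per-step log stretch after a washout period."""
    n = W.shape[0]
    x = np.zeros(n)                       # reservoir state
    v = rng.standard_normal(n)            # tangent (perturbation) vector
    v /= np.linalg.norm(v)
    log_growth, steps = 0.0, 0
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        J = (1.0 - x**2)[:, None] * W     # Jacobian of the tanh map at x
        v = J @ v
        norm = np.linalg.norm(v)
        v /= norm                         # renormalize to avoid under/overflow
        if t >= washout:
            log_growth += np.log(norm)
            steps += 1
    return log_growth / steps

# Hypothetical small reservoir; W rescaled to spectral radius 0.9,
# a common contractive setting in which the exponent should be negative.
n_units = 50
W = rng.standard_normal((n_units, n_units))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.standard_normal((n_units, 1))
u = rng.uniform(-1.0, 1.0, size=1000)

lam = max_local_lyapunov(W_in, W, u)
print(lam)  # negative => contractive (stable) dynamical regime
```

A value of lam near zero would indicate operation close to the edge of criticality discussed in the abstract; extending the sketch to a DeepESN would amount to composing the Jacobians of the stacked reservoir layers.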

【 License 】

Free   

【 Preview 】
Attachment list
Files Size Format View
10_1016_j_neucom_2017_11_073.pdf 1422KB PDF download
Document metrics
Downloads: 4   Views: 0