Journal article details
PHYSICA D-NONLINEAR PHENOMENA, Volume 414
Memory and forecasting capacities of nonlinear recurrent networks
Article
Gonon, Lukas1  Grigoryeva, Lyudmila2  Ortega, Juan-Pablo3,4 
[1] Ludwig Maximilians Univ Munchen, Fac Math Informat & Stat, Theresienstr 39, D-80333 Munich, Germany
[2] Univ Konstanz, Dept Math & Stat, Univ Str 10, D-78457 Constance, Germany
[3] Univ Sankt Gallen, Fac Math & Stat, Bodanstr 6, CH-9000 St Gallen, Switzerland
[4] Ctr Natl Rech Sci CNRS, Paris, France
Keywords: Memory capacity; Forecasting capacity; Recurrent neural network; Reservoir computing; Echo state network (ESN); Machine learning
DOI: 10.1016/j.physd.2020.132721
Source: Elsevier
【 Abstract 】

The notion of memory capacity, originally introduced for echo state and linear networks with independent inputs, is generalized to nonlinear recurrent networks with stationary but dependent inputs. The presence of dependence in the inputs naturally leads to the notion of network forecasting capacity, which measures the extent to which future time series values can be forecast from the network states. Generic bounds for the memory and forecasting capacities are formulated in terms of the number of neurons of the nonlinear recurrent network and the autocovariance function or the spectral density of the input. These bounds generalize well-known estimates in the literature to the dependent-inputs setup. Finally, for the particular case of linear recurrent networks with independent inputs, it is proved that the memory capacity is given by the rank of the associated controllability matrix, a fact long assumed by the community without proof. (c) 2020 The Authors. Published by Elsevier B.V.
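The rank result for linear networks can be checked numerically. Below is a minimal sketch (not from the paper; the reservoir size, the lag cutoff 2N, and all variable names are illustrative assumptions): a random contractive linear network x_t = A x_{t-1} + c z_t is driven by i.i.d. input, the memory capacity is estimated as the sum over lags k of the R² obtained when linearly reading z_{t-k} out of the state x_t, and the estimate is compared with the rank of the controllability matrix [c, Ac, ..., A^{N-1}c].

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 5, 20000

# Random contractive linear reservoir: x_t = A x_{t-1} + c * z_t
A = 0.5 * rng.standard_normal((N, N)) / np.sqrt(N)
c = rng.standard_normal(N)

# Controllability matrix [c, Ac, ..., A^{N-1}c]; its rank is the claimed capacity
cols = [c]
for _ in range(N - 1):
    cols.append(A @ cols[-1])
rank = np.linalg.matrix_rank(np.column_stack(cols))

# Drive the network with i.i.d. standard normal input
z = rng.standard_normal(T)
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = A @ x[t - 1] + c * z[t]

# Empirical memory capacity: sum of R^2 over lags when regressing z_{t-k} on x_t
burn = 200
X = x[burn:]
mc = 0.0
for k in range(2 * N):
    y = z[burn - k: T - k]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    mc += np.corrcoef(X @ w, y)[0, 1] ** 2

print(rank, round(mc, 2))
```

For a generic random pair (A, c) the controllability matrix has full rank N, and the empirical capacity should come out close to that rank, consistent with the theorem stated in the abstract.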

【 License 】

Free   

【 Preview 】

Attachments:
10_1016_j_physd_2020_132721.pdf (791 KB, PDF)