Journal Article Details
NEUROCOMPUTING  Volume: 73
Improving liquid state machines through iterative refinement of the reservoir
Article
Norton, David [1]; Ventura, Dan [1]
[1] Brigham Young Univ, Dept Comp Sci, Provo, UT 84602 USA
Keywords: Spiking neural network; Liquid state machine; Recurrent network
DOI  :  10.1016/j.neucom.2010.08.005
Source: Elsevier
【 Abstract 】

Liquid state machines (LSMs) exploit the power of recurrent spiking neural networks (SNNs) without training the SNN. Instead, LSMs randomly generate this network and then use it as a filter for a generic machine learner. Previous research has shown that LSMs can yield competitive results; however, the process can require numerous time-consuming epochs before finding a viable filter. We have developed a method for iteratively refining these randomly generated networks, so that the LSM will yield a more effective filter in fewer epochs than the traditional method. We define a new metric for evaluating the quality of a filter before calculating the accuracy of the LSM. The LSM then uses this metric to drive a novel algorithm founded on principles integral to both Hebbian and reinforcement learning. We compare this new method with traditional LSMs across two artificial pattern recognition problems and two simplified problems derived from the TIMIT dataset. Depending on the problem, our method demonstrates improvements in accuracy ranging from 15% to almost 600%. (C) 2010 Elsevier B.V. All rights reserved.
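The abstract outlines the general approach: generate a random recurrent reservoir, score its quality with a separation-style metric before training the readout, and iteratively adjust the reservoir so that fewer epochs are needed to obtain a usable filter. The Python sketch below is a hypothetical, heavily simplified illustration of such a refinement loop, not the authors' algorithm: it uses a rate-based rather than spiking network, a crude centroid-distance score in place of the paper's filter-quality metric, and accept-if-better random perturbations in place of the Hebbian/reinforcement-inspired update rule. All function names and parameters are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def random_reservoir(n_neurons, density=0.1):
    """Generate a sparse random recurrent weight matrix (the 'liquid')."""
    mask = rng.random((n_neurons, n_neurons)) < density
    return mask * rng.normal(0.0, 1.0, (n_neurons, n_neurons))

def liquid_states(W, inputs, steps=20):
    """Run a toy rate-based recurrent network and return the final state per input."""
    states = []
    for x in inputs:
        s = np.zeros(W.shape[0])
        for _ in range(steps):
            s = np.tanh(W @ s + x)
        states.append(s)
    return np.array(states)

def separation(states, labels):
    """Crude stand-in for a separation metric: sum of distances between class centroids."""
    centroids = [states[labels == c].mean(axis=0) for c in np.unique(labels)]
    return sum(np.linalg.norm(a - b)
               for i, a in enumerate(centroids)
               for b in centroids[i + 1:])

def refine(W, inputs, labels, iters=50, scale=0.05):
    """Greedy refinement: keep random perturbations of existing weights that raise separation."""
    best = separation(liquid_states(W, inputs), labels)
    for _ in range(iters):
        candidate = W + scale * rng.normal(size=W.shape) * (W != 0)
        score = separation(liquid_states(candidate, inputs), labels)
        if score > best:  # accept only improving changes
            W, best = candidate, score
    return W

In this sketch, a refined reservoir returned by refine() would then be frozen and used as a fixed filter whose states feed a conventional readout learner, mirroring the LSM setup described in the abstract.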

【 License 】

Free   

【 Preview 】
Attachment List
File: 10_1016_j_neucom_2010_08_005.pdf (1487 KB, PDF)
Document Metrics
Downloads: 6   Views: 0