Thesis Details
Learning algorithms for neural networks
Atiya, Amir; Abu-Mostafa, Yaser S.
University: California Institute of Technology
Department: Engineering and Applied Science
Keywords: backpropagation; clustering; convergence of neural networks; Hopfield model; oscillations; oscillations of neural networks; recurrent neural networks; stability of neural networks; training algorithms; unsupervised learning
Full text: https://thesis.library.caltech.edu/3725/1/Atiya_a_1991.pdf
United States | English
Source: Caltech THESIS
【 Abstract 】

This thesis deals mainly with the development of new learning algorithms and the study of the dynamics of neural networks. We develop a method for training feedback neural networks: appropriate stability conditions are derived, and learning is performed by gradient descent. We develop a new associative memory model using Hopfield's continuous feedback network, demonstrate some of the storage limitations of the Hopfield network, and develop alternative architectures and an algorithm for designing the associative memory. We propose a new unsupervised learning method for neural networks, based on repeatedly applying gradient ascent to a defined criterion function. We study some of the dynamical aspects of Hopfield networks: new stability results are derived, and oscillations and synchronizations in several architectures are studied and related to recent findings in biology. Finally, the problem of recording the outputs of real neural networks is considered, and a new method for the detection and recognition of the recorded neural signals is proposed.
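As context for the abstract's references to Hopfield's continuous feedback network and its storage limitations, the sketch below simulates the standard continuous Hopfield dynamics du/dt = -u + W f(u) with a Hebbian outer-product weight matrix and associative recall from a noisy cue. This is a generic illustration of the model class the thesis studies, not the thesis's own design algorithm; the gain, stored patterns, noise level, and Euler step size are illustrative assumptions.

```python
import numpy as np

# Standard continuous Hopfield dynamics:  du/dt = -u + W f(u),  f = tanh.
# Illustrative sketch only; parameters below are assumptions.

GAIN = 4.0  # slope of the nonlinearity; must exceed 1 for pattern attractors

def f(u):
    return np.tanh(GAIN * u)

def simulate(W, u0, dt=0.01, steps=4000):
    """Integrate du/dt = -u + W f(u) with forward Euler."""
    u = u0.copy()
    for _ in range(steps):
        u += dt * (-u + W @ f(u))
    return u

# Two orthogonal +/-1 patterns stored by the outer-product (Hebbian) rule,
# the classical construction whose storage limitations the thesis examines.
patterns = np.array([[ 1, -1,  1, -1,  1, -1],
                     [ 1,  1, -1, -1,  1,  1]], dtype=float)
n = patterns.shape[1]
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)  # symmetric, zero-diagonal weights give convergence

# Recall: start from a noisy version of the first pattern and let the
# network settle to a fixed point.
rng = np.random.default_rng(0)
u0 = patterns[0] + 0.4 * rng.standard_normal(n)
u_final = simulate(W, u0)
print("recovered:", np.sign(f(u_final)))  # should match patterns[0]
print("stored:   ", patterns[0])
```

Because W is symmetric with zero diagonal and f is monotone, the network admits an energy function and settles to a fixed point rather than oscillating; the abstract's stability conditions and oscillation results concern precisely what happens when such assumptions are relaxed.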

【 Preview 】
Attachments
Files                                    | Size    | Format | View
Learning algorithms for neural networks | 5960 KB | PDF    | download
Document metrics
Downloads: 6    Views: 14