Journal Article Details
PATTERN RECOGNITION, Vol. 38
Algorithms and networks for accelerated convergence of adaptive LDA
Article
Moghaddam, HA ; Matinfar, M ; Sadough, SMS ; Zadeh, KA
Keywords: adaptive linear discriminant analysis; adaptive principal component analysis; gradient descent optimization; steepest descent optimization; conjugate direction optimization; Newton-Raphson optimization; self-organizing neural network; convergence analysis
DOI: 10.1016/j.patcog.2004.07.003
Source: Elsevier
【 Abstract 】

We introduce and discuss new accelerated algorithms for linear discriminant analysis (LDA) in unimodal, multiclass Gaussian data. These algorithms use a variable step size, optimally computed in each iteration using the (i) steepest descent, (ii) conjugate direction, and (iii) Newton-Raphson methods, in order to accelerate the convergence of the algorithm. Current adaptive methods based on the gradient descent optimization technique use a fixed or a monotonically decreasing step size in each iteration, which results in a slow convergence rate. Furthermore, the convergence of these algorithms depends on appropriate choices of the step sizes. The new algorithms have the advantage of automatic optimal selection of the step size using the current data samples. Based on the new adaptive algorithms, we present self-organizing neural networks for adaptive computation of Σ^(-1/2) and use them in cascaded form with a PCA network for LDA. Experimental results demonstrate fast convergence and robustness of the new algorithms and justify their advantages for on-line pattern recognition applications with stationary and non-stationary multidimensional input data. (C) 2004 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
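The abstract does not spell out the recursion, but a common fixed-step baseline in the adaptive LDA literature for estimating Σ^(-1/2) is the gradient-descent update W ← W + η(I − W x xᵀ W), whose stationary point satisfies W Σ W = I, i.e., W = Σ^(-1/2). The sketch below is a minimal, hypothetical illustration of that baseline (the function name adaptive_sqrt_inv and the eta value are illustrative, not from the paper); the paper's contribution is replacing the fixed η with a per-iteration optimal step size computed by steepest descent, conjugate direction, or Newton-Raphson.

```python
import numpy as np

def adaptive_sqrt_inv(samples, eta=0.005):
    """Streaming estimate of Sigma^(-1/2) from zero-mean samples.

    Fixed-step gradient-descent recursion:
        W <- W + eta * (I - W x x^T W)
    In expectation the update drives W Sigma W toward I, so W converges
    to Sigma^(-1/2). This is a hedged baseline sketch; the paper's
    accelerated algorithms choose the step size optimally each iteration.
    """
    d = samples.shape[1]
    W = np.eye(d)      # symmetric positive-definite initial guess
    I = np.eye(d)
    for x in samples:
        Wx = W @ x.reshape(-1, 1)
        # Rank-one stochastic update; W stays symmetric if it starts symmetric.
        W = W + eta * (I - Wx @ Wx.T)
    return W

# Demo on synthetic Gaussian data: W @ Sigma @ W should approach the identity.
rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(2), Sigma, size=50000)
W = adaptive_sqrt_inv(X)
print(np.round(W @ Sigma @ W, 2))   # ~ [[1, 0], [0, 1]]
```

In the cascaded architecture the abstract describes, the output of this whitening stage feeds an adaptive PCA network, and the two stages together compute the LDA features.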

【 License 】

Free   

【 Preview 】
Attachment List
File                                Size    Format
10_1016_j_patcog_2004_07_003.pdf    332 KB  PDF