Journal Article Details
JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, Volume 261
Two modified scaled nonlinear conjugate gradient methods
Article
Babaie-Kafaki, Saman
Keywords: Unconstrained optimization; Scaled nonlinear conjugate gradient method; BFGS update; Modified secant equation; Sufficient descent condition; Global convergence
DOI  :  10.1016/j.cam.2013.11.001
Source: Elsevier
【 Abstract 】

Following the scaled conjugate gradient methods proposed by Andrei, we hybridize the memoryless BFGS-preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez, based on a modified secant equation suggested by Yuan, and propose two modified scaled conjugate gradient methods. A notable feature of our methods is that they use function values in addition to gradient values, and that the generated search directions satisfy the sufficient descent condition, which leads to global convergence for uniformly convex functions. Numerical comparisons between an implementation of one of our methods, which generates descent search directions for general functions, and an efficient scaled conjugate gradient method proposed by Andrei are made on a set of unconstrained optimization test problems from the CUTEr collection, using the performance profile introduced by Dolan and Moré. (C) 2013 Elsevier B.V. All rights reserved.
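To illustrate the general family the abstract refers to, the sketch below implements a generic spectral (scaled) conjugate gradient iteration in the spirit of Birgin and Martínez, with a Barzilai–Borwein-type scaling parameter, a Perry-type conjugacy parameter, and a restart safeguard enforcing descent. This is a minimal illustrative sketch, not the paper's two modified methods: the function names, the standard secant pair (rather than Yuan's modified secant equation), and the simple Armijo backtracking line search are all assumptions for demonstration.

```python
import numpy as np

def spectral_cg(f, grad, x0, max_iter=200, tol=1e-8, c=1e-4):
    """Generic spectral (scaled) CG sketch; NOT the paper's exact method."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Safeguard: restart with steepest descent when d fails to be
        # a descent direction (d^T g must be negative).
        if d @ g > -1e-10 * gnorm2:
            d = -g
        # Armijo backtracking line search (capped at 50 halvings).
        alpha, fx = 1.0, f(x)
        for _ in range(50):
            if f(x + alpha * d) <= fx + c * alpha * (g @ d):
                break
            alpha *= 0.5
        s = alpha * d                 # step s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_k
        sy = s @ y
        # Spectral scaling theta = s^T s / s^T y (Barzilai-Borwein).
        theta = (s @ s) / sy if sy > 1e-12 else 1.0
        # Perry/Birgin-Martinez-type conjugacy parameter.
        beta = ((theta * y - s) @ g_new) / sy if sy > 1e-12 else 0.0
        d = -theta * g_new + beta * s # scaled CG search direction
        x, g = x_new, g_new
    return x

# Usage on a convex quadratic: minimize 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = spectral_cg(f, grad, np.zeros(2))
```

On uniformly convex problems such as this quadratic, the restart safeguard is rarely triggered; it is the mechanism that, in descent-condition analyses like the one the abstract mentions, underpins global convergence arguments.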

【 License 】

Free   

【 Preview 】
Attachment List
Files Size Format View
10_1016_j_cam_2013_11_001.pdf 530KB PDF download
Document Metrics
Downloads: 0   Views: 1