Journal Article Details
A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
Article; Early Access
Keywords: BFGS METHOD; GLOBAL CONVERGENCE; ALGORITHM; DESCENT; STEPSIZE
DOI: 10.1007/s11075-023-01559-0
Source: SCIE
【 Abstract 】

In this paper, based on limited memory techniques and subspace minimization conjugate gradient (SMCG) methods, a regularized limited memory subspace minimization conjugate gradient method is proposed, which contains two types of iterations. In the SMCG iteration, the search direction is obtained by minimizing an approximate quadratic model or an approximate regularization model. In the RQN iteration, a modified regularized quasi-Newton method, which combines a regularization technique with the BFGS method, is used in the subspace to improve orthogonality. Moreover, some simple acceleration criteria and an improved strategy for selecting the initial stepsize are designed to enhance the efficiency of the algorithm. Additionally, a generalized nonmonotone line search is utilized, and the global convergence of the proposed algorithm is established under mild conditions. Finally, numerical results show that the proposed algorithm improves significantly on ASMCG_PR and outperforms the well-known limited memory conjugate gradient software packages CG_DESCENT (6.8) and CGOPT (2.0) on the CUTEr library.
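The abstract does not reproduce the models it refers to. For orientation, in SMCG-type methods the search direction is typically obtained by minimizing an approximation of the objective over a low-dimensional subspace spanned by the current negative gradient and the previous step; the sketch below uses notation that is standard in the SMCG literature and is assumed here, not taken from the paper:

\[
d_k=\arg\min_{d\in\Omega_k}\; g_k^{\top}d+\tfrac{1}{2}\,d^{\top}B_k d,
\qquad
\Omega_k=\operatorname{Span}\{-g_k,\ s_{k-1}\},
\]

where \(g_k=\nabla f(x_k)\), \(s_{k-1}=x_k-x_{k-1}\), and \(B_k\) is a Hessian approximation; a regularized variant adds a term such as \(\tfrac{\sigma_k}{3}\|d\|^{3}\) to safeguard the model when it is unreliable (the exact regularization used in the paper may differ).

Likewise, one common instance of a generalized nonmonotone line search is the Zhang-Hager rule, in which the Armijo test compares against a convex combination of past function values rather than the latest one. A minimal sketch, assuming this variant (all names and parameters are illustrative, not the paper's):

```python
import numpy as np

def nonmonotone_line_search(f, x, d, g, C, Q, eta=0.85, delta=1e-4,
                            alpha0=1.0, shrink=0.5, max_backtracks=50):
    """Backtrack until f(x + alpha*d) <= C + delta*alpha*(g'd), then update
    the nonmonotone reference pair (C, Q) in the Zhang-Hager fashion."""
    gd = float(np.dot(g, d))               # directional derivative
    assert gd < 0.0, "d must be a descent direction"
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= C + delta * alpha * gd:
            break
        alpha *= shrink                    # shrink the trial stepsize
    f_new = f(x + alpha * d)
    Q_new = eta * Q + 1.0                  # running weight of past values
    C_new = (eta * Q * C + f_new) / Q_new  # convex combination of f-values
    return alpha, C_new, Q_new
```

Initialized with C = f(x_0) and Q = 1, this reduces to the monotone Armijo rule when eta = 0; values of eta closer to 1 accept more aggressive steps, which is often useful for conjugate gradient directions.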

【 License 】

Free   
