AIMS Mathematics
A new three-term conjugate gradient algorithm with modified gradient-differences for solving unconstrained optimization problems
Article
Jie Guo [1], Zhong Wan [2]
[1] School of Mathematical Sciences, Changsha Normal University
[2] School of Mathematics and Statistics, Central South University
Keywords: optimization problems; conjugate gradient method; global convergence; line search; numerical simulation
DOI: 10.3934/math.2023128
Subject classification: Earth Sciences (General)
Source: AIMS Press
Abstract
Unconstrained optimization problems often arise from the mining of big data and from scientific computing. On the basis of a modified gradient-difference, this article presents a new three-term conjugate gradient algorithm for efficiently solving unconstrained optimization problems. Compared with existing nonlinear conjugate gradient algorithms, the search directions generated by this algorithm always satisfy the sufficient descent condition, independent of any line search, and also possess the conjugacy property. Under the standard Wolfe line search, global and local convergence of the proposed algorithm is proved under mild assumptions. By implementing the developed algorithm to solve 750 benchmark test problems from the literature, it is shown that its numerical performance is remarkable, especially in comparison with that of other similar efficient algorithms.
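The abstract describes the method only at a high level, so the sketch below illustrates the general structure of a three-term conjugate gradient iteration with a standard Wolfe line search, not the paper's specific update. The direction formula used here is a placeholder of Hestenes-Stiefel type (an assumption for illustration); the paper's modified gradient-difference formulas are not reproduced.

```python
# Minimal sketch of a generic three-term conjugate gradient method with a
# Wolfe line search. The beta/theta update below is a placeholder
# (Hestenes-Stiefel-type three-term form), NOT the paper's specific rule.
import numpy as np
from scipy.optimize import line_search

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Standard Wolfe line search along d (scipy returns None on failure)
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            d, alpha = -g, 1e-4               # restart with a small steepest-descent step
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                         # gradient difference (the paper modifies this term)
        denom = d @ y if abs(d @ y) > 1e-12 else 1e-12
        beta = (g_new @ y) / denom
        theta = (g_new @ d) / denom
        # Three-term direction: d = -g + beta*d - theta*y, which keeps d a descent direction
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage example on the Rosenbrock test function
    from scipy.optimize import rosen, rosen_der
    print(three_term_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```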
License
CC BY
Preview
| Files | Size | Format | View |
|---|---|---|---|
| RO202302200002494ZK.pdf | 283KB | PDF | download |