Al-Rafidain Journal of Computer Sciences and Mathematics
An Efficient Line Search Algorithm for Large Scale Optimization
Abbas Al-Bayati [1], Ivan Latif [2]
[1] College of Computer Sciences and Mathematics, University of Mosul, Mosul, Iraq; [2] College of Scientific Education, University of Salahaddin
Keywords: unconstrained optimization; line search; Biggs variable metric update; gradient descent algorithm
DOI: 10.33899/csmj.2010.163845
Source: DOAJ
【 Abstract 】
In this work, we present a new algorithm of gradient descent type in which the step length is computed by means of a simple approximation of the Hessian matrix, for solving nonlinear unconstrained optimization problems. The proposed algorithm uses a new approximation of the Hessian based on the function values and gradients at two successive points along the iterations, one of which employs Biggs' modified formula to locate the new points. The corresponding algorithm belongs to the same class of superlinearly convergent descent algorithms, and it has been programmed to obtain numerical results for a selected class of nonlinear test functions of various dimensions. Numerical experiments show that the new choice of the step length requires less computational work and greatly speeds up the convergence of the gradient algorithm, especially for large-scale unconstrained optimization problems.
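The abstract describes a gradient-descent scheme whose step length is obtained from a cheap approximation of the Hessian built from function values and gradients at two successive points. The sketch below illustrates that general idea only, assuming a scalar Hessian estimate gamma*I derived from a second-order Taylor expansion; the paper's actual update (and its Biggs-modified variant) may differ, and the function names, tolerances, safeguards, and the quadratic test problem here are illustrative choices, not taken from the paper.

```python
import numpy as np

def scalar_hessian_gradient_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """Gradient descent whose step length comes from a scalar approximation
    of the Hessian built from function values and gradients at two
    successive iterates.

    The formula for gamma below is an illustrative second-order Taylor
    estimate, not necessarily the update proposed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0                              # initial step length
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g                # plain gradient step
        g_new = grad(x_new)
        s = x_new - x                        # displacement between the two successive points
        # Scalar Hessian estimate gamma*I from a Taylor expansion about x_new:
        #   f(x) ~ f(x_new) - g_new.s + 0.5 * gamma * ||s||^2
        gamma = 2.0 * (f(x) - f(x_new) + g_new @ s) / (s @ s)
        alpha = 1.0 / gamma if gamma > 1e-12 else 1.0   # safeguard against non-positive curvature
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic with Hessian diag(1, ..., n),
# a stand-in for the large-scale test problems mentioned in the abstract:
if __name__ == "__main__":
    n = 1000
    d = np.arange(1, n + 1, dtype=float)
    f = lambda z: 0.5 * z @ (d * z)
    grad = lambda z: d * z
    x_star = scalar_hessian_gradient_descent(f, grad, np.ones(n))
    print(np.linalg.norm(x_star))            # should be close to 0
```

On a quadratic, the estimate gamma reduces to s'Hs / s's, so the step 1/gamma behaves like a Barzilai-Borwein-type step; the point of the sketch is simply that the step length costs only two function and gradient evaluations per iteration.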
【 License 】
Unknown