Journal Article Details
Journal of Inequalities and Applications
On convergence and complexity analysis of an accelerated forward–backward algorithm with linesearch technique for convex minimization problems and applications to data prediction and classification
Panitarn Sarnmeta1, Dawan Chumpungam1, Suthep Suantai2, Warunun Inthakon2
[1] Data Science Research Center, Department of Mathematics, Faculty of Science, Chiang Mai University, 50200 Chiang Mai, Thailand; [2] Research Center in Mathematics and Applied Mathematics, Department of Mathematics, Faculty of Science, Chiang Mai University, 50200 Chiang Mai, Thailand
Keywords: Convex minimization problems; Machine learning; Forward–backward algorithm; Linesearch; Accelerated algorithm; Data classification
DOI: 10.1186/s13660-021-02675-y
Source: Springer
【 Abstract 】

In this work, we introduce a new accelerated algorithm using a linesearch technique for solving convex minimization problems in which the objective is the sum of two lower semicontinuous convex functions. Weak convergence of the proposed algorithm is established without assuming Lipschitz continuity of the gradient of the objective function. Moreover, the complexity of the algorithm is also analyzed. Numerical experiments in machine learning, namely regression and classification problems, are also discussed. Furthermore, in these experiments we evaluate the convergence behavior of the new algorithm and compare it with various algorithms from the literature. It is found that our algorithm outperforms the others.
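To make the setting concrete, the following is a minimal sketch of a generic accelerated forward–backward (proximal gradient) iteration with a backtracking linesearch, in the spirit described by the abstract. It is not the authors' exact scheme: the inertial weight, the linesearch condition, and all names and parameters (grad_f, prox_g, sigma, delta, theta) are illustrative assumptions only.

```python
import numpy as np

def accelerated_fb_linesearch(grad_f, prox_g, x0, sigma=1.0, delta=0.5,
                              theta=0.9, max_iter=200):
    """Sketch of an accelerated forward-backward (proximal gradient) method
    with a backtracking linesearch.  Illustrative only -- not the exact
    algorithm of the paper; parameters and stopping rule are assumptions."""
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    for k in range(1, max_iter + 1):
        # Inertial (acceleration) step with a simple Nesterov-style weight.
        beta = (k - 1) / (k + 2)
        y = x + beta * (x - x_prev)

        # Backtracking linesearch: shrink the step until the trial point z
        # satisfies a local condition, so no global Lipschitz constant of
        # grad_f is required.
        step = sigma
        z = prox_g(y - step * grad_f(y), step)
        while step * np.linalg.norm(grad_f(z) - grad_f(y)) > delta * np.linalg.norm(z - y):
            step *= theta
            z = prox_g(y - step * grad_f(y), step)

        x_prev, x = x, z
    return x

# Usage sketch on a LASSO-type problem: min_x 0.5*||Ax - b||^2 + lam*||x||_1,
# where f is the smooth least-squares term and g = lam*||.||_1 (soft-thresholding prox).
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)
x_hat = accelerated_fb_linesearch(grad_f, prox_g, np.zeros(10))
```

The key point illustrated is that the step size is chosen adaptively by the linesearch at every iteration, which is what allows convergence guarantees without a global Lipschitz assumption on the gradient.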

【 License 】

CC BY   

【 Preview 】
Attachment list
File | Size | Format | View
RO202109170423041ZK.pdf | 1835 KB | PDF | download
Document metrics
Downloads: 6; Views: 16