This dissertation considers the composite saddle-point (CSP) problem, which is motivated by real-world applications in machine learning and image processing. Two new accelerated algorithms for solving composite saddle-point problems are introduced.

Due to the two-block structure of the CSP problem, it can be solved by any algorithm belonging to the block-decomposition hybrid proximal extragradient (BD-HPE) framework. This framework consists of a family of inexact proximal point methods for solving a general two-block structured monotone inclusion problem which, at every iteration, solve two prox sub-inclusions according to a certain relative error criterion. By exploiting the fact that the two prox sub-inclusions in the context of the CSP problem are equivalent to two composite convex programs, the first part of this dissertation proposes a new instance of the BD-HPE framework that approximately solves them using an accelerated gradient method. It is shown that this new instance has better iteration-complexity than previous ones.

The second part of this dissertation introduces a new algorithm for solving a special class of CSP problems. The new algorithm is a special instance of the hybrid proximal extragradient (HPE) framework in which a Nesterov-type accelerated variant is used to approximately solve the prox subproblems. One advantage of this method is that it works for any constant choice of proximal stepsize. Moreover, a suitable choice of that stepsize yields a method with the best known (accelerated inner) iteration-complexity for the aforementioned class of saddle-point problems.

Experimental results on both synthetic CSP problems and real-world problems show that the two methods significantly outperform several state-of-the-art algorithms.
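To give a concrete sense of the kind of inner solver referred to above, the following is a minimal sketch of a Nesterov-type accelerated proximal gradient method (FISTA) for a generic composite convex program min_x f(x) + g(x), where f is smooth with an L-Lipschitz gradient and g admits an inexpensive proximal operator. The names fista, grad_f, and prox_g, the fixed step size 1/L, and the iteration count are illustrative assumptions only; this is not the dissertation's actual implementation of its prox-subproblem solves.

import numpy as np

def fista(grad_f, prox_g, L, x0, num_iters=100):
    # Accelerated proximal gradient (FISTA) for min_x f(x) + g(x),
    # where grad_f is the gradient of the smooth part f (L-Lipschitz) and
    # prox_g(v, t) = argmin_x { g(x) + (1/(2t)) * ||x - v||^2 }.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(num_iters):
        x_next = prox_g(y - grad_f(y) / L, 1.0 / L)         # proximal gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0   # momentum parameter update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)    # Nesterov extrapolation
        x, t = x_next, t_next
    return x

As a hypothetical usage example, a subproblem of the form min_x 0.5*||A x - b||^2 + lam*||x||_1 could be handled with grad_f = lambda x: A.T @ (A @ x - b), prox_g given by soft-thresholding (lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)), and L = np.linalg.norm(A, 2)**2; this only illustrates how such an inner accelerated solve would be invoked.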
Accelerated algorithms for composite saddle-point problems and applications