Dissertation Details
Information, complexity and structure in convex optimization
Keywords: Convex optimization; Optimization algorithms; Complexity theory; Lower bounds
Author: Guzman Paredes, Cristobal
Advisors: Nemirovski, Arkadi; Pokutta, Sebastian (Industrial and Systems Engineering)
Committee: Ahmed, Shabbir; d'Aspremont, Alexandre; Vempala, Santosh
University: Georgia Institute of Technology
Department: Industrial and Systems Engineering
Full text (PDF): https://smartech.gatech.edu/bitstream/1853/53577/1/GUZMANPAREDES-DISSERTATION-2015.pdf
United States | English
Source: SMARTech Repository
【 Abstract 】
This thesis focuses on the limits of performance of large-scale convex optimization algorithms. The classical theory of oracle complexity, first proposed by Nemirovski and Yudin in 1983, successfully established the worst-case behavior of methods based on local oracles (a generalization of the first-order oracle for smooth functions) for nonsmooth convex minimization, in both the large-scale and low-scale regimes, as well as the complexity of approximately solving linear systems of equations (equivalent to convex quadratic minimization) over Euclidean balls, under a matrix-vector multiplication oracle.

Our work extends the applicability of lower bounds in two directions:

Worst-Case Complexity of Large-Scale Smooth Convex Optimization: We generalize lower bounds on the complexity of first-order methods for convex optimization to classes of convex functions with Hölder continuous gradients. Our technique relies on the existence of a smoothing kernel, which defines a smooth approximation of any convex function via infimal convolution. As a consequence, we derive lower bounds for \ell_p/\ell_q-setups, where 1 \leq p, q \leq \infty, and extend them to the matrix analogue: smooth convex minimization (with respect to the Schatten q-norm) over matrices with bounded Schatten p-norm. The major consequences of this result are the near-optimality of the Conditional Gradient method over box-type domains (p = q = \infty) and the near-optimality of Nesterov's accelerated method over the cross-polytope (p = q = 1).

Distributional Complexity of Nonsmooth Convex Optimization: In this work, we prove average-case lower bounds on the complexity of nonsmooth convex optimization. We introduce an information-theoretic method, based on the reconstruction principle, to analyze the complexity of oracle-based algorithms solving a random instance. Our technique shows that all known lower bounds for nonsmooth convex optimization can be derived, via an emulation procedure, from a common String-Guessing Problem, which is combinatorial in nature. The derived average-case lower bounds extend, via Fano's inequality, to hold with high probability and for algorithms with bounded error probability. Finally, from the proposed technique we establish the equivalence (up to constant factors) of distributional, randomized, and worst-case complexity for black-box convex optimization. In particular, there is no gain from randomization in this setting.
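The smoothing-by-infimal-convolution idea in the first contribution can be illustrated with the classical Moreau-Yosida envelope, a concrete Euclidean instance of a smoothing kernel (the thesis works with more general kernels adapted to the \ell_p geometry; this example is illustrative, not the thesis's exact construction):

```latex
f_\mu(x) \;=\; \Bigl(f \,\square\, \tfrac{1}{2\mu}\|\cdot\|_2^2\Bigr)(x)
        \;=\; \inf_{y}\Bigl\{\, f(y) + \tfrac{1}{2\mu}\,\|x-y\|_2^2 \,\Bigr\}.
```

For convex f, the envelope f_\mu is convex and differentiable with a (1/\mu)-Lipschitz gradient, and if f is L-Lipschitz then 0 \leq f(x) - f_\mu(x) \leq L^2\mu/2. This is what lets complexity results transfer between the nonsmooth class and the smooth class, at the cost of the approximation error controlled by \mu.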
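The high-probability extension mentioned in the second contribution rests on Fano's inequality in its standard form: if V is uniform on a finite set of size M and \hat{V} is any estimator of V built from an observation Y (here, the algorithm's oracle transcript), then

```latex
\Pr\bigl[\hat{V} \neq V\bigr] \;\geq\; 1 - \frac{I(V;Y) + \log 2}{\log M}.
```

Under the reconstruction principle, an algorithm that optimizes accurately must implicitly identify the random instance V; if its transcript carries too little mutual information I(V;Y), Fano's inequality forces a constant probability of reconstruction failure, and hence of optimization error.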
【 Preview 】
Attachments
File: Information, complexity and structure in convex optimization (538 KB, PDF)
Document metrics
Downloads: 9; Views: 18