Dissertation Details
A reduction framework for approximate extended formulations and a faster algorithm for convex optimization
Author: Zink, Daniel
Advisor: Pokutta, Sebastian
Committee: Blekherman, Grigoriy; Dey, Santanu S.; Lan, Guanghui; Vempala, Santosh; Pokutta, Sebastian
University: Georgia Institute of Technology
Department: Industrial and Systems Engineering
Keywords: Extended formulations; Linear programming; Semidefinite programming; Approximations; Convex optimization; Frank-Wolfe method; Conditional gradients
Others: https://smartech.gatech.edu/bitstream/1853/58274/1/ZINK-DISSERTATION-2017.PDF
United States | English
Source: SMARTech Repository
【 Abstract 】

Linear programming (LP) and semidefinite programming (SDP) are among the most important tools in Operations Research and Computer Science. In this work we study the limitations of LPs and SDPs by providing lower bounds on the size of (approximate) linear and semidefinite programming formulations of combinatorial optimization problems. The hardness of (approximate) linear optimization implied by these lower bounds motivates the lazification technique for conditional gradient type algorithms. This technique allows us to replace (approximate) linear optimization with a much weaker subroutine, achieving significant performance improvements in practice. The main contributions can be summarized as follows:

(i) Reduction framework for LPs and SDPs: We present a new view on extended formulations that does not require an initial encoding of a combinatorial problem as a linear or semidefinite program. This new view allows us to define a purely combinatorial reduction framework that transfers lower bounds on the size of exact and approximate LP and SDP formulations between different problems. Using our framework, we show new LP and SDP lower bounds for a large variety of problems, including Vertex Cover, various (binary and non-binary) constraint satisfaction problems, and multiple optimization versions of Graph-Isomorphism.

(ii) Lazification technique for conditional gradient algorithms: In Convex Programming, conditional gradient type algorithms (also known as Frank-Wolfe type methods) are very important in theory as well as in practice due to their simplicity and fast convergence. We show how to eliminate the linear optimization step performed in every iteration of Frank-Wolfe type methods and instead use a weak separation oracle. This oracle is significantly faster in practice and enables caching, which yields additional improvements in speed and in the sparsity of the obtained solutions.
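The lazification idea in (ii) can be illustrated with a short sketch. The following minimal Python example is not the author's code; it only shows the general pattern under stated assumptions: the exact linear minimization oracle of a Frank-Wolfe step is replaced by a weak separation routine that first scans a cache of previously returned vertices and calls the exact oracle only when no cached vertex gives enough progress, halving the progress target when even the exact oracle cannot meet it. The feasible region (the probability simplex), the constant K, the step-size rule, and all function names are illustrative assumptions.

    # Hypothetical sketch of "lazy" conditional gradients with a cached
    # weak separation oracle; not the dissertation's implementation.
    import numpy as np

    def lmo_simplex(grad):
        """Exact linear minimization oracle over the probability simplex:
        returns the vertex (unit basis vector) minimizing <grad, v>."""
        v = np.zeros_like(grad)
        v[np.argmin(grad)] = 1.0
        return v

    def weak_separation(grad, x, phi, cache, K=1.0):
        """Return a cached vertex y with <grad, x - y> >= phi / K if one
        exists; otherwise fall back to the exact oracle and report whether
        even that vertex reaches the progress target."""
        for y in cache:
            if grad @ (x - y) >= phi / K:
                return y, True
        y = lmo_simplex(grad)          # expensive fallback call
        cache.append(y)
        if grad @ (x - y) >= phi / K:
            return y, True
        return y, False                # certifies the gap is below phi

    def lazy_frank_wolfe(grad_f, x0, phi0, iterations=100, K=1.0):
        """Frank-Wolfe loop driven by the weak separation oracle; the
        progress estimate phi is halved whenever no progress is possible."""
        x, phi, cache = x0, phi0, []
        for t in range(iterations):
            y, progress = weak_separation(grad_f(x), x, phi, cache, K)
            if progress:
                gamma = 2.0 / (t + 2.0)    # standard FW step size (assumed)
                x = (1 - gamma) * x + gamma * y
            else:
                phi /= 2.0
        return x

    if __name__ == "__main__":
        # Toy problem: minimize ||x - b||^2 over the probability simplex.
        b = np.array([0.1, 0.7, 0.2])
        grad_f = lambda x: 2 * (x - b)
        print(lazy_frank_wolfe(grad_f, np.ones(3) / 3, phi0=1.0))

Because the cache is consulted before any exact oracle call, most iterations avoid solving a linear program, and the iterates are convex combinations of the few cached vertices, which is where the speed and sparsity benefits described in the abstract come from.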

【 Preview 】
Attachment list
File: A reduction framework for approximate extended formulations and a faster algorithm for convex optimization
Size: 2457 KB    Format: PDF
Document metrics
Downloads: 13    Views: 36