
Convex relaxations in nonconvex and applied optimization

Traditionally, linear programming (LP) has been used to construct convex relaxations in the context of branch and bound for determining globally optimal solutions to nonconvex optimization problems. As second-order cone programming (SOCP) and semidefinite programming (SDP) have become better understood by optimization researchers, they have emerged as attractive alternatives for obtaining convex relaxations and producing bounds on optimal values. In this thesis, we study the use of these convex optimization tools to construct strong relaxations for several nonconvex problems, including 0-1 integer programming, nonconvex box-constrained quadratic programming (BoxQP), and general quadratic programming (QP).
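To illustrate the role such relaxations play (a generic example, not specific to the relaxations developed in this thesis), a 0-1 linear program and its LP relaxation satisfy

\[
z^{*} \;=\; \min\{\, c^\top x \;:\; Ax \ge b,\ x \in \{0,1\}^n \,\}
\;\;\ge\;\;
z_{\mathrm{LP}} \;=\; \min\{\, c^\top x \;:\; Ax \ge b,\ x \in [0,1]^n \,\},
\]

so the relaxation supplies a lower bound that branch and bound uses to prune subproblems; SOCP and SDP relaxations play the same role but can yield tighter bounds at a higher computational cost per node.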
We first study an SOCP relaxation for 0-1 integer programs and a sequential relaxation technique based on it. We present desirable properties of this SOCP relaxation; for example, it cuts off all fractional extreme points of the standard LP relaxation. We further prove that the sequential relaxation technique asymptotically generates the convex hull of the 0-1 solutions.
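To convey the flavor of such a construction (a generic sketch, not the particular relaxation studied in the thesis): every 0-1 vector satisfies x_i^2 = x_i, so the convex, second-order-cone-representable constraint \sum_i x_i^2 \le \sum_i x_i is valid for all 0-1 points and can be appended to the LP relaxation,

\[
\min\Big\{\, c^\top x \;:\; Ax \ge b,\ x \in [0,1]^n,\ \|x\|_2^2 \le e^\top x \,\Big\},
\]

where e is the all-ones vector. A sequential procedure repeats such a strengthening step on the current relaxed region, tightening the relaxation round by round.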
We next explore nonconvex quadratic programming. We propose an SDP relaxation for BoxQP based on relaxing the first- and second-order KKT conditions, where the difficulty and the contribution lie in relaxing the second-order condition. We show that, although the relaxation obtained this way is equivalent to an existing SDP relaxation at the root node, it is significantly stronger at the child nodes in a branch-and-bound setting.
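For reference, one standard SDP relaxation of BoxQP, \min\{\tfrac12 x^\top Q x + c^\top x : 0 \le x \le e\}, introduces a matrix variable X in place of xx^\top and relaxes the equality X = xx^\top (shown here only as a common baseline; the KKT-based derivation of the thesis is not reproduced):

\[
\min\ \tfrac12\,\langle Q, X\rangle + c^\top x
\quad\text{s.t.}\quad
\operatorname{diag}(X) \le x,\qquad
\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \succeq 0,
\]

where \operatorname{diag}(X) \le x linearizes the valid inequalities x_i^2 \le x_i and the semidefinite constraint is the Schur-complement form of X \succeq xx^\top.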
A new advance in optimization theory allows one to express QP as the optimization of a linear function over the convex cone of completely positive matrices subject to linear constraints, referred to as completely positive programming (CPP). CPP naturally admits strong semidefinite relaxations. We incorporate the first-order KKT conditions of the QP into its constraints and then pose the problem in the form of a CPP to obtain a strong relaxation. We employ the resulting SDP relaxation inside a finite branch-and-bound algorithm to solve the QP. A comparison of our algorithm with commercial global solvers shows its potential as well as room for improvement.
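As a sketch of the representation being invoked (stated for a QP with linear equality constraints and nonnegative variables; the precise assumptions and the KKT-based strengthening are omitted here), the nonconvex problem

\[
\min\ x^\top Q x + 2c^\top x \quad\text{s.t.}\quad a_i^\top x = b_i \ (i = 1,\dots,m),\ \ x \ge 0
\]

can be written as the linear conic program

\[
\min\ \langle Q, X\rangle + 2c^\top x \quad\text{s.t.}\quad a_i^\top x = b_i,\ \ a_i^\top X a_i = b_i^2 \ (i = 1,\dots,m),\ \
\begin{pmatrix} 1 & x^\top \\ x & X \end{pmatrix} \in \mathcal{C}^{*},
\]

where \mathcal{C}^{*} is the cone of completely positive matrices. Replacing \mathcal{C}^{*} by the larger, tractable cone of doubly nonnegative matrices (positive semidefinite with entrywise nonnegative entries) gives the kind of SDP relaxation used for bounding.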
The remainder of the thesis is devoted to new techniques for solving a class of large-scale linear programming problems. First-order methods, although not as fast as second-order methods, are extremely memory efficient. We develop a first-order method based on Nesterov's smoothing technique and demonstrate its effectiveness on two machine learning problems.
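In outline, Nesterov's smoothing technique (stated generically; the particular saddle-point reformulation used in the thesis is not reproduced here) replaces a nonsmooth max-type objective by a smooth surrogate,

\[
f(x) \;=\; \max_{u \in U}\ \big( \langle A x, u\rangle - \hat\phi(u) \big)
\qquad\longrightarrow\qquad
f_{\mu}(x) \;=\; \max_{u \in U}\ \big( \langle A x, u\rangle - \hat\phi(u) - \mu\, d(u) \big),
\]

where d is a strongly convex prox-function on U and \mu > 0 a smoothing parameter. The surrogate f_{\mu} has a Lipschitz-continuous gradient with constant on the order of \|A\|^2/\mu, so an accelerated gradient method applied to it reaches an \varepsilon-accurate solution of the original problem in O(1/\varepsilon) iterations while storing only a few vectors, which is what makes first-order schemes attractive at large scale.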

Identifier: oai:union.ndltd.org:uiowa.edu/oai:ir.uiowa.edu:etd-1839
Date: 01 July 2010
Creators: Chen, Jieqiu
Contributors: Burer, Samuel A.
Publisher: University of Iowa
Source Sets: University of Iowa
Language: English
Detected Language: English
Type: dissertation
Format: application/pdf
Source: Theses and Dissertations
Rights: Copyright 2010 Jieqiu Chen
