11 
The generalized inverse dual for quadratically constrained quadratic programs / Randolph, William David, January 1974 (has links)
Thesis--University of Florida. / Description based on print version record. Typescript. Vita. Bibliography: leaves 107-111.

12 
Algorithm-tailored error bound conditions and the linear convergence rate of ADMM / Zeng, Shangzhi, 30 October 2017 (has links)
In the literature, error bound conditions have been widely used for studying the linear convergence rates of various first-order algorithms, and the majority of the literature focuses on how to sufficiently ensure these error bound conditions, usually by posing additional assumptions on the model under discussion. In this thesis, we focus on the alternating direction method of multipliers (ADMM) and show that the known error bound conditions for studying ADMM's linear convergence can indeed be further weakened if the error bound is studied over the specific iterative sequence generated by ADMM. A so-called partial error bound condition, which is tailored to ADMM's specific iterative scheme and weaker than the known error bound conditions in the literature, is thus proposed to derive the linear convergence of ADMM. We further show that this partial error bound condition theoretically justifies the difference, empirically observed in the literature yet so far unexplained by theory, that arises when the two primal variables are updated in different orders in implementing ADMM.
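For readers unfamiliar with the scheme this abstract refers to, the following is a minimal two-block ADMM sketch (an illustration, not the thesis's code): the test problem, the penalty parameter rho, and the soft-thresholding prox are assumptions chosen for the demo, and the `order` flag exhibits the update-order choice the abstract discusses.

```python
# Two-block ADMM sketch for:  minimize f(x) + g(z)  subject to  x - z = 0,
# with f(x) = 0.5*||x - p||^2 and g(z) = lam*||z||_1 (hypothetical demo problem).
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm(p, lam=0.5, rho=1.0, iters=200, order="xz"):
    x = np.zeros_like(p)
    z = np.zeros_like(p)
    u = np.zeros_like(p)                  # scaled dual variable
    for _ in range(iters):
        if order == "xz":                 # update x first, then z
            x = (p + rho * (z - u)) / (1.0 + rho)
            z = soft_threshold(x + u, lam / rho)
        else:                             # update z first, then x
            z = soft_threshold(x + u, lam / rho)
            x = (p + rho * (z - u)) / (1.0 + rho)
        u = u + x - z                     # dual update on the constraint x = z
    return x

p = np.array([2.0, -0.3, 1.5, 0.1])
for order in ("xz", "zx"):
    print(order, admm(p, order=order))    # both orders converge, via different iterates
```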

13 
Pre-Conditioners and Relations between Different Measures of Conditioning for Conic Linear Systems / Epelman, Marina A., 1973-; Freund, Robert M., 06 1900 (has links)
In recent years, new and powerful research into "condition numbers" for convex optimization has been developed, aimed at capturing the intuitive notion of problem behavior. This research has been shown to be important in studying the efficiency of algorithms, including interior-point algorithms, for convex optimization, as well as other behavioral characteristics of these problems such as problem geometry, deformation under data perturbation, etc. This paper studies measures of conditioning for a conic linear system of the form (FPd): Ax = b, x ∈ CX, whose data is d = (A, b). We present a new measure of conditioning, denoted µd, and we show implications of µd for problem geometry and algorithm complexity, and demonstrate that the value of µd is independent of the specific data representation of (FPd). We then prove certain relations among a variety of condition measures for (FPd), including µd, ρd, χd, and C(d). We discuss some drawbacks of using the condition number C(d) as the sole measure of conditioning of a conic linear system, and we then introduce the notion of a "pre-conditioner" for (FPd) which results in an equivalent formulation (FPd') of (FPd) with a better condition number C(d'). We characterize the best such pre-conditioner and provide an algorithm for constructing an equivalent data instance d' whose condition number C(d') is within a known factor of the best possible.
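As a hedged illustration of the pre-conditioning idea (not the paper's algorithm): a pre-conditioner replaces the data d = (A, b) by an equivalent instance d' = (MA, Mb) for a nonsingular M, leaving the feasible set {x : Ax = b, x ∈ CX} unchanged while improving the conditioning of the data. The sketch below uses the classical matrix condition number kappa as a stand-in for C(d) and simple row equilibration as M; the paper's C(d) is Renegar's condition number, which this sketch does not compute.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5)) * np.array([[1e4], [1.0], [1e-3]])  # badly scaled rows
x_feas = rng.random(5)                         # a known nonnegative feasible point
b = A @ x_feas

M = np.diag(1.0 / np.linalg.norm(A, axis=1))   # row-equilibration pre-conditioner
A2, b2 = M @ A, M @ b                          # equivalent data instance d' = (MA, Mb)

print("kappa(A)  =", np.linalg.cond(A))        # large: data is badly conditioned
print("kappa(MA) =", np.linalg.cond(A2))       # much smaller after pre-conditioning
print("residual at x_feas:", np.linalg.norm(A2 @ x_feas - b2))  # ~0: same feasible set
```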

14 
Internal convex programming, orthogonal linear programming, and program generation procedures. / Ristroph, John Heard, January 1975 (has links)
Thesis (Ph. D.)--Virginia Polytechnic Institute, 1975. / Also available via the Internet.

15 
The design of transmitter/receiver and high speed analog to digital converters in wireless communication systems : a convex programming approach / Zhao, Shaohua, January 2008 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 155-165). Also available in print.

16 
The design of transmitter/receiver and high speed analog to digital converters in wireless communication systems : a convex programming approach / Zhao, Shaohua, January 2008 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 155-165). Also available online.

17 
Classical mechanics with dissipative constraints / Harker, Shaun Russell. January 2009 (has links) (PDF)
Thesis (PhD)--Montana State University-Bozeman, 2009. / Typescript. Chairperson, Graduate Committee: Tomas Gedeon. Includes bibliographical references (leaves 234-237).

18 
The solution of nonconvex optimization problems by iterative convex programming / Meyer, Robert R., January 1968 (has links)
Thesis (Ph. D.)--University of Wisconsin-Madison, 1968. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references.

19 
Some operator splitting methods for convex optimization / Li, Xinxin, 14 August 2014 (has links)
Many applications arising in various areas can be well modeled as convex optimization models with separable objective functions and linear coupling constraints. Such areas include signal processing, image processing, statistical learning, wireless networks, etc. If these well-structured convex models are treated as generic models and their separable structures are ignored in algorithmic design, then it is hard to effectively exploit the favorable properties that the objective functions possibly have. Therefore, operator splitting methods have regained much attention from different areas for solving convex optimization models with separable structures in different contexts. In this thesis, some new operator splitting methods are proposed for convex optimization models with separable structures. We first propose combining the alternating direction method of multipliers with the logarithmic-quadratic proximal regularization for a separable monotone variational inequality with positive orthant constraints, and thereby obtain a new operator splitting method. Then, we propose a proximal version of the strictly contractive Peaceman-Rachford splitting method, which was recently proposed for the convex minimization model with linear constraints and an objective function in the form of the sum of two functions without coupled variables. After that, an operator splitting method suitable for parallel computation is proposed for a convex model whose objective function is the sum of three functions. For the new algorithms, we establish their convergence and estimate their convergence rates measured by the iteration complexity. We also apply the new algorithms to some applications arising in the image processing area and report some preliminary numerical results. Last, we discuss a particular video processing application and propose a series of new models for background extraction in different scenarios, to which some of the new methods are applicable. Keywords: Convex optimization, Operator splitting method, Alternating direction method of multipliers, Peaceman-Rachford splitting method, Image processing
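To make the splitting idea concrete, here is a minimal sketch (an assumption-laden illustration, not the thesis's algorithms) of the relaxed Peaceman-Rachford family for minimizing f(x) + g(x) via reflected proximal steps: z+ = (1 - theta) z + theta R_g(R_f(z)) with R_h = 2 prox_h - I, which recovers Douglas-Rachford at theta = 1/2 and Peaceman-Rachford at theta = 1; the damping theta < 1 plays a role analogous to the "strictly contractive" modification mentioned above. The test problem and theta are choices made for the demo; the thesis's methods additionally handle linear coupling constraints, which this sketch omits.

```python
# Test problem: f(x) = 0.5*||x - p||^2, g = indicator of the box [0, 1]^n,
# so the minimizer is the projection of p onto the box.
import numpy as np

def prox_f(v, p):                  # prox of 0.5*||x - p||^2 (unit step)
    return (v + p) / 2.0

def prox_g(v):                     # prox of the box indicator = projection
    return np.clip(v, 0.0, 1.0)

def splitting(p, theta=0.9, iters=100):
    z = np.zeros_like(p)
    for _ in range(iters):
        x = prox_f(z, p)
        refl_f = 2.0 * x - z       # reflection through prox_f
        y = prox_g(refl_f)
        refl_g = 2.0 * y - refl_f  # reflection through prox_g
        z = (1.0 - theta) * z + theta * refl_g   # relaxed update
    return prox_f(z, p)            # the x-iterate converges to the minimizer

p = np.array([1.7, -0.4, 0.6])
print(splitting(p))                # expected: [1.0, 0.0, 0.6]
```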

20 
Improved recurrent neural networks for convex optimization. / CUHK electronic theses & dissertations collection, January 2008 (has links)
Constrained optimization problems arise widely in scientific research and engineering applications. In the past two decades, solving optimization problems using recurrent neural network methods has been extensively investigated due to the advantages of massively parallel operations and rapid convergence. In real applications, neural networks with simple architectures and good performance are desired. However, most existing neural networks have some limitations and disadvantages in their convergence conditions or architectural complexity. This thesis concentrates on the analysis and design of recurrent neural networks with simplified architectures for solving more general convex optimization problems. In this thesis, some improved recurrent neural networks are proposed for solving smooth and nonsmooth convex optimization problems and applied to some selected applications. / In Part I, we first propose a one-layer recurrent neural network for solving linear programming problems. Compared with other neural networks for linear programming, the proposed neural network has a simpler architecture and better convergence properties. Second, a one-layer recurrent neural network is proposed for solving quadratic programming problems. The global convergence of the neural network can be guaranteed provided only that the objective function of the programming problem is convex on the equality constraint set, not necessarily convex everywhere. Compared with other neural networks for quadratic programming, such as the Lagrangian network and the projection neural network, the proposed neural network has a simpler architecture, in which the number of neurons is the same as the number of decision variables of the optimization problem. Third, combining the projection and penalty parameter methods, a one-layer recurrent neural network is proposed for solving general convex optimization problems with linear constraints. / In Part II, some improved recurrent neural networks are proposed for solving nonsmooth convex optimization problems. We first propose a one-layer recurrent neural network for solving nonsmooth convex programming problems with only equality constraints. This neural network simplifies the Lagrangian network and extends it to solve nonsmooth convex optimization problems. Then, a two-layer recurrent neural network is proposed for nonsmooth convex optimization subject to linear equality and bound constraints. / In Part III, some selected applications of the proposed neural networks are discussed. The k-winners-take-all (kWTA) operation is first converted to equivalent linear and quadratic optimization problems, and two kWTA network models are tailored to perform the kWTA operation. Then, the proposed neural networks are applied to some other problems, such as the linear assignment, support vector machine learning, and curve fitting problems. / Liu, Qingshan. / Source: Dissertation Abstracts International, Volume: 70-06, Section: B, page: 3606. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2008. / Includes bibliographical references (leaves 133-145). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstracts in English and Chinese. / School code: 1307.
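As an illustrative sketch of the general approach (the classic projection-type recurrent network pattern, not necessarily the thesis's exact models): an optimization neural network is a dynamical system whose equilibria coincide with the KKT points of the problem, here simulated by forward-Euler integration. The QP data, the box constraint, and the step size below are all assumptions chosen for the demo.

```python
# Bound-constrained QP:  minimize 0.5 x'Qx + c'x  subject to  0 <= x <= 1,
# solved by integrating the projection-network dynamics
#   dx/dt = -x + P_Omega(x - (Qx + c)),  P_Omega = projection onto the box.
import numpy as np

def project(v):                          # P_Omega for the box [0, 1]^n
    return np.clip(v, 0.0, 1.0)

def projection_network(Q, c, h=0.05, steps=4000):
    x = np.zeros(len(c))                 # network state
    for _ in range(steps):
        grad = Q @ x + c                 # gradient of the quadratic objective
        x = x + h * (-x + project(x - grad))   # Euler step of the dynamics
    return x

Q = np.array([[2.0, 0.5], [0.5, 1.0]])   # positive definite, so the QP is convex
c = np.array([-1.0, 0.3])
print("equilibrium state:", projection_network(Q, c))   # expected: [0.5, 0.0]
```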
