11 
Convergence and approximation for primal-dual methods in large-scale optimization / Wright, Stephen E., January 1990 (has links)
Thesis (Ph. D.)--University of Washington, 1990. / Vita. Includes bibliographical references (leaves [97]-100).

12 
The generalized inverse dual for quadratically constrained quadratic programs / Randolph, William David, January 1974 (has links)
Thesis--University of Florida. / Description based on print version record. Typescript. Vita. Bibliography: leaves 107-111.

13 
Some operator splitting methods for convex optimization / Li, Xinxin 14 August 2014 (has links)
Many applications arising in various areas can be well modeled as convex optimization models with separable objective functions and linear coupling constraints. Such areas include signal processing, image processing, statistical learning, wireless networks, etc. If these well-structured convex models are treated as generic models and their separable structures are ignored in algorithmic design, then it is hard to effectively exploit the favorable properties that the objective functions possibly have. Therefore, some operator splitting methods have regained much attention from different areas for solving convex optimization models with separable structures in different contexts. In this thesis, some new operator splitting methods are proposed for convex optimization models with separable structures. We first propose combining the alternating direction method of multipliers with the logarithmic-quadratic proximal regularization for a separable monotone variational inequality with positive orthant constraints, and propose a new operator splitting method. Then, we propose a proximal version of the strictly contractive Peaceman-Rachford splitting method, which was recently proposed for the convex minimization model with linear constraints and an objective function in the form of the sum of two functions without coupled variables. After that, an operator splitting method suitable for parallel computation is proposed for a convex model whose objective function is the sum of three functions. For the new algorithms, we establish their convergence and estimate their convergence rates measured by the iteration complexity. We also apply the new algorithms to solve some applications arising in the image processing area and report some preliminary numerical results. Last, we discuss a particular video processing application and propose a series of new models for background extraction in different scenarios, to which some of the new methods are applicable.
Keywords: Convex optimization, Operator splitting method, Alternating direction method of multipliers, Peaceman-Rachford splitting method, Image processing
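As context for the splitting methods named in this abstract, the following is a minimal sketch of the classic two-block ADMM iteration (the generic scheme only, not the thesis's regularized or Peaceman-Rachford variants) applied to a small separable model, min 0.5||Ax - b||^2 + lam*||z||_1 subject to x = z; the function name and parameters are illustrative, not taken from the thesis:

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    """Two-block ADMM for: min 0.5||Ax - b||^2 + lam*||z||_1  s.t.  x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable for the coupling constraint x - z = 0
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor the x-update system matrix once; it is fixed across iterations.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(iters):
        # x-update: minimize the smooth block (a ridge-regularized least squares)
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal step on the l1 term (soft-thresholding)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent on the coupling constraint x - z = 0
        u = u + x - z
    return z
```

The two primal blocks are updated alternately, each seeing only its own term of the separable objective, which is exactly the structure the abstract says generic methods fail to exploit.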

14 
Internal convex programming, orthogonal linear programming, and program generation procedures / Ristroph, John Heard, January 1975 (has links)
Thesis (Ph. D.)--Virginia Polytechnic Institute, 1975. / Also available via the Internet.

15 
The design of transmitter/receiver and high speed analog to digital converters in wireless communication systems : a convex programming approach / Zhao, Shaohua, January 2008 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 155-165). Also available in print.

16 
The design of transmitter/receiver and high speed analog to digital converters in wireless communication systems : a convex programming approach / Zhao, Shaohua, January 2008 (has links)
Thesis (Ph. D.)--University of Hong Kong, 2009. / Includes bibliographical references (leaves 155-165). Also available online.

17 
Classical mechanics with dissipative constraints / Harker, Shaun Russell. January 2009 (has links) (PDF)
Thesis (PhD)--Montana State University--Bozeman, 2009. / Typescript. Chairperson, Graduate Committee: Tomas Gedeon. Includes bibliographical references (leaves 234-237).

18 
The solution of nonconvex optimization problems by iterative convex programming / Meyer, Robert R. January 1968 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1968. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references.

19 
Pre-Conditioners and Relations between Different Measures of Conditioning for Conic Linear Systems / Epelman, Marina A., 1973-; Freund, Robert M. 06 1900 (has links)
In recent years, new and powerful research into "condition numbers" for convex optimization has been developed, aimed at capturing the intuitive notion of problem behavior. This research has been shown to be important in studying the efficiency of algorithms, including interior-point algorithms, for convex optimization, as well as other behavioral characteristics of these problems such as problem geometry, deformation under data perturbation, etc. This paper studies measures of conditioning for a conic linear system of the form (FPd): Ax = b, x ∈ CX, whose data is d = (A, b). We present a new measure of conditioning, denoted μd, and we show implications of μd for problem geometry and algorithm complexity, and demonstrate that the value of μd is independent of the specific data representation of (FPd). We then prove certain relations among a variety of condition measures for (FPd), including μd, σd, Xd, and C(d). We discuss some drawbacks of using the condition number C(d) as the sole measure of conditioning of a conic linear system, and we then introduce the notion of a "pre-conditioner" for (FPd) which results in an equivalent formulation (FPd̃) of (FPd) with a better condition number C(d̃). We characterize the best such pre-conditioner and provide an algorithm for constructing an equivalent data instance d̃ whose condition number C(d̃) is within a known factor of the best possible.
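For reference, the condition number C(d) that this abstract compares against is standardly defined (in Renegar's sense; this definition is supplied here for context, not quoted from the paper) as the data norm scaled by the distance to infeasibility:

```latex
% Conic linear system with data d = (A, b):
%   (FP_d):  Ax = b, \quad x \in C_X
% Distance to infeasibility, and Renegar's condition number:
\rho(d) = \inf \bigl\{ \|\Delta d\| \;:\; (FP_{d + \Delta d}) \text{ is infeasible} \bigr\},
\qquad
C(d) = \frac{\|d\|}{\rho(d)}.
```

A nearly infeasible instance has small ρ(d) and hence large C(d), which is why C(d) enters complexity bounds for interior-point methods on such systems.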

20 
Portfolio Selection Under Nonsmooth Convex Transaction Costs / Potaptchik, Marina January 2006 (has links)
We consider a portfolio selection problem in the presence of transaction costs. Transaction costs on each asset are assumed to be a convex function of the amount sold or bought. This function can be nondifferentiable at a finite number of points. The objective function of this problem is a sum of a convex twice differentiable function and a separable convex nondifferentiable function. We first consider the problem in the presence of linear constraints and later generalize the results to the case when the constraints are given by convex piecewise-linear functions.

Due to its special structure, this problem can be replaced by an equivalent differentiable problem in a higher dimension. The main drawback of that approach is efficiency: the higher-dimensional problem is computationally expensive to solve.

We propose several alternative ways to solve this problem which do not require introducing new variables or constraints. We derive the optimality conditions for this problem using subdifferentials. First, we generalize an active set method to this class of problems. We solve the problem by considering a sequence of equality-constrained subproblems, each subproblem having a twice differentiable objective function. Information gathered at each step is used to construct the subproblem for the next step. We also show how the nonsmoothness can be handled efficiently by using spline approximations. The problem is then solved using a primal-dual interior-point method.

If a higher accuracy is needed, we do a crossover to an active set method. Our numerical tests show that we can solve large-scale problems efficiently and accurately.
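The spline-approximation idea mentioned in this abstract can be illustrated on the simplest nonsmooth transaction cost, c(t) = |t|: near the kink at zero, the absolute value is replaced by a quadratic piece so the resulting objective is differentiable. This is a generic Huber-style smoothing sketch under an assumed smoothing width eps, not the thesis's actual spline construction:

```python
def smooth_abs(t, eps=1e-2):
    """C^1 quadratic smoothing of |t|: exact outside [-eps, eps], quadratic inside."""
    if abs(t) >= eps:
        return abs(t)
    # The quadratic t^2/(2*eps) + eps/2 matches |t| in value and slope at t = +-eps,
    # removing the kink at 0 at the cost of an O(eps) error near the origin.
    return t * t / (2.0 * eps) + eps / 2.0
```

Applying such a smoothing to each kink of a piecewise-linear transaction cost yields a twice differentiable objective suitable for an interior-point method, after which (as the abstract notes) a crossover to an active set method can recover higher accuracy.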
