1 
Quadratic approximation methods for constrained nonlinear programming / Eu, Jai Hong
No description available.

2 
The box method for minimizing strictly convex functions over convex sets / Edwards, Teresa Dawn
No description available.

3 
Algorithms for large sparse constrained optimisation / Hernandez, Marli de Freitas Gomes (January 1995)
No description available.

4 
The quadratically constrained quadratic program / Van Voorhis, Timothy P.
No description available.

5 
A quadratic programming approach to the solution of the 0-1 linear integer programming problem / Kennington, Jeffery Lynn
No description available.

6 
Quadratic programming: quantitative analysis and polynomial running time algorithms / Boljunčić, Jadranka (January 1987)
Many problems in economics, statistics and numerical analysis can be formulated as the optimization of a convex quadratic function over a polyhedral set. A polynomial algorithm for solving convex quadratic programming problems was first developed by Kozlov et al. (1979). Tardos (1986) was the first to present a polynomial algorithm for solving linear programming problems in which the number of arithmetic steps depends only on the size of the numbers in the constraint matrix and is independent of the size of the numbers in the right-hand side and the cost coefficients. In the first part of the thesis we extend Tardos' results to strictly convex quadratic programming of the form max {cᵀx − ½xᵀDx : Ax ≤ b, x ≥ 0}, with D a symmetric positive definite matrix. In our algorithm the number of arithmetic steps is independent of c and b but depends on the size of the entries of the matrices A and D.
Another part of the thesis is concerned with proximity and sensitivity of integer and mixed-integer quadratic programs. We show that for any optimal solution z̅ of a given separable quadratic integer programming problem there exists an optimal solution x̅ of its continuous relaxation such that ‖z̅ − x̅‖∞ ≤ nΔ(A), where n is the number of variables and Δ(A) is the largest absolute subdeterminant of the integer constraint matrix A. We further show that for any feasible solution z which is not optimal for the separable quadratic integer programming problem, there exists a feasible solution z̅ having greater objective function value and with ‖z − z̅‖∞ ≤ nΔ(A). Under some additional assumptions, the distance between a pair of optimal solutions to the integer quadratic programming problem with right-hand side vectors b and b', respectively, depends linearly on ‖b − b'‖₁. The extension to the mixed-integer nonseparable quadratic case is also given.
Some sensitivity analysis results for nonlinear integer programming problems are given. We assume that the nonlinear 0-1 problem was solved by implicit enumeration and that some small changes have been made in the right-hand side or objective function coefficients. We establish what additional information to keep in the implicit enumeration tree, when solving the original problem, in order to obtain bounds on the optimal value of a perturbed problem. Also, suppose that after solving the original problem to optimality the problem is enlarged by introducing a new 0-1 variable, say x_{n+1}. We determine a lower bound on the added objective function coefficients for which the new integer variable x_{n+1} remains at zero level in the optimal solution of the modified integer nonlinear program. We discuss extensions to the mixed-integer case as well as to the case when integer variables are not restricted to be 0 or 1. Computational results for an example with quadratic objective function, linear constraints and 0-1 variables are provided.
Finally, we show how to replace the objective function of a quadratic program with 0-1 variables (by an integer objective function whose size is polynomially bounded by the number of variables) without changing the set of optimal solutions. This is done by making use of the algorithm given by Frank and Tardos (1985), which in turn uses the simultaneous approximation algorithm of Lenstra, Lenstra and Lovász (1982). / Sauder School of Business / Graduate
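The proximity bound ‖z̅ − x̅‖∞ ≤ nΔ(A) can be sanity-checked on a tiny separable instance. Everything below (the data D, c, u and the brute-force integer search) is a made-up illustration, not the thesis's algorithm; with pure box constraints the constraint matrix consists of unit rows, so Δ(A) = 1.

```python
import numpy as np
from itertools import product

# Separable strictly convex quadratic: maximize c^T x - 0.5 x^T D x
# subject to 0 <= x <= u (constraint rows are +/- unit vectors, Delta(A) = 1).
D = np.diag([2.0, 4.0])          # symmetric positive definite, separable
c = np.array([3.0, 5.0])
u = np.array([4, 4])

def f(x):
    return c @ x - 0.5 * x @ D @ x

# Continuous maximizer of the separable problem: x_i = clip(c_i / D_ii, 0, u_i)
x_cont = np.clip(c / np.diag(D), 0, u)

# Integer optimum by brute force over the small box
z_int = np.array(max(product(*(range(ui + 1) for ui in u)),
                     key=lambda z: f(np.array(z, dtype=float))), dtype=float)

# Proximity bound from the abstract: ||z - x||_inf <= n * Delta(A)
n, delta = len(c), 1
dist = np.max(np.abs(z_int - x_cont))
print(x_cont, z_int, dist, n * delta)
```

Here the continuous optimum is (1.5, 1.25), an integer optimum sits at distance 0.5 in the ∞-norm, and the bound nΔ(A) = 2 comfortably holds.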

7 
Multistage quadratic stochastic programming / Lau, Karen Karman (January 1999)
Thesis (Ph.D.), University of New South Wales, 1999. Also available online.

8 
Multistage quadratic stochastic programming / Lau, Karen Karman, School of Mathematics, UNSW (January 1999)
Multistage stochastic programming is an important tool in medium- to long-term planning where there are uncertainties in the data. In this thesis, we consider a special case of multistage stochastic programming in which each subprogram is a convex quadratic program. The results are also applicable if the quadratic objectives are replaced by convex piecewise quadratic functions. Convex piecewise quadratic functions have important application in financial planning problems, as they can be used as very flexible risk measures. The resulting stochastic programs can serve as multiperiod portfolio planning problems tailored to the needs of individual investors.

Using techniques from convex analysis and sensitivity analysis, we show that each subproblem of a multistage quadratic stochastic program is a polyhedral piecewise quadratic program with convex Lipschitz objective. The objective of any subproblem is differentiable with Lipschitz gradient if all its descendent problems have unique dual variables, which can be guaranteed if the linear independence constraint qualification is satisfied. Expressions for arbitrary elements of the subdifferential and generalized Hessian at a point can be calculated from the quadratic pieces that are active at the point.

Generalized Newton methods with line search are proposed for solving multistage quadratic stochastic programs. The algorithms converge globally. If the piecewise quadratic objective is differentiable and strictly convex at the solution, then convergence is also finite.

A generalized Newton algorithm is implemented in Matlab, and numerical experiments have been carried out to demonstrate its effectiveness. The algorithm is tested on random data with 3, 4 and 5 stages, with a maximum of 315 scenarios. It has also been successfully applied to two sets of test data from a capacity expansion problem and a portfolio management problem. Various strategies have been implemented to improve the efficiency of the proposed algorithm: we experimented with trust region methods with different parameters, used an advanced solution from a smaller version of the original problem, and sorted the stochastic right-hand sides to encourage faster convergence. The numerical results show that the proposed generalized Newton method is a highly accurate and effective method for multistage quadratic stochastic programs. For problems with the same number of stages, solution times increase linearly with the number of scenarios.
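The generalized Newton idea can be caricatured in one dimension. The gradient and constants below are invented purely for illustration (the thesis works with multistage subproblems, not this toy): the objective's derivative is piecewise linear, continuous and increasing, so the objective is a differentiable, strictly convex piecewise quadratic, and the Newton iteration on the active piece terminates finitely, as the abstract states.

```python
# Generalized Newton iteration for a convex C^1 piecewise quadratic in 1D.
# Gradient g(x) = x - c for x < 0 and 3x - c for x >= 0 (illustrative data).
c = 2.0

def grad(x):            # derivative of the piecewise quadratic objective
    return (x if x < 0 else 3.0 * x) - c

def gen_hessian(x):     # slope of the active quadratic piece
    return 1.0 if x < 0 else 3.0

x = -5.0
for _ in range(20):
    g = grad(x)
    if abs(g) < 1e-12:
        break                        # the active piece's exact minimizer was hit
    x = x - g / gen_hessian(x)       # Newton step on the active piece

print(x)   # converges to approximately 2/3 in a couple of steps
```

Starting from x = -5 on the left piece, one step lands at x = 2, and the next step on the right piece lands essentially at the minimizer 2/3, illustrating finite termination once the correct piece is identified.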

9 
Implementation and improvement of quadratic approximation programming on CACHE FLOWTRAN / Biegler, Lorenz T. (January 1979)
Thesis (M.S.), University of Wisconsin-Madison. Typescript. Includes bibliographical references (leaves 96-100).

10 
Resolution of Ties in Parametric Quadratic Programming / Wang, Xianzhi (January 2004)
We consider the convex parametric quadratic programming problem when the end of the parametric interval is caused by a multiplicity of possibilities ("ties"). In such cases, there is no clear way for the proper active set to be determined so that the parametric analysis can continue. In this thesis, we show that the proper active set may in general be determined by solving a certain nonparametric quadratic programming problem. We reduce the parametric quadratic programming problem, with a parameter both in the linear part of the objective function and in the right-hand side of the constraints, to a quadratic program without a parameter. We break the analysis into three parts. We first study the parametric quadratic programming problem with a parameter only in the linear part of the objective function, and then with a parameter only in the right-hand side of the constraints. Each of these special cases is transformed into a quadratic programming problem having no parameters. A similar approach is then applied to the parametric quadratic programming problem having a parameter in both places.
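The kind of tie studied here can be seen in a two-variable toy problem. The example below is my own, chosen so the optimum has a closed form (real parametric QP instances need a solver): as the parameter t grows, two constraints become active at exactly the same value of t, so the active set for the next parametric interval is ambiguous without further analysis.

```python
import numpy as np

# Parametric QP (illustrative, not from the thesis):
#   minimize 0.5*||x||^2 - t*(x1 + x2)   subject to   x1 <= 1, x2 <= 1.
# The unconstrained minimizer is (t, t); clipping it to the box gives the
# optimum, so both constraints enter the active set at the same value t = 1:
# a "tie" at the end of the parametric interval [0, 1).

def solve(t):
    x = np.minimum([t, t], 1.0)                      # closed-form optimum
    active = tuple(i for i in range(2) if np.isclose(x[i], 1.0))
    return x, active

breakpoints = []                                     # (t, old set, new set)
prev = solve(0.0)[1]
for t in np.linspace(0.0, 2.0, 201):
    act = solve(t)[1]
    if act != prev:
        breakpoints.append((round(t, 6), prev, act))
        prev = act

print(breakpoints)   # a single breakpoint near t = 1, where BOTH constraints enter
```

Tracking the active set along a grid of t values detects exactly one breakpoint, at which the active set jumps from empty to both constraints at once; the thesis's contribution is how to pick the proper active set at such a point so the analysis can continue.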
