1 
Programação quadrática sequencial e condições de qualificação / Sequential quadratic programming and constraint qualification / Nunes, Fernanda Téles, 03 September 2009 (has links)
Advisor: Maria Aparecida Diniz Ehrhardt / Master's thesis, Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Made available in DSpace on 2018-08-13T08:54:50Z (GMT).
Previous issue date: 2009 / Abstract: In the context of constrained optimization problems, we face optimality conditions and also constraint qualifications. Our aim is a detailed study of several constraint qualifications, highlighting the constant positive linear dependence condition and its influence on the convergence of Sequential Quadratic Programming algorithms. The relevance of this study lies in the fact that convergence results whose hypotheses involve weak constraint qualifications are stronger than those based on strong constraint qualifications. Numerical experiments will be carried out both to investigate the efficiency of these methods in solving problems with different constraint qualifications and to compare two different kinds of line search, monotone and non-monotone. We want to confirm the hypothesis that algorithms based on a non-monotone line search act against the Maratos effect, which commonly occurs when solving minimization problems through Sequential Quadratic Programming methods. / Mestrado / Mestre em Matemática Aplicada
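At each iteration, an SQP method minimizes a quadratic model of the objective subject to linearized constraints. A minimal sketch on a hypothetical toy problem (this is not the thesis code; the problem, exact-Hessian choice, and tolerance are illustrative, and no line search or merit function is included):

```python
import numpy as np

# Toy equality-constrained problem (illustrative only):
#   minimize f(x) = x1^2 + x2^2   subject to   c(x) = x1 + x2 - 1 = 0.
# Each SQP iteration solves the KKT system of the QP subproblem
#   min_d  g'd + 0.5 d'H d   s.t.   A d + c = 0.

def sqp_equality(x, iters=10):
    for _ in range(iters):
        g = 2.0 * x                      # gradient of f
        H = 2.0 * np.eye(2)              # exact Hessian of f
        c = np.array([x[0] + x[1] - 1])  # constraint residual
        A = np.array([[1.0, 1.0]])       # constraint Jacobian
        # KKT system of the QP subproblem: [[H, A'], [A, 0]] [d; lam] = [-g; -c]
        K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
        sol = np.linalg.solve(K, np.concatenate([-g, -c]))
        d = sol[:2]                      # primal step
        x = x + d                        # full step (no line search here)
        if np.linalg.norm(d) < 1e-10:
            break
    return x

x_star = sqp_equality(np.array([3.0, -1.0]))
print(x_star)  # close to [0.5, 0.5]
```

Because the toy problem is an equality-constrained quadratic, a single full Newton/SQP step already lands on the solution; the monotone versus non-monotone line-search comparison studied in the thesis only becomes relevant on harder nonlinear problems.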

2 
A decision support system for multiobjective programming problems / Rangoaga, Moeti Joseph, 11 1900 (has links)
Many concrete problems may be cast in a multiobjective optimisation framework. The redundancy of existing methods for solving multiobjective programming problems susceptible to inconsistencies, coupled with the necessity of making inherent assumptions before using a given method, makes it hard for a non-specialist to choose a method that fits the situation at hand well. Moreover, using a method blindly, as suggested by the hammer principle (when the only tool you have is a hammer, everything looks like a nail), is an awkward approach at best and a caricatural one at worst. This brings challenges to the design, development, implementation and deployment of a Decision Support System able to choose a method that is appropriate for a given problem and to apply the chosen method to solve the problem under consideration. The choice of method should be made according to the structure of the problem and the decision maker's opinion. The aim here is to embed a sample of methods representing the main multiobjective programming techniques and to help the decision maker find the most appropriate method for his problem. / Decision Sciences / M. Sc. (Operations Research)
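One classical technique such a system might embed is weighted-sum scalarization, which reduces a multiobjective problem to a family of single-objective ones. A sketch on hypothetical toy objectives (the objectives, weights, and closed-form minimizer are illustrative assumptions, not content of the thesis):

```python
# Weighted-sum scalarization sketch (illustrative only): two convex
# objectives of one variable,
#   f1(x) = (x - 1)^2   and   f2(x) = (x + 1)^2.
# For a weight w in [0, 1], minimize w*f1 + (1 - w)*f2.

def f1(x): return (x - 1.0) ** 2
def f2(x): return (x + 1.0) ** 2

def weighted_sum_minimizer(w):
    # Setting d/dx [w*(x-1)^2 + (1-w)*(x+1)^2] = 0 gives x = 2w - 1.
    return 2.0 * w - 1.0

# Trace a sample of the Pareto front by sweeping the weight.
front = [(f1(x), f2(x))
         for x in (weighted_sum_minimizer(w / 10) for w in range(11))]
```

Sweeping the weight recovers a sample of Pareto-optimal trade-offs; for convex objectives every Pareto point is reachable this way, which is one reason a DSS might default to this method when the problem structure permits it.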

4 
Fenchel duality-based algorithms for convex optimization problems with applications in machine learning and image restoration / Heinrich, André, 27 March 2013 (has links) (PDF)
The main contribution of this thesis is the concept of Fenchel duality with a focus on its application in the field of machine learning problems and image restoration tasks. We formulate a general optimization problem for modeling support vector machine tasks and assign a Fenchel dual problem to it, prove weak and strong duality statements as well as necessary and sufficient optimality conditions for that primal-dual pair. In addition, several special instances of the general optimization problem are derived for different choices of loss functions for both the regression and the classification task. The convenience of these approaches is demonstrated by numerically solving several problems. We formulate a general nonsmooth optimization problem and assign a Fenchel dual problem to it. It is shown that the optimal objective values of the primal and the dual problem coincide and that the primal problem has an optimal solution under certain assumptions. The dual problem turns out to be nonsmooth in general and therefore a regularization is performed twice to obtain an approximate dual problem that can be solved efficiently via a fast gradient algorithm. We show how an approximate optimal and feasible primal solution can be constructed by means of some sequences of proximal points closely related to the dual iterates. Furthermore, we show that the solution will indeed converge to the optimal solution of the primal problem for arbitrarily small accuracy. Finally, the support vector regression task arises as a particular case of the general optimization problem, and the theory is specialized to this problem. We calculate several proximal points occurring when using different loss functions as well as for some regularization problems applied in image restoration tasks. Numerical experiments illustrate the applicability of our approach for these types of problems.
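The proximal points and Fenchel conjugates mentioned above can be illustrated on a standard textbook pair (chosen for illustration; the specific function f = |.| and step size are assumptions, not the loss functions computed in the thesis): for f = |.|, the proximal map is soft thresholding, the conjugate f* is the indicator of [-1, 1], and its proximal map is the projection onto that interval, linked by the Moreau decomposition prox_f(v) + prox_{f*}(v) = v.

```python
import numpy as np

# Illustrative proximal-point pair for f = |.| (not the thesis code):
#   prox_{t|.|}(v) = sign(v) * max(|v| - t, 0)      (soft thresholding)
#   f*(y) = indicator of [-1, 1], so prox_{f*}(v) = clip(v, -1, 1).

def prox_abs(v, t=1.0):
    # Soft thresholding: shrink toward zero by t, never crossing zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_conj(v):
    # Prox of the indicator of [-1, 1] is the Euclidean projection.
    return np.clip(v, -1.0, 1.0)

v = np.array([-2.5, -0.3, 0.0, 0.7, 3.0])
# Moreau decomposition (for t = 1): prox_f(v) + prox_{f*}(v) = v
print(prox_abs(v) + prox_conj(v))
```

Closed-form proximal maps like these are what make the doubly regularized dual problem cheap to attack with a fast gradient scheme: each dual iterate requires only evaluations of such maps rather than inner optimization loops.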

