1

New insights into conjugate duality

Grad, Sorin-Mihai 19 July 2006 (has links) (PDF)
With this thesis we bring some new results and improve some existing ones in conjugate duality and in some of the areas where it is applied. First we recall the way the Lagrange, Fenchel and Fenchel-Lagrange dual problems to a given primal optimization problem can be obtained via perturbations, and we present some connections between them. For the Fenchel-Lagrange dual problem we prove strong duality under more general conditions than known so far, while for Fenchel duality we show that the convexity assumptions on the functions involved can be weakened without altering the conclusion. In order to prove the latter we also show that some formulae concerning conjugate functions, given so far only for convex functions, hold also for almost convex and nearly convex functions, respectively. After proving that the generalized geometric dual problem can be obtained via perturbations, we show that geometric duality is a special case of Fenchel-Lagrange duality and that strong duality can be obtained under weaker conditions than stated in the existing literature. For various problems treated in the literature via geometric duality we show that Fenchel-Lagrange duality is easier to apply, moreover delivering strong duality and optimality conditions under weaker assumptions. The results presented so far are also applied in convex composite optimization and entropy optimization. For the composed convex cone-constrained optimization problem we give strong duality and the related optimality conditions, and then we apply these when showing that the formula for the conjugate of the precomposition with a proper convex K-increasing function of a K-convex function on some n-dimensional non-empty convex set X, where K is a k-dimensional non-empty closed convex cone, holds under weaker conditions than known so far. Another field where we apply these results is vector optimization, where we provide a general duality framework based on a more general scalarization that includes as special cases and improves some previous results in the literature. Concerning entropy optimization, we first treat via duality a problem having an entropy-like objective function, from which some problems found in the entropy optimization literature arise as special cases. Finally, an application of entropy optimization to text classification is presented.
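To fix ideas about the objects named in this abstract, a minimal sketch in assumed notation (not quoted from the thesis itself): for a proper function f, its conjugate is

\[ f^*(p) = \sup_{x \in \mathbb{R}^n} \{ \langle p, x \rangle - f(x) \}, \]

and for a primal problem \( (P)\ \inf \{ f(x) : x \in X,\ g(x) \in -K \} \), with a non-empty set \( X \subseteq \mathbb{R}^n \), a closed convex cone \( K \) and a K-convex constraint function \( g \), the Fenchel-Lagrange dual obtained via perturbations typically reads

\[ (D_{FL}) \quad \sup_{p \in \mathbb{R}^n,\ \lambda \in K^*} \Big\{ -f^*(p) - \big( \lambda^\top g + \delta_X \big)^*(-p) \Big\}, \]

where \( \delta_X \) is the indicator function of \( X \) and \( K^* \) the dual cone of \( K \). Weak duality \( v(D_{FL}) \le v(P) \) always holds; the strong duality results mentioned above concern the regularity conditions under which the two optimal values coincide and the dual is attained.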
2

Farkas-type results for convex and non-convex inequality systems

Hodrea, Ioan Bogdan 22 January 2008 (has links) (PDF)
As the title already suggests, the aim of the present work is to give Farkas-type results for inequality systems involving convex and/or non-convex functions. To obtain the desired results, we treat optimization problems which involve convex and composed convex functions or non-convex functions such as DC functions or fractions. To be able to use the fruitful Fenchel-Lagrange duality approach, we attach to the primal problem an equivalent problem which is a convex optimization problem. After giving a dual problem to the problem we initially treat, we provide weak conditions which secure strong duality, i.e., the case when the optimal objective value of the primal problem coincides with the optimal objective value of the dual problem and, moreover, the dual problem has an optimal solution. Further, two ideas are followed. Firstly, using the weak and strong duality between the primal problem and the dual problem, we are able to give necessary and sufficient optimality conditions for the optimal solutions of the primal problem. Secondly, provided that no duality gap lies between the primal problem and its Fenchel-Lagrange-type dual, we are able to demonstrate some Farkas-type results and thus to underline once more the connections between theorems of the alternative and the theory of duality. One of the above-mentioned Farkas-type statements is characterized using only epigraphs of functions. We conclude our investigations by providing necessary and sufficient optimality conditions for a multiobjective programming problem involving composed convex functions. Using the well-known linear scalarization, a family of scalar optimization problems is attached to the primal multiobjective program. Further, to each of these scalar problems the Fenchel-Lagrange dual problem is determined. Making use of the weak and strong duality between the scalarized problem and its dual, the desired optimality conditions are proved. Moreover, the way the dual problem of the scalarized problem looks gives us an idea about how to construct a vector dual problem to the initial one. Further weak and strong vector duality assertions are provided.
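To make the shape of such a Farkas-type statement concrete, a sketch in assumed notation (not taken verbatim from the thesis): under a suitable regularity condition one typically has, for a convex function \( f \), a non-empty convex set \( X \) and a K-convex mapping \( g \),

\[ f(x) \ge 0 \ \text{ for all } x \in X \text{ with } g(x) \in -K \quad \Longleftrightarrow \quad \exists\, \lambda \in K^* : \ f(x) + (\lambda^\top g)(x) \ge 0 \ \text{ for all } x \in X, \]

and the epigraph-based formulation mentioned above replaces the right-hand side by a containment of the form \( (0,0) \in \operatorname{epi} f^* + \bigcup_{\lambda \in K^*} \operatorname{epi} \big( \lambda^\top g + \delta_X \big)^* \). Statements of this kind are exactly the bridge between theorems of the alternative and conjugate duality that the abstract refers to.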
3

New insights into conjugate duality

Grad, Sorin-Mihai 13 July 2006 (has links)
With this thesis we bring some new results and improve some existing ones in conjugate duality and in some of the areas where it is applied. First we recall the way the Lagrange, Fenchel and Fenchel-Lagrange dual problems to a given primal optimization problem can be obtained via perturbations, and we present some connections between them. For the Fenchel-Lagrange dual problem we prove strong duality under more general conditions than known so far, while for Fenchel duality we show that the convexity assumptions on the functions involved can be weakened without altering the conclusion. In order to prove the latter we also show that some formulae concerning conjugate functions, given so far only for convex functions, hold also for almost convex and nearly convex functions, respectively. After proving that the generalized geometric dual problem can be obtained via perturbations, we show that geometric duality is a special case of Fenchel-Lagrange duality and that strong duality can be obtained under weaker conditions than stated in the existing literature. For various problems treated in the literature via geometric duality we show that Fenchel-Lagrange duality is easier to apply, moreover delivering strong duality and optimality conditions under weaker assumptions. The results presented so far are also applied in convex composite optimization and entropy optimization. For the composed convex cone-constrained optimization problem we give strong duality and the related optimality conditions, and then we apply these when showing that the formula for the conjugate of the precomposition with a proper convex K-increasing function of a K-convex function on some n-dimensional non-empty convex set X, where K is a k-dimensional non-empty closed convex cone, holds under weaker conditions than known so far. Another field where we apply these results is vector optimization, where we provide a general duality framework based on a more general scalarization that includes as special cases and improves some previous results in the literature. Concerning entropy optimization, we first treat via duality a problem having an entropy-like objective function, from which some problems found in the entropy optimization literature arise as special cases. Finally, an application of entropy optimization to text classification is presented.
4

Farkas-type results for convex and non-convex inequality systems

Hodrea, Ioan Bogdan 13 December 2007 (has links)
As the title already suggests, the aim of the present work is to give Farkas-type results for inequality systems involving convex and/or non-convex functions. To obtain the desired results, we treat optimization problems which involve convex and composed convex functions or non-convex functions such as DC functions or fractions. To be able to use the fruitful Fenchel-Lagrange duality approach, we attach to the primal problem an equivalent problem which is a convex optimization problem. After giving a dual problem to the problem we initially treat, we provide weak conditions which secure strong duality, i.e., the case when the optimal objective value of the primal problem coincides with the optimal objective value of the dual problem and, moreover, the dual problem has an optimal solution. Further, two ideas are followed. Firstly, using the weak and strong duality between the primal problem and the dual problem, we are able to give necessary and sufficient optimality conditions for the optimal solutions of the primal problem. Secondly, provided that no duality gap lies between the primal problem and its Fenchel-Lagrange-type dual, we are able to demonstrate some Farkas-type results and thus to underline once more the connections between theorems of the alternative and the theory of duality. One of the above-mentioned Farkas-type statements is characterized using only epigraphs of functions. We conclude our investigations by providing necessary and sufficient optimality conditions for a multiobjective programming problem involving composed convex functions. Using the well-known linear scalarization, a family of scalar optimization problems is attached to the primal multiobjective program. Further, to each of these scalar problems the Fenchel-Lagrange dual problem is determined. Making use of the weak and strong duality between the scalarized problem and its dual, the desired optimality conditions are proved. Moreover, the way the dual problem of the scalarized problem looks gives us an idea about how to construct a vector dual problem to the initial one. Further weak and strong vector duality assertions are provided.
5

Application of the Duality Theory

Lorenz, Nicole 15 August 2012 (has links) (PDF)
The aim of this thesis is to present new results concerning duality in scalar optimization. We show how the theory can be applied to optimization problems arising in the theory of risk measures, portfolio optimization and machine learning. First we give some notations and preliminaries needed within the thesis. After that we recall how the well-known Lagrange dual problem can be derived by using the general perturbation theory and give some generalized interior-point regularity conditions used in the literature. Using these facts we consider some special scalar optimization problems having a composed objective function and geometric (and cone) constraints. We derive their duals and give strong duality results and optimality conditions using some regularity conditions. Thus we complete and/or extend some results in the literature, especially by using the mentioned regularity conditions, which are weaker than the classical ones. We further consider a scalar optimization problem having single chance constraints and a convex objective function. We also derive its dual, give a strong duality result and further consider a special case of this problem. Thus we show how the conjugate duality theory can be used for stochastic programming problems and extend some results given in the literature. In the third chapter of this thesis we consider convex risk and deviation measures. We present some more general measures than the ones given in the literature and derive formulas for their conjugate functions. Using these we calculate some dual representation formulas for the risk and deviation measures and correct some formulas in the literature. Finally we prove some subdifferential formulas for measures and risk functions by using the facts above. The generalized deviation measures introduced in the previous chapter can be used to formulate some portfolio optimization problems we consider in the fourth chapter. Their duals, strong duality results and optimality conditions are derived by using the general theory and the conjugate functions given in the second and third chapters, respectively. Analogous calculations are done for a portfolio optimization problem having single chance constraints, using the general theory given in the second chapter. Thus we give an application of the duality theory in the well-developed field of portfolio optimization. We close this thesis by considering a general Support Vector Machines problem and deriving its dual using the conjugate duality theory. We give a strong duality result and necessary as well as sufficient optimality conditions. By considering different cost functions we obtain problems for Support Vector Regression and Support Vector Classification. We extend the results given in the literature by dropping the assumption of invertibility of the kernel matrix. We use a cost function that generalizes Vapnik's well-known ε-insensitive loss and consider the optimization problems that arise from it. We show how the general theory can be applied to a real data set; in particular, we predict concrete compressive strength by using a special Support Vector Regression problem.
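As a pointer to the machine-learning part, the standard form of Vapnik's ε-insensitive loss mentioned at the end is (standard definition, stated here only for orientation)

\[ v_\varepsilon(r) = \max\{0,\ |r| - \varepsilon\}, \qquad \varepsilon \ge 0, \]

so residuals of magnitude at most \( \varepsilon \) are not penalized. A Support Vector Regression problem with training data \( (x_i, y_i) \) then minimizes a regularized empirical risk such as \( \tfrac{1}{2}\|w\|^2 + C \sum_i v_\varepsilon\big( y_i - \langle w, \phi(x_i) \rangle - b \big) \), and it is to duals of problems of this type (with more general cost functions and without assuming an invertible kernel matrix) that the conjugate duality machinery described above is applied.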
6

Application of the Duality Theory: New Possibilities within the Theory of Risk Measures, Portfolio Optimization and Machine Learning

Lorenz, Nicole 28 June 2012 (has links)
The aim of this thesis is to present new results concerning duality in scalar optimization. We show how the theory can be applied to optimization problems arising in the theory of risk measures, portfolio optimization and machine learning. First we give some notations and preliminaries needed within the thesis. After that we recall how the well-known Lagrange dual problem can be derived by using the general perturbation theory and give some generalized interior-point regularity conditions used in the literature. Using these facts we consider some special scalar optimization problems having a composed objective function and geometric (and cone) constraints. We derive their duals and give strong duality results and optimality conditions using some regularity conditions. Thus we complete and/or extend some results in the literature, especially by using the mentioned regularity conditions, which are weaker than the classical ones. We further consider a scalar optimization problem having single chance constraints and a convex objective function. We also derive its dual, give a strong duality result and further consider a special case of this problem. Thus we show how the conjugate duality theory can be used for stochastic programming problems and extend some results given in the literature. In the third chapter of this thesis we consider convex risk and deviation measures. We present some more general measures than the ones given in the literature and derive formulas for their conjugate functions. Using these we calculate some dual representation formulas for the risk and deviation measures and correct some formulas in the literature. Finally we prove some subdifferential formulas for measures and risk functions by using the facts above. The generalized deviation measures introduced in the previous chapter can be used to formulate some portfolio optimization problems we consider in the fourth chapter. Their duals, strong duality results and optimality conditions are derived by using the general theory and the conjugate functions given in the second and third chapters, respectively. Analogous calculations are done for a portfolio optimization problem having single chance constraints, using the general theory given in the second chapter. Thus we give an application of the duality theory in the well-developed field of portfolio optimization. We close this thesis by considering a general Support Vector Machines problem and deriving its dual using the conjugate duality theory. We give a strong duality result and necessary as well as sufficient optimality conditions. By considering different cost functions we obtain problems for Support Vector Regression and Support Vector Classification. We extend the results given in the literature by dropping the assumption of invertibility of the kernel matrix. We use a cost function that generalizes Vapnik's well-known ε-insensitive loss and consider the optimization problems that arise from it. We show how the general theory can be applied to a real data set; in particular, we predict concrete compressive strength by using a special Support Vector Regression problem.
