1 |
Overcoming the failure of the classical generalized interior-point regularity conditions in convex optimization. Applications of the duality theory to enlargements of maximal monotone operators
Csetnek, Ernö Robert, 14 December 2009
The aim of this work is to present several new results concerning
duality in scalar convex optimization, the formulation of sequential
optimality conditions and some applications of the duality to the theory
of maximal monotone operators.
After recalling some properties of the classical generalized
interiority notions from the literature, we collect properties of
the quasi interior and the quasi-relative interior. By means of
these notions we introduce several generalized interior-point
regularity conditions which guarantee Fenchel duality. Using an
approach due to Magnanti, we derive corresponding regularity
conditions, expressed via the quasi interior and the
quasi-relative interior, which ensure Lagrange duality. These
conditions have the advantage of being applicable in situations
where the classical regularity conditions fail. Moreover, we
observe that several duality results in the literature on this
topic rely on superfluous or even contradictory assumptions; our
investigations offer an alternative in this sense.
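To fix the setting (a standard formulation of Fenchel duality, not necessarily the exact one used in the thesis): for proper, convex functions $f:X\to\overline{\mathbb{R}}$, $g:Y\to\overline{\mathbb{R}}$ and a continuous linear operator $A:X\to Y$, the primal-dual pair reads
\[
(P)\quad \inf_{x\in X}\,[f(x)+g(Ax)], \qquad
(D)\quad \sup_{y^*\in Y^*}\,[-f^*(A^*y^*)-g^*(-y^*)],
\]
and a classical interior-point regularity condition guaranteeing strong duality is $0\in\operatorname{int}\bigl(\operatorname{dom} g - A(\operatorname{dom} f)\bigr)$. The conditions mentioned above replace the interior by the quasi interior or the quasi-relative interior, which can be nonempty even when all the classical generalized interiors are empty.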
Necessary and sufficient sequential optimality conditions for a
general convex optimization problem are established via
perturbation theory. These results apply even in the absence of
regularity conditions. In particular, several results from the
literature on sequential optimality conditions are recovered and,
in some cases, improved.
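Schematically, such statements have the following flavor (a typical form from the literature, not the thesis's exact theorem): $\bar x$ solves $\inf_{x\in X}[f(x)+g(Ax)]$ if and only if there exist sequences $x_n^*\in\partial f(x_n)$ and $y_n^*\in\partial g(y_n)$ with
\[
x_n\to\bar x,\quad y_n\to A\bar x,\quad x_n^*+A^*y_n^*\to 0,\quad
f(x_n)+g(y_n)\to f(\bar x)+g(A\bar x),
\]
no regularity condition being required.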
The second part of the thesis is devoted to applications of the
duality theory to enlargements of maximal monotone operators in
Banach spaces. After establishing a necessary and sufficient
condition for a bivariate infimal convolution formula, we employ
it to give an equivalent characterization of the
$\varepsilon$-enlargement of the sum of two maximal monotone
operators. In this way we generalize a classical result
concerning the formula for the $\varepsilon$-subdifferential of
the sum of two proper, convex and lower semicontinuous functions.
A characterization of fully enlargeable monotone operators is also
provided, answering an open problem stated in the literature.
Further, we give a regularity condition for the
weak$^*$-closedness of the sum of the images of enlargements of
two maximal monotone operators.
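For orientation, recall the standard definitions as commonly used in the literature on enlargements: for a proper, convex, lower semicontinuous $f:X\to\overline{\mathbb{R}}$ and $\varepsilon\ge 0$,
\[
\partial_\varepsilon f(x)=\{x^*\in X^*: f(y)-f(x)\ge\langle x^*,y-x\rangle-\varepsilon\ \ \forall y\in X\},
\]
while the $\varepsilon$-enlargement of a monotone operator $T:X\rightrightarrows X^*$ is
\[
T^{\varepsilon}(x)=\{x^*\in X^*:\langle x^*-y^*,x-y\rangle\ge-\varepsilon\ \ \forall (y,y^*)\in G(T)\},
\]
and one always has $\partial_\varepsilon f(x)\subseteq(\partial f)^{\varepsilon}(x)$.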
The last part of this work deals with enlargements of positive sets in SSD (symmetrically self-dual) spaces. It is shown that many results from the literature concerning enlargements of maximal monotone operators can be generalized to the setting of Banach SSD spaces.
|
2 |
Proximal Splitting Methods in Nonsmooth Convex Optimization
Hendrich, Christopher, 25 July 2014
This thesis is concerned with the development of novel numerical methods for solving nondifferentiable convex optimization problems in real Hilbert spaces and with the investigation of their asymptotic behavior. To this end, we also make use of monotone operator theory, as some of the algorithms provided are originally designed to solve monotone inclusion problems.
After introducing the basic notation and preliminary results from convex analysis, we derive two numerical methods, based on different smoothing strategies, for solving nondifferentiable convex optimization problems. The first approach, known as the double smoothing technique, solves the optimization problem to a given a priori accuracy by applying two regularizations to its conjugate dual problem. A fast gradient method then solves the regularized dual problem, and an approximate primal solution can be reconstructed from it. The second approach acts on the primal optimization problem directly by applying a single regularization to it; it can use variable smoothing parameters, which yield an increasingly accurate approximation of the original problem as the iteration counter increases.
We then derive and investigate several primal-dual methods in real Hilbert spaces. A considerable advantage of primal-dual algorithms is that they realize a full splitting: the resolvents arising in the iterative process are taken separately for each maximally monotone operator occurring in the problem description. We first analyze the forward-backward-forward algorithm of Combettes and Pesquet in terms of its convergence rate for the objective of a nondifferentiable convex optimization problem, and we propose accelerations of this method under the additional assumption that certain monotone operators occurring in the problem formulation are strongly monotone. Subsequently, we derive two Douglas–Rachford type primal-dual methods for solving monotone inclusion problems involving finite sums of linearly composed parallel-sum type monotone operators. To prove their asymptotic convergence, we use a product Hilbert space strategy, reformulating the corresponding inclusion problem so that the Douglas–Rachford algorithm can be applied to it.
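As a concrete illustration (this is the classical scalar Douglas–Rachford splitting on a toy $\ell_1$-regularized least-squares problem, not one of the primal-dual methods developed in the thesis; all function names and parameters are chosen for this sketch):

```python
import numpy as np

def prox_l1(v, t):
    # Proximal map of t*||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_quadratic(v, t, A, b):
    # Proximal map of x -> (t/2)*||Ax - b||^2: solve (I + t A^T A) x = v + t A^T b.
    n = A.shape[1]
    return np.linalg.solve(np.eye(n) + t * A.T @ A, v + t * A.T @ b)

def douglas_rachford(A, b, lam=0.1, t=1.0, iters=500):
    # Douglas-Rachford iteration for min_x (1/2)||Ax-b||^2 + lam*||x||_1:
    #   x = prox_{t f}(z),  y = prox_{t g}(2x - z),  z = z + (y - x).
    n = A.shape[1]
    z = np.zeros(n)
    for _ in range(iters):
        x = prox_quadratic(z, t, A, b)      # resolvent of the smooth part
        y = prox_l1(2 * x - z, t * lam)     # resolvent of the nonsmooth part
        z = z + (y - x)                     # governing-sequence update
    return x
```

Note that each iteration touches the two functions only through their separate proximal maps, which is exactly the splitting philosophy described above.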
Finally, we propose two primal-dual algorithms relying on forward-backward and forward-backward-forward approaches for solving monotone inclusion problems involving parallel sums of linearly composed monotone operators.
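For contrast, the plain forward-backward (proximal gradient) iteration on which such methods build can be sketched as follows (a textbook instance, not the thesis's algorithm; the problem and step size are chosen for illustration):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam=0.1, iters=1000):
    # Forward-backward splitting for min_x (1/2)||Ax-b||^2 + lam*||x||_1:
    # an explicit gradient (forward) step on the smooth term, followed by
    # a proximal (backward) step on the nonsmooth term.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    t = 1.0 / L                            # step size ensuring convergence
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # forward step: gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)  # backward step: proximal map
    return x
```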
The last part of this thesis presents numerical experiments in which we compare our methods against algorithms from the literature. The test problems are varied and reflect the relevance of this field of research, as convex optimization problems arise in many applications of interest.
|