1.
Parametric and Multiobjective Optimization with Applications in Finance. Romanko, Oleksandr, 03 1900 (has links)
<p> In this thesis parametric analysis for conic quadratic optimization problems
is studied. In parametric analysis, which is often referred to as parametric optimization
or parametric programming, a perturbation parameter is introduced
into the optimization problem, which means that the coefficients in the objective
function of the problem and in the right-hand-side of the constraints are
perturbed. First, we describe linear, convex quadratic and second order cone optimization
problems and their parametric versions. Second, the theory for finding
solutions of the parametric problems is developed. We also present algorithms
for solving such problems. Third, we demonstrate how to use parametric optimization
techniques to solve multiobjective optimization problems and compute
Pareto efficient surfaces. </p> <p> We implement our novel algorithm for bi-parametric quadratic optimization.
It utilizes existing solvers to solve auxiliary problems. We present numerical
results produced by our parametric optimization package on a number of practical
financial and non-financial computational problems. Among the latter, we consider
problems of drug design and beam intensity optimization for radiation therapy. </p> <p> In the financial applications part, two risk management optimization models
are developed or extended. These two models are a portfolio replication
framework and a credit risk optimization framework. We describe applications
of multiobjective optimization to existing financial models and novel models that
we have developed. We solve a number of examples of financial multiobjective
optimization problems using our parametric optimization algorithms. </p> / Thesis / Doctor of Philosophy (PhD)
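The multiobjective technique this abstract describes, computing a Pareto efficient surface by solving a family of parametrized scalar problems, can be sketched on a toy bi-objective problem. The example below, its objectives, and the closed-form minimizer are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def pareto_front_weighted_sum(f1, f2, argmin_scalarized, weights):
    """Trace a Pareto front by sweeping the scalarization weight w.

    For each w in [0, 1] we minimize w*f1(x) + (1-w)*f2(x); for convex
    objectives every such minimizer is Pareto efficient.
    """
    points = []
    for w in weights:
        x_star = argmin_scalarized(w)
        points.append((f1(x_star), f2(x_star)))
    return points

# Toy bi-objective problem: f1(x) = x^2, f2(x) = (x - 1)^2.
# The scalarized objective w*x^2 + (1-w)*(x-1)^2 is quadratic in x
# with the closed-form minimizer x*(w) = 1 - w.
f1 = lambda x: x ** 2
f2 = lambda x: (x - 1.0) ** 2
argmin = lambda w: 1.0 - w

front = pareto_front_weighted_sum(f1, f2, argmin, np.linspace(0.0, 1.0, 5))
```

Sweeping w from 0 to 1 moves along the efficient frontier from the minimizer of f2 alone (x = 1) to the minimizer of f1 alone (x = 0); in a financial setting the same sweep would trace, for example, a risk-return frontier.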
2.
Parameter Estimation In Generalized Partial Linear Models With Conic Quadratic Programming. Celik, Gul, 01 September 2010 (has links) (PDF)
In statistics, regression analysis is a technique used to understand and model the
relationship between a dependent variable and one or more independent variables.
Multivariate Adaptive Regression Splines (MARS) is a form of regression analysis. It is a
non-parametric regression technique and can be seen as an extension of linear models
that automatically models non-linearities and interactions. MARS is very important
in both classification and regression, with an increasing number of applications in
many areas of science, economy and technology.
In our study, we analyzed Generalized Partial Linear Models (GPLMs), which are
particular semiparametric models. GPLMs separate the input variables into two parts
and additively combine a classical linear model with a nonlinear model part. In order
to smooth this nonparametric part, we use Conic Multiple Adaptive Regression Spline
(CMARS), which is a modified form of MARS. MARS is very beneficial for high
dimensional problems and does not require any particular class of relationship between
the regressor variables and outcome variable of interest. This technique offers a great advantage for fitting nonlinear multivariate functions. Also, the contribution of the
basis functions can be estimated by MARS, so that both the additive and interaction
effects of the regressors are allowed to determine the dependent variable. There are
two steps in the MARS algorithm: the forward and backward stepwise algorithms. In
the first step, the model is constructed by adding basis functions until a maximum
level of complexity is reached. Then, in the second step, the backward stepwise
algorithm reduces the complexity by removing the least significant basis functions from
the model.
In this thesis, we suggest not using the backward stepwise algorithm; instead, we employ
a Penalized Residual Sum of Squares (PRSS). We construct PRSS for MARS as a
Tikhonov Regularization Problem. We treat this problem using continuous optimization
techniques, which we consider to be an important complementary technology
and alternative to the backward stepwise algorithm. In particular, we apply
the elegant framework of Conic Quadratic Programming (CQP), an area of convex
optimization that is very well-structured, thereby resembling linear programming and,
therefore, permitting the use of interior point methods.
At the end of this study, we compare CQP with the Tikhonov regularization problem
on two different data sets, one with and one without interaction effects. Moreover,
using two further data sets, we compare CMARS with two other
classification methods, Infinite Kernel Learning (IKL) and Tikhonov
regularization, whose results are obtained from a thesis in progress.
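The forward stepwise mechanism summarized above rests on piecewise-linear hinge basis functions added to a least-squares fit. A minimal sketch, with a hypothetical one-knot target chosen purely for illustration (not code from the thesis):

```python
import numpy as np

def hinge_features(x, knots):
    """Design matrix of MARS-style truncated-linear (hinge) basis pairs
    max(0, x - t) and max(0, t - x), plus an intercept column."""
    cols = [np.ones_like(x)]
    for t in knots:
        cols.append(np.maximum(0.0, x - t))
        cols.append(np.maximum(0.0, t - x))
    return np.column_stack(cols)

# A piecewise-linear target with a kink at t = 0.5 is fitted exactly
# by a single hinge pair placed at that knot.
x = np.linspace(-2.0, 2.0, 50)
y = np.maximum(0.0, x - 0.5)
B = hinge_features(x, knots=[0.5])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
resid = np.linalg.norm(B @ coef - y)   # essentially zero
```

The forward step of MARS searches over candidate knots t and keeps the pair that most reduces the residual; the sketch shows only a single, fixed knot.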
3.
A New Contribution To Nonlinear Robust Regression And Classification With MARS And Its Applications To Data Mining For Quality Control In Manufacturing. Yerlikaya, Fatma, 01 September 2008 (has links) (PDF)
Multivariate adaptive regression spline (MARS) denotes a modern
methodology from statistical learning which is very important
in both classification and regression, with an increasing number of applications in many areas of science, economy and technology.
MARS is very useful for high dimensional problems and shows great promise for fitting nonlinear multivariate functions. The MARS technique does not impose any particular class of relationship between the predictor variables and the outcome variable of interest. In other words, a special advantage of MARS lies in its ability to estimate the contribution of the basis functions so that
both the additive and interaction effects of the predictors are allowed to determine the response variable.
The function fitted by MARS is continuous, whereas the one fitted by the classical classification method CART is not. Hence, MARS becomes an alternative to CART. The MARS algorithm for estimating the model function consists of two complementary algorithms: the forward and backward stepwise algorithms. In the first step, the model is built by adding basis functions until a maximum level of complexity is reached. In the second step, the backward stepwise algorithm begins by removing the least significant basis functions from the model.
In this study, we propose not to use the backward stepwise algorithm. Instead, we construct a penalized residual sum of squares (PRSS) for MARS as a Tikhonov regularization problem, which is also known as ridge regression. We treat this problem using continuous optimization techniques, which we consider to
be an important complementary technology and alternative to the backward stepwise algorithm. In particular, we apply the elegant framework of conic quadratic programming, an area of convex optimization that
is very well-structured, thereby resembling linear programming and hence permitting the use of interior point methods. The bounds of this optimization problem are determined by a multiobjective optimization approach, which provides us with many
alternative solutions.
Based on these theoretical and algorithmic studies, this MSc thesis also contains applications on the data investigated in a TÜBİTAK project on quality control. Through these applications, MARS and our new method are compared.
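The Tikhonov regularization (ridge regression) problem that PRSS is cast into has a well-known closed-form solution. A minimal sketch; the data here are synthetic and purely for illustration:

```python
import numpy as np

def tikhonov(X, y, lam):
    """Tikhonov-regularized (ridge) least squares:
    beta = argmin ||X b - y||^2 + lam * ||b||^2
         = (X'X + lam * I)^{-1} X' y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.01 * rng.normal(size=30)

b_ols   = tikhonov(X, y, lam=0.0)    # plain least squares
b_ridge = tikhonov(X, y, lam=10.0)   # penalized, shrunken coefficients
```

Increasing lam trades data fidelity for stability of the coefficients, the same complexity/accuracy trade-off that PRSS controls; casting the penalized problem as a conic quadratic program is what allows an interior point solver to handle it.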
4.
Robust Conic Quadratic Programming Applied To Quality Improvement - A Robustification Of CMARS. Ozmen, Ayse, 01 October 2010 (has links) (PDF)
In this thesis, we study and use Conic Quadratic Programming (CQP) for purposes of operational research, especially for quality improvement in manufacturing. In previous works, the importance and benefit of CQP in this area have already been demonstrated. There, the complexity of the regression method Multivariate Adaptive Regression Splines (MARS), which especially means sensitivity with respect to noise in the data, was penalized in the form of so-called Tikhonov regularization, which was expressed and studied as a CQP problem. This led to the new method CMARS; it is more model-based and employs continuous, actually well-structured, convex optimization, which enables the use of Interior Point Methods and their codes such as MOSEK. In this study, we generalize the regression problem by also including uncertainty in the model, especially in the input data.
CMARS, recently developed as an alternative to MARS, is powerful in handling complex and heterogeneous data. However, both the MARS and CMARS methods assume the data to consist of fixed variables. In fact, data include noise in both output and input variables. Consequently, the solutions of the optimization problem can show a remarkable sensitivity to perturbations in the parameters of the problem. In this study, we include the existence of uncertainty about future scenarios into CMARS and robustify it with robust optimization, which deals with data uncertainty. That kind of optimization was introduced by Aharon Ben-Tal and Arkadi Nemirovski, and used by Laurent El Ghaoui in the area of data mining. It incorporates various kinds of noise and perturbations into the programming problem. This robustification of CQP with robust optimization is compared with previous contributions based on Tikhonov regularization, and with the traditional MARS method.
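The robustification rests on a standard result of Ben-Tal and Nemirovski: a linear constraint under ellipsoidal data uncertainty is equivalent to a second-order cone constraint, which is exactly what keeps the robust counterpart within CQP. A minimal numerical sketch; the vectors and the uncertainty set below are illustrative, not from the thesis:

```python
import numpy as np

def robust_feasible(x, a_bar, P, b):
    """Robust counterpart of a'x <= b under ellipsoidal uncertainty
    a in {a_bar + P u : ||u|| <= 1}.  The constraint holds for every
    such a iff  a_bar'x + ||P'x|| <= b,  a second-order cone
    constraint, which is what makes the robust problem a CQP."""
    return a_bar @ x + np.linalg.norm(P.T @ x) <= b

a_bar = np.array([1.0, 2.0])     # nominal constraint vector
P = 0.3 * np.eye(2)              # shape of the uncertainty ellipsoid
x = np.array([0.5, 0.5])
b = a_bar @ x + np.linalg.norm(P.T @ x) + 1e-9   # just-feasible bound

# Sample perturbations u in the unit ball and check that the conic
# bound dominates every realized value of a'x.
rng = np.random.default_rng(2)
us = rng.normal(size=(1000, 2))
us /= np.maximum(1.0, np.linalg.norm(us, axis=1, keepdims=True))
vals = (a_bar + us @ P.T) @ x
worst_sampled = vals.max()
```

No sampled realization of the uncertain constraint vector exceeds the second-order cone bound, while some realizations do exceed the nominal value, which is the sensitivity the robust formulation protects against.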