
Parameter Estimation In Generalized Partial Linear Models With Conic Quadratic Programming

In statistics, regression analysis is a technique used to understand and model the
relationship between a dependent variable and one or more independent variables.
Multivariate Adaptive Regression Splines (MARS) is a form of regression analysis. It is a
non-parametric regression technique and can be seen as an extension of linear models
that automatically models non-linearities and interactions (a standard model form is
sketched below). MARS is very important in both classification and regression, with an
increasing number of applications in many areas of science, economics, and technology.
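
For reference, a standard MARS model form (the notation here is ours, not taken verbatim from the thesis) writes the fitted function as a weighted sum of basis functions built from truncated linear (hinge) terms:

    f(x) = \theta_0 + \sum_{m=1}^{M} \theta_m \, \psi_m(x), \qquad
    \psi_m(x) = \prod_{j=1}^{K_m} \big[ s_{jm} \, (x_{v(j,m)} - \tau_{jm}) \big]_+ ,

where [u]_+ = \max(0, u), s_{jm} \in \{-1, +1\}, the \tau_{jm} are knot locations, and basis functions with K_m > 1 capture interaction effects between regressors.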
In our study, we analyze Generalized Partial Linear Models (GPLMs), which are
particular semiparametric models. GPLMs separate the input variables into two parts
and additively integrate a classical linear model with a nonlinear model part (a standard
formulation is sketched after this paragraph). In order to smooth this nonparametric
part, we use Conic Multivariate Adaptive Regression Splines (CMARS), a modified form
of MARS. MARS is very beneficial for high-dimensional problems and does not require
any particular class of relationship between the regressor variables and the outcome
variable of interest. This technique offers a great advantage for fitting nonlinear
multivariate functions. Also, the contribution of the basis functions can be estimated
by MARS, so that both the additive and the interaction effects of the regressors are
allowed to determine the dependent variable. There are two steps in the MARS
algorithm: the forward and backward stepwise algorithms. In the first step, the model
is constructed by adding basis functions until a maximum level of complexity is
reached. Conversely, in the second step, the backward stepwise algorithm reduces the
complexity by removing the least significant basis functions from the model.
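
As a rough sketch of the GPLM structure referred to above (assuming the usual formulation with a known link function G; the notation is ours), the conditional mean of the response is modeled as

    E[Y \mid X, T] = G\big( X^\top \beta + \gamma(T) \big),

where X^\top \beta is the parametric (linear) part and \gamma(T) is the smooth nonparametric part; in this setting the smooth part is the one approximated and regularized through CMARS.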
In this thesis, we suggest not using the backward stepwise algorithm; instead, we employ
a Penalized Residual Sum of Squares (PRSS). We construct the PRSS for MARS as a
Tikhonov Regularization problem (sketched below). We treat this problem using
continuous optimization techniques, which we consider an important complementary
technology and alternative to the concept of the backward stepwise algorithm. In
particular, we apply the elegant framework of Conic Quadratic Programming (CQP), an
area of convex optimization that is very well structured, hereby resembling linear
programming and, therefore, permitting the use of interior point methods.
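
The following is a sketch of this construction in our own notation, following the usual CMARS formulation rather than quoting the thesis. The PRSS augments the residual sum of squares of the MARS model with derivative-based penalty terms, which can be collected into a Tikhonov regularization problem

    \min_{\theta} \; \| y - \Psi \theta \|_2^2 + \lambda \, \| L \theta \|_2^2 ,

where \Psi is the matrix of basis-function values at the data points, L is a penalty matrix assembled from (discretized) derivatives of the basis functions, and \lambda > 0 balances accuracy against complexity. An equivalent conic quadratic (second-order cone) program replaces the penalty term by an explicit bound,

    \min_{t, \theta} \; t \quad \text{subject to} \quad \| y - \Psi \theta \|_2 \le t, \quad \| L \theta \|_2 \le \sqrt{M},

with a bound M on the penalty term, so that interior point methods can be applied directly.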
At the end of this study, we compare CQP with the Tikhonov Regularization problem
on two different data sets, one with and one without interaction effects. Moreover,
using two other data sets, we compare CMARS with two other classification methods,
Infinite Kernel Learning (IKL) and Tikhonov Regularization, whose results are obtained
from a thesis still in progress.

Identifier: oai:union.ndltd.org:METU/oai:etd.lib.metu.edu.tr:http://etd.lib.metu.edu.tr/upload/12612531/index.pdf
Date: 01 September 2010
Creators: Celik, Gul
Contributors: Weber, Gerhard-Wilhelm
Publisher: METU
Source Sets: Middle East Technical Univ.
Language: English
Detected Language: English
Type: M.S. Thesis
Format: text/pdf
Rights: To liberate the content for public access
