1.
Parameter Estimation in Generalized Partial Linear Models with Conic Quadratic Programming
Celik, Gul, 01 September 2010
In statistics, regression analysis is a technique used to understand and model the
relationship between a dependent variable and one or more independent variables.
Multivariate Adaptive Regression Splines (MARS) is a form of regression analysis. It is a
non-parametric regression technique and can be seen as an extension of linear models
that automatically models non-linearities and interactions. MARS is important
in both classification and regression, with a growing number of applications in
many areas of science, economics, and technology.
In our study, we analyze Generalized Partial Linear Models (GPLMs), which are
particular semiparametric models. GPLMs separate the input variables into two parts
and additively combine a classical linear model with a nonlinear model part. To
smooth this nonparametric part, we use Conic Multivariate Adaptive Regression Splines
(CMARS), a modified form of MARS. MARS is very beneficial for high-dimensional
problems and does not require any particular class of relationship between
the regressor variables and the outcome variable of interest. This technique offers a
great advantage for fitting nonlinear multivariate functions. Also, the contribution of the
basis functions can be estimated by MARS, so that both the additive and interaction
effects of the regressors are allowed to determine the dependent variable. The MARS
algorithm has two steps: a forward and a backward stepwise algorithm. In
the first step, the model is constructed by adding basis functions until a maximum
level of complexity is reached. Conversely, in the second step, the backward stepwise
algorithm reduces the complexity by removing the least significant basis functions from
the model.
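The forward step above can be sketched for a single predictor as a greedy search over mirrored hinge (truncated linear) basis functions. This is a minimal illustration in Python with NumPy, not the thesis implementation: the function names, the exhaustive knot search over observed values, and the term budget are assumptions, and the backward step is omitted since the thesis replaces it.

```python
import numpy as np

def hinge_pair(x, knot):
    """Mirrored hinge basis functions max(0, x - t) and max(0, t - x)."""
    return np.maximum(0.0, x - knot), np.maximum(0.0, knot - x)

def forward_pass(x, y, max_terms=6):
    """Greedy forward step: repeatedly add the hinge pair whose
    least-squares fit most reduces the residual sum of squares."""
    n = len(x)
    B = np.ones((n, 1))                      # start with the intercept basis
    for _ in range(max_terms // 2):          # each iteration adds one pair
        best = None
        for knot in np.unique(x):            # candidate knots at data points
            h1, h2 = hinge_pair(x, knot)
            cand = np.column_stack([B, h1, h2])
            coef, *_ = np.linalg.lstsq(cand, y, rcond=None)
            rss = np.sum((y - cand @ coef) ** 2)
            if best is None or rss < best[0]:
                best = (rss, cand)
        B = best[1]                          # keep the best candidate basis
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return B, coef
```

For example, fitting y = |x| recovers the kink exactly once a knot lands at zero, since |x| = max(0, x) + max(0, -x).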
In this thesis, we suggest not using the backward stepwise algorithm; instead, we employ
a Penalized Residual Sum of Squares (PRSS). We construct the PRSS for MARS as a
Tikhonov regularization problem. We treat this problem with continuous optimization
techniques, which we consider an important complementary technology
and alternative to the backward stepwise algorithm. In particular, we apply
the elegant framework of Conic Quadratic Programming (CQP), an area of convex
optimization that is very well structured, resembling linear programming and,
therefore, permitting the use of interior point methods.
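In its simplest unconstrained form, a Tikhonov-regularized residual sum of squares has a closed-form solution via the regularized normal equations. The sketch below, in Python with NumPy, illustrates only that basic idea; it is an assumption-laden stand-in, not the thesis method, which instead poses an equivalent conic quadratic program and solves it with interior point methods.

```python
import numpy as np

def tikhonov_fit(B, y, L, lam):
    """Minimize ||y - B@theta||^2 + lam * ||L@theta||^2 (a penalized RSS)
    via the regularized normal equations:
        (B'B + lam * L'L) theta = B' y
    B is the basis-function design matrix, L a penalty operator on the
    coefficients. The thesis formulates the equivalent CQP
        min t  s.t.  ||y - B@theta|| <= t,  ||L@theta|| <= M
    for an interior-point solver; this closed form is the unconstrained
    analogue, shown for illustration only."""
    A = B.T @ B + lam * (L.T @ L)
    return np.linalg.solve(A, B.T @ y)
```

Increasing the penalty weight shrinks the penalized seminorm of the solution, which is the trade-off the CQP formulation controls through its bound on the penalty term.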
At the end of this study, we compare CQP with the Tikhonov regularization problem
on two different data sets, one with and one without interaction effects. Moreover,
using two further data sets, we compare CMARS with two other classification methods,
Infinite Kernel Learning (IKL) and Tikhonov regularization, whose results are taken
from a thesis that is in progress.
2.
Generalized Additive Partial Linear Models with P-splines Smoothing
Holanda, Amanda Amorim, 03 May 2018
In this work we present generalized partial linear models with one continuous explanatory variable treated nonparametrically, and generalized additive partial linear models with at least two continuous explanatory variables treated in that way. P-splines are used to describe the relationship between the response and the continuous explanatory variables. The penalized likelihood functions, penalized score functions, and penalized Fisher information matrices are then derived to obtain the penalized maximum likelihood estimators by combining the backfitting (Gauss-Seidel) algorithm with the Fisher scoring iterative method for the two types of model. In addition, we present procedures for estimating the smoothing parameter as well as the effective degrees of freedom. Finally, for the purpose of illustration, the proposed models are fitted to real data sets.
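For the Gaussian identity-link case, the P-spline smoothing at the core of these models reduces to penalized least squares with a B-spline basis and a difference penalty on adjacent coefficients (in the style of Eilers and Marx). A minimal sketch, assuming SciPy's `BSpline` and leaving out the backfitting and Fisher scoring loop needed for general links; the function name, knot layout, and defaults are illustrative assumptions, not the thesis code.

```python
import numpy as np
from scipy.interpolate import BSpline

def pspline_fit(x, y, n_knots=10, degree=3, lam=1.0):
    """Penalized least-squares smooth of y on x with a cubic B-spline
    basis and a second-order difference penalty (Gaussian special case
    of the penalized likelihood)."""
    lo, hi = x.min(), x.max()
    interior = np.linspace(lo, hi, n_knots)
    # Augment with repeated boundary knots so the basis spans [lo, hi].
    t = np.r_[[lo] * degree, interior, [hi] * degree]
    n = len(t) - degree - 1                  # number of basis functions
    # Evaluate each B-spline basis function via a unit coefficient vector.
    B = np.column_stack(
        [BSpline(t, np.eye(n)[i], degree)(x) for i in range(n)]
    )
    D = np.diff(np.eye(n), n=2, axis=0)      # second-order difference penalty
    theta = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ theta
```

The smoothing parameter `lam` plays the role the abstract assigns to the estimated smoothing parameter: large values pull adjacent coefficients together toward a linear fit, small values approach an unpenalized regression spline.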