About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

ENVELOPE MODEL FOR MULTIVARIATE LINEAR REGRESSION WITH ELLIPTICAL ERROR

Alkan, Gunes (ORCID: 0000-0001-9356-2173), January 2021
In recent years, the need for models that can accommodate higher-order covariates has increased greatly. We first consider linear regression with a vector-valued response Y and tensor-valued predictors X. Envelope models (Cook et al., 2010) can significantly improve the estimation efficiency of the regression coefficients by linking the regression mean with the covariance of the regression error. Most existing tensor regression models assume that the conditional distribution of Y given X is normal, which may be violated in practice. In Chapter 2, we propose an envelope multivariate linear regression model with tensor-valued predictors and elliptically contoured error distributions. The proposed estimator is more robust to violations of the error normality assumption, and it is more efficient than estimators that ignore the underlying envelope structure. We compare the new proposal with existing estimators in extensive simulation studies. In Chapter 3, we explore how the missing data problem can be addressed in the multivariate linear regression setting with envelopes and elliptical errors. Multiple imputation, a popular and efficient approach, is implemented with a bootstrapped expectation-maximization (EM) algorithm to fill in the missing data, followed by an adjustment in estimating the regression coefficients. Simulations with synthetic data as well as real data are presented to establish the superiority of the proposed adjusted multiple imputation method. / Statistics
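The multiple-imputation workflow the abstract describes can be illustrated with a minimal sketch. This is a hypothetical toy version: the imputer below draws from a column-wise bootstrap of observed values as a stand-in for the thesis's bootstrapped-EM imputer, and the pooled estimate is a simple average of per-imputation fits; the envelope-based adjustment is not reproduced.

```python
# Minimal sketch of multiple imputation for multivariate linear regression.
# The bootstrap-draw imputer is a simplified stand-in for a bootstrapped-EM
# imputer; the envelope adjustment from the thesis is not implemented here.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a multivariate linear regression with responses missing at random.
n, p, r = 200, 3, 2
X = rng.normal(size=(n, p))
B = rng.normal(size=(p, r))
Y = X @ B + rng.normal(scale=0.5, size=(n, r))
mask = rng.random(Y.shape) < 0.1          # roughly 10% of responses missing
Y_obs = np.where(mask, np.nan, Y)

def impute_once(Y_obs, rng):
    """Fill missing entries in each column by drawing from a bootstrap
    resample of that column's observed values."""
    Y_imp = Y_obs.copy()
    for j in range(Y_imp.shape[1]):
        miss = np.isnan(Y_imp[:, j])
        obs = Y_imp[~miss, j]
        boot = rng.choice(obs, size=obs.size, replace=True)
        Y_imp[miss, j] = rng.choice(boot, size=miss.sum(), replace=True)
    return Y_imp

# Multiple imputation: M completed data sets, one OLS fit each, then pool
# the point estimates by averaging (the point-estimate part of Rubin's rules).
M = 20
fits = []
for _ in range(M):
    Y_imp = impute_once(Y_obs, rng)
    B_hat, *_ = np.linalg.lstsq(X, Y_imp, rcond=None)
    fits.append(B_hat)
B_pooled = np.mean(fits, axis=0)
```

In a full analysis, Rubin's rules would also combine the within- and between-imputation variances to obtain standard errors that reflect the missing-data uncertainty.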
2

Advances on Dimension Reduction for Multivariate Linear Regression

Guo, Wenxing, January 2020
Multivariate linear regression methods are widely used statistical tools in data analysis, developed for settings in which several response variables are studied simultaneously; the aim is to study the relationship between the predictor and response variables through the regression coefficient matrix. Rapid improvements in information technology have brought us large amounts of large-scale data, but also great challenges in data processing. When dealing with high-dimensional data, classical least squares estimation is not applicable in multivariate linear regression analysis. In recent years, several approaches have been developed to deal with high-dimensional data problems, among which dimension reduction is one of the main approaches. In the literature, random projection methods have been used to reduce dimension in large datasets. In Chapter 2, a new random projection method, with low-rank matrix approximation, is proposed to reduce the dimension of the parameter space in the high-dimensional multivariate linear regression model. Some statistical properties of the proposed method are studied, and explicit expressions are then derived for the accuracy loss of the method with Gaussian random projection and orthogonal random projection. These expressions are precise rather than bounds up to constants. In multivariate regression analysis, reduced rank regression is another dimension reduction method, which has become an important tool due to its simplicity, computational efficiency, and good predictive performance. In practical situations, however, the performance of the reduced rank estimator is not satisfactory when the predictor variables are highly correlated or the signal-to-noise ratio is small.
To overcome this problem, in Chapter 3 we incorporate matrix projections into the reduced rank regression method, and then develop reduced rank regression estimators based on random projection and orthogonal projection in high-dimensional multivariate linear regression models. We also propose a consistent estimator of the rank of the coefficient matrix and derive prediction performance bounds for the proposed estimators based on mean squared errors. Envelope methods are another popular approach of recent years for reducing estimation and prediction variability in multivariate regression, comprising a class of methods that improve efficiency without changing the traditional objectives. Variable selection is the process of selecting a subset of relevant feature variables for use in model construction; it is used to avoid the curse of dimensionality, simplify models to make them easier to interpret, shorten training time, and reduce overfitting. In Chapter 4, we combine envelope models with a group variable selection method to propose an envelope-based sparse reduced rank regression estimator in high-dimensional multivariate linear regression models, and then establish its consistency, asymptotic normality, and oracle property. Tensor data are in frequent use today in a variety of fields in science and engineering, and processing them is a practical but challenging problem. Recently, the prevalence of tensor data has led to several tensor versions of envelope models. In Chapter 5, we incorporate the envelope technique into tensor regression analysis and propose a partial tensor envelope model, which yields a parsimonious version of tensor response regression when some predictors are of special interest; consistency and asymptotic normality of the coefficient estimators are then proved.
The proposed method achieves significant gains in efficiency compared to the standard tensor response regression model in the estimation of the coefficients for the selected predictors. Finally, in Chapter 6, we summarize the work carried out in the thesis and suggest some problems of further research interest. / Dissertation / Doctor of Philosophy (PhD)
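Two of the building blocks this abstract relies on, classical reduced rank regression and Gaussian random projection of the predictors, can be sketched in a few lines. This is an illustrative toy, not the thesis's projected RRR estimators or its rank-selection procedure; all names and dimensions below are made up for the example.

```python
# Sketch of two dimension-reduction building blocks for multivariate
# linear regression: classical reduced rank regression (truncated SVD of
# the OLS fit) and Gaussian random projection of the predictor matrix.
import numpy as np

rng = np.random.default_rng(1)

# Simulate a regression whose true coefficient matrix has rank 2.
n, p, q, true_rank = 300, 10, 6, 2
B = rng.normal(size=(p, true_rank)) @ rng.normal(size=(true_rank, q))
X = rng.normal(size=(n, p))
Y = X @ B + rng.normal(scale=0.1, size=(n, q))

def reduced_rank_regression(X, Y, rank):
    """Classical RRR: project the OLS coefficient matrix onto the top
    right singular vectors of the fitted responses X @ B_ols."""
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    _, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    V = Vt[:rank].T                 # top-`rank` right singular vectors
    return B_ols @ V @ V.T          # rank-constrained estimate

B_rrr = reduced_rank_regression(X, Y, rank=2)

# Gaussian random projection: compress p predictors to k < p, fit in the
# reduced space, then lift the estimate back to the original space.
k = 5
R = rng.normal(size=(p, k)) / np.sqrt(k)
B_proj, *_ = np.linalg.lstsq(X @ R, Y, rcond=None)
B_lifted = R @ B_proj
```

In practice the target rank is unknown; Chapter 3 of the thesis addresses exactly this by proposing a consistent estimator of the rank of the coefficient matrix.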
