
Theoretical Results and Applications Related to Dimension Reduction

To overcome the curse of dimensionality, dimension reduction is necessary for
understanding the underlying phenomena in a variety of fields. Dimension
reduction is the transformation of high-dimensional data into a meaningful
representation in a lower-dimensional space. It can be further classified into
feature selection and feature extraction. This thesis is composed of four
projects: the first two focus on feature selection, and the last two
concentrate on feature extraction.
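
To make the distinction concrete, the short sketch below contrasts the two
families on a standard dataset. The dataset, the scikit-learn estimators
(SelectKBest and PCA), and the target dimension of 10 are illustrative
assumptions only, not methods taken from the thesis.

```python
# Illustrative sketch (not from the thesis): feature selection keeps a
# subset of the original variables, while feature extraction builds new
# low-dimensional features from combinations of all variables.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_digits(return_X_y=True)            # 64-dimensional inputs

# Feature selection: keep the 10 original features most associated with y.
X_selected = SelectKBest(f_classif, k=10).fit_transform(X, y)

# Feature extraction: project onto the 10 leading principal components,
# which are linear combinations of all 64 original features.
X_extracted = PCA(n_components=10).fit_transform(X)

print(X.shape, X_selected.shape, X_extracted.shape)
# (1797, 64) (1797, 10) (1797, 10)
```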

The content of the thesis is as follows. The first project presents several
efficient methods for the sparse representation of multiple measurement
vectors (MMV); some theoretical properties of the algorithms are also
discussed. The second project studies the NP-hardness of computing penalized
likelihood estimators, including penalized least squares estimators, penalized
least absolute deviation regression, and penalized support vector machines.
The third project focuses on the application of manifold learning to the
analysis and prediction of 24-hour electricity price curves. The last project
proposes a new Hessian-regularized nonlinear model for time series prediction.
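
For context, the MMV sparse representation problem mentioned in the first
project is commonly stated in the following joint-sparsity form; the notation
and the row-support objective are conventional choices, not necessarily those
used in the thesis.

```latex
% Standard MMV (joint-sparsity) formulation -- conventional notation, not
% necessarily the thesis's.  A is an m x n dictionary, B collects L
% measurement vectors as columns, and X is the shared coefficient matrix.
\[
  \min_{X \in \mathbb{R}^{n \times L}} \; \|X\|_{\mathrm{row},0}
  \quad \text{subject to} \quad A X = B,
\]
where $\|X\|_{\mathrm{row},0}$ counts the nonzero rows of $X$, so that all
measurement vectors are forced to share a common sparse support.
```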

Identifier: oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/19841
Date: 01 November 2007
Creators: Chen, Jie
Publisher: Georgia Institute of Technology
Source Sets: Georgia Tech Electronic Thesis and Dissertation Archive
Detected Language: English
Type: Dissertation
