To overcome the curse of dimensionality, dimension reduction is essential for
understanding the underlying phenomena in a variety of fields. Dimension
reduction is the transformation of high-dimensional data into a meaningful
representation in a low-dimensional space, and it can be further classified
into feature selection and feature extraction. This thesis is composed of four
projects: the first two focus on feature selection, and the last two
concentrate on feature extraction.
The thesis is organized as follows. The first project presents several
efficient methods for the sparse representation of multiple measurement
vectors (MMV); some theoretical properties of the algorithms are also discussed.
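For orientation, the MMV sparse representation problem is commonly posed as
recovering a row-sparse coefficient matrix from a set of jointly measured
signals; a generic statement of this formulation (standard in the literature,
not necessarily the exact notation used in the thesis) is

\[
\min_{X \in \mathbb{R}^{n \times L}} \; \|X\|_{\mathrm{row},0}
\quad \text{subject to} \quad Y = A X,
\]

where \( Y \in \mathbb{R}^{m \times L} \) stacks the \( L \) measurement vectors as
columns, \( A \in \mathbb{R}^{m \times n} \) is an overcomplete dictionary, and
\( \|X\|_{\mathrm{row},0} \) counts the nonzero rows of \( X \), so that all
measurement vectors share a common sparsity pattern.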
The second project studies the NP-hardness of penalized likelihood estimation,
including penalized least squares estimators, penalized least absolute
deviation regression, and penalized support vector machines.
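As a point of reference, a penalized least squares estimator of the kind
considered here is usually defined as the minimizer of an objective of the
form (generic notation, not necessarily the thesis's own)

\[
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p}} \;
\tfrac{1}{2}\,\|y - X\beta\|_2^2 + \sum_{j=1}^{p} p_{\lambda}(|\beta_j|),
\]

where \( p_{\lambda} \) is a sparsity-inducing and possibly nonconvex penalty
(for example, the \( \ell_0 \) or SCAD penalty); hardness results of this type
typically concern the computation of such global minimizers.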
The third project focuses on the application of manifold learning in the
analysis and prediction of 24-hour electricity price curves (see the
illustrative sketch below). The last project proposes a new Hessian-regularized
nonlinear time-series model for time-series prediction.
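The manifold learning analysis of daily price curves can be pictured as
embedding each day's 24-hour curve, treated as a point in a 24-dimensional
space, into a low-dimensional space. The following minimal Python sketch
illustrates this idea on synthetic data using Isomap; the data, the choice of
method, and the parameters are illustrative assumptions, not the thesis's
actual pipeline.

```python
import numpy as np
from sklearn.manifold import Isomap

# Synthetic stand-in for a year of hourly electricity prices:
# each row is one day's 24-hour price curve (365 x 24 matrix).
rng = np.random.default_rng(0)
hours = np.arange(24)
base = 50 + 20 * np.sin(2 * np.pi * (hours - 7) / 24)    # intraday shape
level = 10 * np.sin(2 * np.pi * np.arange(365) / 365)    # seasonal price level
prices = base + level[:, None] + rng.normal(0, 2, (365, 24))

# Embed the 24-dimensional daily curves into 2 dimensions.
# Isomap is one common manifold learning method; the thesis may use another.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(prices)
print(embedding.shape)  # (365, 2): one low-dimensional point per day
```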
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/19841 |
Date | 01 November 2007 |
Creators | Chen, Jie |
Publisher | Georgia Institute of Technology |
Source Sets | Georgia Tech Electronic Thesis and Dissertation Archive |
Detected Language | English |
Type | Dissertation |