1

Linear Feature Extraction with Emphasis on Face Recognition

Mahanta, Mohammad Shahin (15 February 2010)
Feature extraction is an important step in the classification of high-dimensional data such as face images, and linear feature extractors are prevalent because of their computational efficiency and their preservation of Gaussianity. This research proposes a simple and fast linear feature extractor that approximates the sufficient statistic for Gaussian distributions. The method preserves the discriminatory information in both the first and second moments of the data and yields linear discriminant analysis as a special case. Additionally, an accurate upper bound on the error probability of a plug-in classifier can be used to approximate the number of features that minimizes the error probability. Tighter error bounds are therefore derived in this work, based on the Bayes error or on the classification error under the trained distributions. These bounds can also serve as performance guarantees and can determine the number of training samples required to approach the performance of the Bayes classifier.
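
The abstract notes that linear discriminant analysis (LDA) arises as a special case of the proposed extractor. The sketch below shows only that standard LDA step, not the thesis's own method: class means and scatter matrices are estimated from labelled data and the projection is taken from the leading generalized eigenvectors. The function name, array shapes, and the eigen-decomposition route are illustrative assumptions.

```python
# Minimal sketch of standard LDA feature extraction (illustrative only;
# this is not the extractor proposed in the thesis).
import numpy as np

def lda_projection(X, y, n_components):
    """Return a (d, n_components) projection matrix from labelled data X, y."""
    classes = np.unique(y)
    mean_total = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))   # within-class scatter
    S_b = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_total).reshape(-1, 1)
        S_b += Xc.shape[0] * (diff @ diff.T)
    # Leading eigenvectors of pinv(S_w) @ S_b give the discriminant directions;
    # the pseudo-inverse guards against a singular within-class scatter.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(S_w) @ S_b)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order[:n_components]].real

# Usage on hypothetical data: project 100 samples of 64-dimensional
# features onto 2 discriminant axes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64))
y = rng.integers(0, 3, size=100)
W = lda_projection(X, y, n_components=2)
Z = X @ W
```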
2

Statistical Inference

Chou, Pei-Hsin (26 June 2008)
In this paper, we investigate the important properties of the three major parts of statistical inference: point estimation, interval estimation, and hypothesis testing. For point estimation, we consider two methods of finding estimators, moment estimators and maximum likelihood estimators, and three methods of evaluating estimators: mean squared error, best unbiased estimators, and sufficiency and unbiasedness. For interval estimation, we consider the general confidence interval, confidence intervals for one sample and for two samples, sample sizes, and finite population correction factors. For hypothesis testing, we consider the theory of testing hypotheses, testing in one sample, testing in two samples, and three methods of finding tests: the uniformly most powerful test, the likelihood ratio test, and the goodness-of-fit test. Many examples are used to illustrate these applications.
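
As a small worked illustration of the three parts listed in the abstract, the sketch below runs point estimation, interval estimation, and a hypothesis test on a single simulated normal sample. The sample, the 95% level, and the null value of 5 are assumptions chosen for the example, not taken from the thesis.

```python
# Illustrative sketch: point estimate, confidence interval, and t-test
# on a hypothetical normal sample (not an example from the thesis).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=40)   # hypothetical sample
n = x.size

# Point estimation: for a normal mean, the moment estimator and the MLE
# both equal the sample mean; the MLE of the variance divides by n.
mean_hat = x.mean()
var_mle = x.var(ddof=0)

# Interval estimation: 95% t confidence interval for the mean.
se = x.std(ddof=1) / np.sqrt(n)
ci_lo, ci_hi = stats.t.interval(0.95, df=n - 1, loc=mean_hat, scale=se)

# Hypothesis testing: one-sample t-test of H0: mu = 5.
t_stat, p_value = stats.ttest_1samp(x, popmean=5.0)

print(f"mean = {mean_hat:.3f}, MLE variance = {var_mle:.3f}")
print(f"95% CI: ({ci_lo:.3f}, {ci_hi:.3f}), t = {t_stat:.3f}, p = {p_value:.3f}")
```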
