
New results in detection, estimation, and model selection

Ni, Xuelei, 08 December 2005
This thesis contains two parts: the detectability of convex sets, and the study of regression models.

In the first part, we investigate the problem of detecting an inhomogeneous convex region in a Gaussian random field. The first proposed detection method relies on checking a constructed statistic on each convex set within an n × n image, which is proven to be inapplicable. We then consider using h(v)-parallelograms as a surrogate, which leads to a multiscale strategy. We prove that 2/9 is the minimum proportion of the maximally embedded h(v)-parallelogram in a convex set; this constant establishes the effectiveness of the multiscale detection method.

In the second part, we study the robustness, optimality, and computation of regression models. First, for robustness, we analyze M-estimators in a regression model whose residuals have an unknown but stochastically bounded distribution, and derive an asymptotically minimax M-estimator (RSBN). Simulations demonstrate its robustness and advantages. Second, for optimality, the analysis of least angle regression inspired us to consider the conditions under which a vector solves two optimization problems simultaneously: one can be solved by certain stepwise algorithms, while the other is the objective function of many existing subset-selection criteria (including Cp, AIC, BIC, MDL, and RIC) and is proven to be NP-hard. Several conditions are derived that tell us when a vector is the common optimizer. Finally, extending this idea about finding conditions to exhaustive subset selection in regression, we improve the widely used leaps-and-bounds algorithm of Furnival and Wilson. The proposed method further reduces the number of subsets that must be considered in an exhaustive subset search by exploiting not only the residuals but also the model matrix and the current coefficients.
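To illustrate the branch-and-bound principle underlying the Furnival–Wilson leaps-and-bounds algorithm, here is a minimal sketch (my own simplified reconstruction, not the thesis's improved method, which adds further pruning based on the model matrix and current coefficients). It exploits the fact that adding variables can only decrease the residual sum of squares: a node of the search tree can be pruned when even its largest attainable model cannot beat the best size-k fit found so far.

```python
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares of the least-squares fit on the given columns."""
    if not cols:
        return float(y @ y)
    Xs = X[:, list(cols)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    r = y - Xs @ beta
    return float(r @ r)

def best_subset(X, y, k):
    """Branch-and-bound search for the size-k subset of columns minimizing RSS.

    Each node fixes some variables in ("fixed") and still chooses among the
    remaining ones ("free").  Since RSS is non-increasing as variables are
    added, rss(fixed + free) is a lower bound for every subset in the node,
    so the node is pruned when that bound meets or exceeds the incumbent.
    """
    p = X.shape[1]
    best = (np.inf, None)  # (best RSS found, its column tuple)

    def recurse(fixed, free):
        nonlocal best
        if len(fixed) == k:
            val = rss(X, y, fixed)
            if val < best[0]:
                best = (val, tuple(fixed))
            return
        if len(fixed) + len(free) < k:
            return  # not enough variables left to reach size k
        # Bound: no size-k subset under this node can do better than the
        # model using every candidate variable available to the node.
        if rss(X, y, fixed + free) >= best[0]:
            return
        v, rest = free[0], free[1:]
        recurse(fixed + [v], rest)  # branch: include v
        recurse(fixed, rest)        # branch: exclude v

    recurse([], list(range(p)))
    return best
```

With exact linear data, for instance y built from columns 0 and 2 of a random design matrix, `best_subset(X, y, 2)` recovers that pair with RSS near zero, while the bound test skips most of the C(p, 2) tree.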
