11 |
Improved iterative schemes for REML estimation of variance parameters in linear mixed models. Knight, Emma January 2008 (has links)
Residual maximum likelihood (REML) estimation is a popular method of estimation for variance parameters in linear mixed models, which typically requires an iterative scheme. The aim of this thesis is to review several popular iterative schemes and to develop an improved iterative strategy that will work for a wide class of models. The average information (AI) algorithm is a computationally convenient and efficient algorithm to use when starting values are in the neighbourhood of the REML solution. However when reasonable starting values are not available, the algorithm can fail to converge. The expectation-maximisation (EM) algorithm and the parameter expanded EM (PXEM) algorithm are good alternatives in these situations but they can be very slow to converge. The formulation of these algorithms for a general linear mixed model is presented, along with their convergence properties. A series of hybrid algorithms are presented. EM or PXEM iterations are used initially to obtain variance parameter estimates that are in the neighbourhood of the REML solution, and then AI iterations are used to ensure rapid convergence. Composite local EM/AI and local PXEM/AI schemes are also developed; the local EM and local PXEM algorithms update only the random effect variance parameters, with the estimates of the residual error variance parameters held fixed. Techniques for determining when to use EM-type iterations and when to switch to AI iterations are investigated. Methods for obtaining starting values for the iterative schemes are also presented. The performance of these various schemes is investigated for several different linear mixed models. A number of data sets are used, including published data sets and simulated data. The performance of the basic algorithms is compared to that of the various hybrid algorithms, using both uninformed and informed starting values. The theoretical and empirical convergence rates are calculated and compared for the basic algorithms. 
The direct comparison of the AI and PXEM algorithms shows that the PXEM algorithm, although an improvement over the EM algorithm, still falls well short of the AI algorithm in terms of speed of convergence. However, when the starting values are too far from the REML solution, the AI algorithm can be unstable. Instability is most likely to arise in models with a more complex variance structure. The hybrid schemes use EM-type iterations to move close enough to the REML solution to enable the AI algorithm to successfully converge. They are shown to be robust to choice of starting values like the EM and PXEM algorithms, while demonstrating fast convergence like the AI algorithm. / Thesis (Ph.D.)--University of Adelaide, School of Agriculture, Food and Wine, 2008
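The hybrid strategy described in this abstract — slow but stable EM-type iterations until the estimates settle, then a fast information-based update — can be sketched on a toy one-parameter variance-component model. The model (y_i = b_i + e_i with unit error variance), the data, the switching tolerance, and the use of a plain Fisher-scoring step in place of the average-information update are all illustrative assumptions, not the thesis's actual algorithms:

```python
y = [2.0, -1.0, 1.5, 0.5]   # toy data: y_i = b_i + e_i, b_i ~ N(0, s2), e_i ~ N(0, 1)
n = len(y)

def em_step(s2):
    # E-step: b_i | y_i ~ N(shrink * y_i, shrink), where shrink = s2 / (s2 + 1);
    # M-step: s2 <- average of E[b_i^2 | y_i]. Stable but slow near the solution.
    shrink = s2 / (s2 + 1.0)
    return sum((shrink * yi) ** 2 + shrink for yi in y) / n

def scoring_step(s2):
    # Fisher-scoring update, a simple stand-in for the average-information step:
    # score and expected information of the marginal likelihood y_i ~ N(0, s2 + 1)
    score = 0.5 * (sum(yi ** 2 for yi in y) - n * (s2 + 1.0)) / (s2 + 1.0) ** 2
    info = 0.5 * n / (s2 + 1.0) ** 2
    return s2 + score / info

s2 = 5.0                      # deliberately poor starting value
for _ in range(200):          # slow, stable EM phase
    s2_new = em_step(s2)
    if abs(s2_new - s2) < 1e-3 * s2:
        s2 = s2_new
        break                 # estimates have stabilised: switch schemes
    s2 = s2_new
s2 = scoring_step(s2)         # fast information-based polish
print(round(s2, 6))           # ML solution here is mean(y_i^2) - 1 = 0.875
```

For this one-parameter model the scoring step happens to land exactly on the ML solution; in the general mixed-model setting several AI iterations would follow the EM phase.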
|
12 |
Semiparametric maximum likelihood for regression with measurement error. Suh, Eun-Young 03 May 2001 (has links)
Semiparametric maximum likelihood analysis allows inference in errors-in-variables models with small loss of efficiency relative to full likelihood analysis but with significantly weakened assumptions. In addition, since no distributional assumptions are made for the nuisance parameters, the analysis more nearly parallels that for usual regression. These highly desirable features and the high degree of modelling flexibility permitted warrant the development of the approach for routine use. This thesis does so for the special cases of linear and nonlinear regression with measurement errors in one explanatory variable. A transparent and flexible computational approach is developed, the analysis is exhibited on some examples, and finite sample properties of estimates, approximate standard errors, and likelihood ratio inference are clarified with simulation. / Graduation date: 2001
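The measurement-error problem this abstract addresses can be illustrated with a much simpler correction than the thesis's semiparametric ML approach: under additive error of known variance, the naive least-squares slope is attenuated by the reliability ratio var(x)/var(w), and a method-of-moments correction divides it back out. The model, sample size, and known error variance below are all illustrative assumptions:

```python
import random

random.seed(1)
n, beta, err_var = 20000, 2.0, 0.5  # true slope; known measurement-error variance
x = [random.gauss(0.0, 1.0) for _ in range(n)]            # true covariate (unobserved)
w = [xi + random.gauss(0.0, err_var ** 0.5) for xi in x]  # covariate observed with error
y = [beta * xi + random.gauss(0.0, 0.3) for xi in x]      # response depends on true x

mw, my = sum(w) / n, sum(y) / n
var_w = sum((wi - mw) ** 2 for wi in w) / n
cov_wy = sum((wi - mw) * (yi - my) for wi, yi in zip(w, y)) / n

b_naive = cov_wy / var_w                  # attenuated toward zero by var(x)/var(w)
b_corrected = cov_wy / (var_w - err_var)  # moment correction recovers roughly beta
```

Here the naive slope comes out near beta/1.5 ≈ 1.33, while the corrected one is close to the true 2.0; the semiparametric ML estimator of the thesis achieves the same goal without assuming the error variance is known exactly.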
|
13 |
Mining complex databases using the EM algorithm. Ordóñez, Carlos January 2000 (has links)
No description available.
|
14 |
Robust algorithms for mixture decomposition with application to classification, boundary description, and image retrieval / Medasani, Swarup January 1998 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 1998. / Typescript. Vita. Includes bibliographical references (leaves 216-229). Also available on the Internet.
|
15 |
Robust algorithms for mixture decomposition with application to classification, boundary description, and image retrieval. Medasani, Swarup January 1998 (has links)
Thesis (Ph. D.)--University of Missouri-Columbia, 1998. / Typescript. Vita. Includes bibliographical references (leaves 216-229). Also available on the Internet.
|
16 |
Computation of weights for probabilistic record linkage using the EM algorithm / Bauman, G. John, January 2006 (has links) (PDF)
Project (M.S.)--Brigham Young University. Dept. of Statistics, 2006. / Includes bibliographical references (p. 45-46).
|
17 |
Financial filtering and model calibration / Wu, Ping. Feng, Shui, January 1900 (has links)
Thesis (Ph.D.)--McMaster University, 2003. / Advisor: Shui Feng. Includes bibliographical references (leaves 94-102). Also available via World Wide Web.
|
18 |
Acoustic analysis of vocal output characteristics for suicidal risk assessment. Yingthawornsuk, Thaweesak. January 2007 (has links)
Thesis (Ph. D. in Electrical Engineering)--Vanderbilt University, Dec. 2007. / Title from title screen. Includes bibliographical references.
|
19 |
Maximum likelihood estimation of nonlinear factor analysis model using MCECM algorithm. January 2005 (has links)
by Long Mei. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2005. / Includes bibliographical references (leaves 73-77). / Abstracts in English and Chinese.
Table of contents:
  Acknowledgements --- p.iv
  Abstract --- p.v
  Table of Contents --- p.vii
  Chapter 1 --- Introduction --- p.1
    1.1 --- Nonlinear Factor Analysis Model --- p.1
    1.2 --- Main Objectives --- p.2
      1.2.1 --- Investigation of the performance of the ML approach with MCECM algorithm in NFA model --- p.2
      1.2.2 --- Investigation of the Robustness of the ML approach with MCECM algorithm --- p.3
    1.3 --- Structure of the Thesis --- p.3
  Chapter 2 --- Theoretical Background of the MCECM Algorithm --- p.5
    2.1 --- Introduction of the EM algorithm --- p.5
    2.2 --- Monte Carlo integration --- p.7
    2.3 --- Markov Chains --- p.7
    2.4 --- The Metropolis-Hastings algorithm --- p.8
  Chapter 3 --- Maximum Likelihood Estimation of a Nonlinear Factor Analysis Model --- p.10
    3.1 --- MCECM Algorithm --- p.10
      3.1.1 --- Motivation of Using MCECM algorithm --- p.11
      3.1.2 --- Introduction of the Realization of the MCECM algorithm --- p.12
      3.1.3 --- Implementation of the E-step via the MH Algorithm --- p.13
      3.1.4 --- Maximization Step --- p.15
    3.2 --- Monitoring Convergence of MCECM --- p.17
      3.2.1 --- Bridge Sampling Method --- p.17
      3.2.2 --- Average Batch Mean Method --- p.18
  Chapter 4 --- Simulation Studies --- p.20
    4.1 --- The First Simulation Study with the Normal Distribution --- p.20
      4.1.1 --- Model Specification --- p.20
      4.1.2 --- The Selection of System Parameters --- p.22
      4.1.3 --- Monitoring the Convergence --- p.22
      4.1.4 --- Simulation Results for the ML Estimates --- p.25
    4.2 --- The Second Simulation Study with the Normal Distribution --- p.34
      4.2.1 --- Model Specification --- p.34
      4.2.2 --- Monitoring the Convergence --- p.35
      4.2.3 --- Simulation Results for the ML Estimates --- p.38
    4.3 --- The Third Simulation Study on Robustness --- p.47
      4.3.1 --- Model Specification --- p.47
      4.3.2 --- Monitoring the Convergence --- p.48
      4.3.3 --- Simulation Results for the ML Estimates --- p.51
    4.4 --- The Fourth Simulation Study on Robustness --- p.59
      4.4.1 --- Model Specification --- p.59
      4.4.2 --- Monitoring the Convergence --- p.59
      4.4.3 --- Simulation Results for the ML Estimates --- p.62
  Chapter 5 --- Conclusion --- p.71
  Bibliography --- p.73
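The MCECM recipe outlined in this thesis's contents — an E-step approximated by Metropolis-Hastings draws from the posterior of the latent factors — rests on the random-walk MH kernel covered in its Chapter 2. A minimal sketch follows, with a standard normal stand-in for the intractable latent-factor posterior and an arbitrary proposal step size; both are illustrative assumptions, not the thesis's model:

```python
import math
import random

random.seed(0)

def log_target(z):
    # un-normalised log-density of the target; a standard normal stands in
    # for the posterior of the latent factors (illustrative assumption)
    return -0.5 * z * z

def mh_chain(n_draws, step=1.0, z0=0.0):
    z, draws = z0, []
    for _ in range(n_draws):
        prop = z + random.gauss(0.0, step)   # random-walk proposal
        # accept with probability min(1, target(prop) / target(z))
        if math.log(random.random()) < log_target(prop) - log_target(z):
            z = prop
        draws.append(z)
    return draws

draws = mh_chain(20000)
mean = sum(draws) / len(draws)   # should be close to 0, the target mean
```

In an MCECM iteration, averages over such draws replace the exact conditional expectations in the E-step, and the M-step then maximises that Monte Carlo approximation of the complete-data log-likelihood.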
|
20 |
On local and global influence analysis of latent variable models with ML and Bayesian approaches. / CUHK electronic theses & dissertations collection. January 2004 (has links)
Bin Lu. / "September 2004." / Thesis (Ph.D.)--Chinese University of Hong Kong, 2004. / Includes bibliographical references (p. 118-126). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Mode of access: World Wide Web. / Abstracts in English and Chinese.
|