Over the years, analysts have used the EM algorithm to obtain maximum likelihood estimates from incomplete data for a variety of models. The general algorithm has several appealing properties, such as strong global convergence; however, its rate of convergence is linear, which in some cases may be unacceptably slow. This work is primarily concerned with applying Anderson acceleration to the EM algorithm for Gaussian mixture models (GMMs) in the hope of alleviating this slow convergence. As a preamble, we review maximum likelihood estimation and derive the EM algorithm in detail. The iterates corresponding to the GMM are then formulated and illustrated with examples. These examples show that convergence is faster when the data are well separated and much slower when the data are poorly separated. The Anderson acceleration method is then presented, along with its connection to the EM algorithm. The work concludes by applying Anderson acceleration to the EM algorithm, which reduces the number of iterations required to reach convergence.
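To make the objects described in the abstract concrete, the following is a minimal Python sketch (not the author's code) of one EM update for a one-dimensional Gaussian mixture, viewed as a fixed-point map, together with a basic depth-m Anderson acceleration loop. The function names, the restriction to one dimension, and the absence of constraint handling (weights summing to one, positive variances) are simplifying assumptions made here for illustration.

```python
import numpy as np

def em_step_gmm(theta, x, K):
    """One EM update for a 1-D K-component GMM.

    theta packs (weights, means, variances) as a flat vector of length 3K.
    Returns the updated parameters, i.e. the fixed-point map G(theta).
    """
    w, mu, var = theta[:K], theta[K:2*K], theta[2*K:]
    # E-step: responsibilities r[i, k] proportional to w_k * N(x_i; mu_k, var_k)
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted sample statistics
    n_k = r.sum(axis=0)
    w_new = n_k / len(x)
    mu_new = (r * x[:, None]).sum(axis=0) / n_k
    var_new = (r * (x[:, None] - mu_new) ** 2).sum(axis=0) / n_k
    return np.concatenate([w_new, mu_new, var_new])

def anderson_em(theta0, g, m=3, tol=1e-10, max_iter=500):
    """Anderson acceleration of depth m applied to the fixed-point map g.

    Note: accelerated iterates are not guaranteed to satisfy the GMM
    parameter constraints; no safeguarding is attempted in this sketch.
    """
    theta = np.asarray(theta0, dtype=float)
    gs, fs = [], []
    for k in range(max_iter):
        gk = g(theta)
        fk = gk - theta                      # residual of the fixed-point map
        gs.append(gk); fs.append(fk)
        if np.linalg.norm(fk) < tol:
            return gk, k + 1
        mk = min(m, k)
        if mk == 0:
            theta = gk                       # plain EM step on the first iteration
        else:
            # Differences of the last m_k residuals and map values
            dF = np.column_stack([fs[-j] - fs[-j-1] for j in range(1, mk + 1)])
            dG = np.column_stack([gs[-j] - gs[-j-1] for j in range(1, mk + 1)])
            # Unconstrained least-squares form of the Anderson mixing coefficients
            gamma, *_ = np.linalg.lstsq(dF, fk, rcond=None)
            theta = gk - dG @ gamma
    return theta, max_iter

if __name__ == "__main__":
    # Hypothetical usage: a well-separated two-component sample
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])
    theta0 = np.array([0.5, 0.5, -1.0, 1.0, 1.0, 1.0])   # weights, means, variances
    theta, iters = anderson_em(theta0, lambda t: em_step_gmm(t, x, K=2), m=3)
    print(iters, theta)
```

Setting m=0 in this sketch recovers plain EM, so the same driver can be used to compare iteration counts with and without acceleration.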
Identifier | oai:union.ndltd.org:wpi.edu/oai:digitalcommons.wpi.edu:etd-theses-1289 |
Date | 25 April 2013 |
Creators | Plasse, Joshua H |
Contributors | Homer F. Walker, Advisor |
Publisher | Digital WPI |
Source Sets | Worcester Polytechnic Institute |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Masters Theses (All Theses, All Years) |