
On Convergence Properties of the EM Algorithm for Gaussian Mixtures

"Expectation-Maximization'' (EM) algorithm and gradient-based approaches for maximum likelihood learning of finite Gaussian mixtures. We show that the EM step in parameter space is obtained from the gradient via a projection matrix $P$, and we provide an explicit expression for the matrix. We then analyze the convergence of EM in terms of special properties of $P$ and provide new results analyzing the effect that $P$ has on the likelihood surface. Based on these mathematical results, we present a comparative discussion of the advantages and disadvantages of EM and other algorithms for the learning of Gaussian mixture models.

Identifier: oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/7195
Date: 21 April 1995
Creators: Jordan, Michael; Xu, Lei
Source Sets: M.I.T. Theses and Dissertations
Language: en_US
Detected Language: English
Format: 9 p., 291671 bytes, 476864 bytes, application/postscript, application/pdf
Relation: AIM-1520, CBCL-111
