
Probabilistic Independence Networks for Hidden Markov Probability Models

Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper contains a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
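The forward-backward (F-B) algorithm mentioned in the abstract can be sketched as follows. This is a minimal illustrative implementation of standard discrete-HMM smoothing, not code from the paper; the function name, parameter names, and the per-step rescaling trick are assumptions of this sketch.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior state marginals for a discrete HMM via forward-backward.

    pi  : (S,)   initial state distribution
    A   : (S, S) transition matrix, A[i, j] = P(next state j | state i)
    B   : (S, O) emission matrix, B[i, k] = P(observation k | state i)
    obs : sequence of observation indices, length T
    Returns a (T, S) array gamma with gamma[t, i] = P(state_t = i | obs).
    """
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))
    beta = np.zeros((T, S))

    # Forward pass: alpha[t, i] proportional to P(obs[:t+1], state_t = i).
    alpha[0] = pi * B[:, obs[0]]
    alpha[0] /= alpha[0].sum()              # rescale to avoid underflow
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        alpha[t] /= alpha[t].sum()

    # Backward pass: beta[t, i] proportional to P(obs[t+1:] | state_t = i).
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()

    # Combine and normalize each time step to a proper distribution.
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)
```

In the PIN view taken by the paper, this pair of recursions is an instance of general message passing on the chain-structured graph of an HMM; replacing the sum in the forward recursion with a max (and keeping back-pointers) yields the Viterbi algorithm.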

Identifier: oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/7185
Date: 13 March 1996
Creators: Smyth, Padhraic; Heckerman, David; Jordan, Michael
Source Sets: M.I.T. Theses and Dissertation
Language: en_US
Detected Language: English
Format: 31 p., 664995 bytes, 687871 bytes, application/postscript, application/pdf
Relation: AIM-1565, CBCL-132
