Optimal hidden Markov models

In contrast with training algorithms such as Baum-Welch, which produce solutions that are a local optimum of the objective function, this thesis describes the attempt to develop a training algorithm which delivers the global optimum Discrete Hidden Markov Model for a given training sequence. A total of four different methods of attack upon the problem are presented. First, after building the necessary analytical tools, the thesis presents a direct, calculus-based assault featuring Matrix Derivatives. Next, the dual analytic approach known as Geometric Programming is examined and then adapted to the task. After that, a hill-climbing formula is developed and applied. These first three methods reveal a number of interesting and useful insights into the problem. However, it is the fourth method which produces an algorithm that is then used for direct comparison with the Baum-Welch algorithm: examples of global optima are collected, examined for common features and patterns, and then a rule is induced. The resulting rule is implemented in 'C' and tested against a battery of Baum-Welch based programs. In the limited range of tests carried out to date, the models produced by the new algorithm yield optima which have not been surpassed by (and are typically much better than) the Baum-Welch models. However, far more analysis and testing are required, and in its current form the algorithm is not fast enough for realistic application.
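For context, the objective function that Baum-Welch climbs is the probability of the training sequence under the model, computed with the standard forward algorithm for a discrete HMM. The sketch below, in C to match the abstract's implementation language, is illustrative only and is not code from the thesis; the model sizes and all parameter values are assumptions chosen for the example.

#include <stdio.h>

/* Illustrative sketch (not from the thesis): forward-algorithm
 * likelihood for a discrete HMM. Baum-Welch re-estimates pi, A, and B
 * to increase this quantity, but in general only reaches a local
 * optimum of it. Sizes N, M, T are assumed for the example. */

#define N 2   /* number of hidden states (assumed) */
#define M 2   /* number of observation symbols (assumed) */
#define T 4   /* length of the training sequence (assumed) */

double likelihood(const double pi[N], const double A[N][N],
                  const double B[N][M], const int obs[T])
{
    double alpha[T][N];

    /* Initialisation: alpha_1(i) = pi_i * b_i(O_1) */
    for (int i = 0; i < N; i++)
        alpha[0][i] = pi[i] * B[i][obs[0]];

    /* Induction: alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(O_{t+1}) */
    for (int t = 1; t < T; t++)
        for (int j = 0; j < N; j++) {
            double s = 0.0;
            for (int i = 0; i < N; i++)
                s += alpha[t - 1][i] * A[i][j];
            alpha[t][j] = s * B[j][obs[t]];
        }

    /* Termination: P(O | model) = sum_i alpha_T(i) */
    double p = 0.0;
    for (int i = 0; i < N; i++)
        p += alpha[T - 1][i];
    return p;
}

int main(void)
{
    /* A two-state, two-symbol model; values are assumptions, not results. */
    double pi[N]   = { 0.6, 0.4 };
    double A[N][N] = { { 0.7, 0.3 }, { 0.4, 0.6 } };
    double B[N][M] = { { 0.9, 0.1 }, { 0.2, 0.8 } };
    int obs[T]     = { 0, 1, 1, 0 };

    printf("P(O | model) = %g\n", likelihood(pi, A, B, obs));
    return 0;
}

The global-optimisation problem the thesis addresses is to maximise this likelihood over all valid (row-stochastic) choices of pi, A, and B, rather than accepting whichever local maximum Baum-Welch converges to from a given starting point.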

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:310543
Date: January 1999
Creators: McKee, Bill Frederick
Publisher: University of Plymouth
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://hdl.handle.net/10026.1/1698
