Markov Approximations: The Characterization of Undermodeling Errors
Lei, Lei, 04 July 2006 (PDF)
This thesis is concerned with characterizing the quality of Hidden Markov modeling when learning from limited data. It introduces a new perspective on the sources of error that describe the impact of undermodeling. Our view is that modeling error can be decomposed into two primary sources: approximation error and estimation error. This thesis takes a first step towards characterizing the approximation error incurred when a low-order HMM is used to best approximate a true system that is itself an HMM. We introduce the notion of minimality and show that best approximations whose order is greater than or equal to that of a minimal realization of the true system are in fact equivalent realizations. This understanding further allows us to explore integer lumping and to present a new method, called weighted lumping, for finding such realizations. We also show that best approximations of order strictly less than that of a minimal realization are true approximations; they are incapable of mimicking the true system exactly. Our work then proves that the resulting approximation error is non-decreasing as the model order decreases, verifying the intuitive idea that increasingly simplified models are less and less descriptive of the true system.
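As a rough illustration of the integer (ordinary) lumping idea mentioned in the abstract, the sketch below checks whether a given partition of the states of a plain Markov chain is lumpable and, if so, forms the reduced transition matrix. This is not code from the thesis; the transition matrix, the partition, and the function name are illustrative assumptions, and the thesis itself treats the richer HMM setting.

```python
import numpy as np

def lump_markov_chain(P, partition, tol=1e-9):
    """Attempt ordinary (integer) lumping of a Markov chain.

    P         : (n, n) row-stochastic transition matrix.
    partition : list of lists; each inner list holds the state indices
                merged into one lumped state.
    Returns the lumped transition matrix if the partition is lumpable,
    otherwise raises ValueError.
    """
    k = len(partition)
    Q = np.zeros((k, k))
    for j, block_j in enumerate(partition):
        # Probability of jumping into block_j from each original state.
        into_j = P[:, block_j].sum(axis=1)
        for i, block_i in enumerate(partition):
            rows = into_j[block_i]
            # Ordinary lumpability: every state in block_i must have the
            # same probability of moving into block_j.
            if np.ptp(rows) > tol:
                raise ValueError("partition is not lumpable")
            Q[i, j] = rows[0]
    return Q

# Hypothetical example: a 3-state chain where states 1 and 2 can be lumped.
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.1, 0.5],
              [0.4, 0.2, 0.4]])
print(lump_markov_chain(P, [[0], [1, 2]]))  # -> [[0.5, 0.5], [0.4, 0.6]]
```

When no such exact lumping exists, which is the situation for orders strictly below that of a minimal realization, the reduced model can only approximate the true system, which is the regime the thesis analyzes.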