The asymptotic properties of discrete-time stochastic systems operating under feedback are addressed. It is assumed that a Markov chain $\Phi$ evolving on Euclidean space exists, and that the input and output processes appear as functions of $\Phi$. The main objectives of the thesis are (i) to extend various asymptotic properties of Markov chains to hold for arbitrary initial distributions; and (ii) to develop a robustness theory for Markovian systems.

A condition called local stochastic controllability, a generalization of the concept of controllability from linear system theory, is introduced and shown to be sufficient to ensure that the first objective is met. The second objective is explored by introducing a notion of convergence for stochastic systems and investigating the behavior of the invariant probabilities corresponding to a convergent sequence of stochastic systems.

These general results are applied to two previously unsolved problems: the asymptotic behavior of linear state space systems operating under nonlinear feedback, and the stability and asymptotic behavior of a class of random-parameter AR($p$) stochastic systems under optimal control.
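The first objective — asymptotic behavior independent of the initial distribution — can be illustrated with a toy simulation that is not from the thesis itself: a scalar unstable linear system stabilized by saturated (hence nonlinear) feedback, one instance of the "linear state space system under nonlinear feedback" setting. All gains, the saturation limit, and the noise level below are hypothetical choices for illustration. Starting the chain from two very different initial conditions and averaging the tail of each trajectory gives essentially the same empirical second moment, sketching the insensitivity to the initial distribution that the thesis establishes rigorously.

```python
import random

def tail_second_moment(x0, steps=20000, seed=0, a=1.1, sigma=0.5):
    """Simulate x_{k+1} = a*x_k + u_k + w_k, where u_k is a saturated
    linear feedback (hypothetical gains) and w_k is Gaussian noise,
    then return the empirical second moment over the second half of
    the trajectory (the first half is discarded as transient)."""
    rng = random.Random(seed)
    x = x0
    total, count = 0.0, 0
    for k in range(steps):
        u = max(-3.0, min(3.0, -2.0 * x))  # saturated feedback: sat(-2*x) on [-3, 3]
        x = a * x + u + rng.gauss(0.0, sigma)
        if k >= steps // 2:
            total += x * x
            count += 1
    return total / count

# Two very different initial conditions, same noise realization:
m_near = tail_second_moment(0.0)
m_far = tail_second_moment(20.0)
```

Because the closed loop is contracting near the origin (effective pole $a - 2 = -0.9$ in the unsaturated region), the two trajectories couple geometrically fast, and `m_near` and `m_far` agree to high precision even though the open-loop system ($a = 1.1$) is unstable.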
Identifier | oai:union.ndltd.org:LACETR/oai:collectionscanada.gc.ca:QMM.75427 |
Date | January 1987 |
Creators | Meyn, S. P. (Sean P.) |
Publisher | McGill University |
Source Sets | Library and Archives Canada ETDs Repository / Centre d'archives des thèses électroniques de Bibliothèque et Archives Canada |
Language | English |
Detected Language | English |
Type | Electronic Thesis or Dissertation |
Format | application/pdf |
Coverage | Doctor of Philosophy (Department of Electrical Engineering) |
Rights | All items in eScholarship@McGill are protected by copyright with all rights reserved unless otherwise indicated. |
Relation | alephsysno: 000550470, proquestno: AAINL44273, Theses scanned by UMI/ProQuest. |