81 |
Loops and points of density in doubly stochastic measures / Shiflett, Ray Calvin. January 1968 (has links)
Thesis (Ph. D.)--Oregon State University, 1968. / Typescript (photocopy). Includes bibliographical references (leaves 87-88). Also available on the World Wide Web.
|
82 |
Parametric POMDPs for planning in continuous state spaces / Brooks, Alex M. January 2007 (has links)
Thesis (Ph. D.)--University of Sydney, 2007. / Title from title screen (viewed 15 January 2009). Submitted in fulfilment of the requirements for the degree of Doctor of Philosophy to the Australian Centre for Field Robotics, School of Aerospace, Mechanical and Mechatronic Engineering. Includes bibliographical references. Also available in print form.
|
83 |
Contractive Markov systems / Werner, Ivan January 2004 (has links)
We introduce a theory of contractive Markov systems (CMS) which provides a unifying framework in so-called "fractal" geometry. It extends the known theory of iterated function systems (IFS) with place-dependent probabilities [1][8] in such a way that it also covers graph-directed constructions of "fractal" sets [18]. Such systems naturally extend finite Markov chains and inherit some of their properties. In Chapter 1, we consider iterations of a Markov system and show that they preserve its essential structure. In Chapter 2, we show that the Markov operator defined by such a system has a unique invariant probability measure in the irreducible case and an attractive probability measure in the aperiodic case if the restrictions of the probability functions to their vertex sets are Dini-continuous and bounded away from zero, and the system satisfies a condition of contractiveness on average. This generalizes a result from [1]. Furthermore, we show that the rate of convergence to the stationary state is exponential in the aperiodic case with constant probabilities and a compact state space. In Chapter 3, we construct a coding map for a contractive Markov system. In Chapter 4, we calculate the Kolmogorov-Sinai entropy of the generalized Markov shift. In Chapter 5, we prove an ergodic theorem for Markov chains associated with contractive Markov systems. It generalizes the ergodic theorem of Elton [8].
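The iterated function systems with place-dependent probabilities that this thesis generalizes can be sketched in a few lines. The two contraction maps and the probability function below are illustrative assumptions chosen for the sketch, not taken from the thesis.

```python
import random

def simulate_ifs(x0=0.5, steps=1000, seed=1):
    """Iterate an IFS with place-dependent probabilities on [0, 1].

    At each step one of two contractions w0, w1 is applied; the
    probability of choosing w1 depends on the current point x.
    The maps and the probability function p are hypothetical examples.
    """
    w0 = lambda x: 0.5 * x          # contraction onto [0, 0.5]
    w1 = lambda x: 0.5 * x + 0.5    # contraction onto [0.5, 1]
    p = lambda x: 0.3 + 0.4 * x     # place-dependent probability of w1
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(steps):
        x = w1(x) if rng.random() < p(x) else w0(x)
        path.append(x)
    return path

orbit = simulate_ifs()
```

Because both maps are contractions and the probabilities are bounded away from zero, the empirical distribution of such an orbit converges to the system's invariant measure in the well-behaved cases the thesis studies.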
|
84 |
Mixtures of exponential and geometric distributions, clumped Markov models with applications to biomedical research / Tan, Jen Ning January 2010 (has links)
No description available.
|
85 |
Markov chains : a graph theoretical approach / Marcon, Sinclair Antony 01 May 2013 (has links)
M.Sc. (Mathematics) / In chapter 1, we give the reader some background concerning digraphs that are used in the discussion of Markov chains; namely, their Markov digraphs. Warshall’s Algorithm for reachability is also introduced, as this is used to define terms such as transient states and irreducibility. Some initial theory and definitions concerning Markov chains and their corresponding Markov digraphs are given in chapter 2. Furthermore, we discuss l-step transitions (walks of length l) for homogeneous and inhomogeneous Markov chains and other generalizations. In chapter 3, we define terms such as communication, intercommunication, recurrence and transience. We also prove some results regarding the irreducibility of some Markov chains through the use of the reachability matrix. Furthermore, periodicity and aperiodicity are investigated, and the existence of walks of any length greater than some specified integer N is also considered. A discussion on random walks on an undirected torus is also contained in this chapter. In chapter 4, we explore stationary distributions and what it means for a Markov chain to be reversible. Furthermore, the hitting time and the mean hitting time in a Markov digraph are defined, and the theorems regarding them are proved, as are the theorems concerning the existence and uniqueness of stationary distributions and the Markov Chain Convergence Theorem. Later in this chapter, we define the Markov digraph of an undirected graph, which is a Markov chain as well, and apply the existing theory to these. In chapter 5, we explore how to simulate Markov chains on a computer by using Markov Chain Monte Carlo methods. We also show how these apply to random q-colourings of undirected graphs. Finally, in chapter 6, we consider a physical application of these Graph Theoretical concepts: the simulation of the Ising model.
Initially, the relevant concepts of the Potts model are given, and then the Gibbs sampler algorithm of chapter 5 is modified and used to simulate the Ising model. A relation between the chromatic polynomial and the partition function of the Potts model is also demonstrated.
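A Gibbs sampler for the Ising model of the kind described above can be sketched as follows. The lattice size, inverse temperature, and sweep count are illustrative assumptions for the sketch, not parameters from the thesis.

```python
import math
import random

def ising_gibbs(n=16, beta=0.4, sweeps=100, seed=0):
    """Gibbs sampler for the Ising model on an n x n torus.

    Each site's spin is resampled from its conditional distribution
    given its four neighbours (periodic boundary conditions).
    """
    rng = random.Random(seed)
    spins = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the four neighbouring spins on the torus.
                s = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j]
                     + spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
                # Conditional probability that the spin at (i, j) is +1.
                p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                spins[i][j] = 1 if rng.random() < p_up else -1
    return spins

config = ising_gibbs()
magnetisation = sum(map(sum, config)) / 16 ** 2
```

Each full sweep is one step of the underlying Markov chain, whose stationary distribution is the Ising (two-state Potts) Gibbs measure at inverse temperature beta.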
|
86 |
POMDP compression and decomposition via belief state analysis / Li, Xin 01 January 2009 (has links)
No description available.
|
87 |
An experimental evaluation of Markov channel models / Zhou, Wenge 05 September 2012 (has links)
M.Ing. / The main contribution of this thesis can be summarized as follows. Firstly, we implemented a high-speed error gap recording system, which can run, with at most a slight adaptation, on any personal computer, recording error gaps for digital communication systems at a data rate of up to 200 kbits/sec. Secondly, we extended previous experimental investigations of the Fritchman channel model to three other channel models, namely the Gilbert, Gilbert-Elliott, and Aldridge-Ghanbari models, and implemented these models experimentally by a set of computer programs. Thirdly, we investigated the statistical modeling of two analog channels employed for digital transmission. These are an analog audio cassette (magnetic tape) recorder and an analog cordless telephone operating in the 46/49 MHz band, in the CT1 system. No evidence could be found in the literature of modeling error distributions on these two channels.
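The Gilbert-Elliott model mentioned above is a two-state Markov chain in which each state has its own bit-error rate; a minimal simulation sketch follows. All transition probabilities and error rates here are illustrative assumptions, not measured values from the thesis.

```python
import random

def gilbert_elliott(n_bits=10000, p_gb=0.01, p_bg=0.1,
                    e_good=0.001, e_bad=0.3, seed=2):
    """Generate a bit-error sequence from a Gilbert-Elliott channel.

    The channel alternates between a 'good' and a 'bad' state via a
    two-state Markov chain (transition probabilities p_gb, p_bg); each
    state has its own error rate (e_good, e_bad). Errors are marked 1.
    """
    rng = random.Random(seed)
    state = "good"
    errors = []
    for _ in range(n_bits):
        err_rate = e_good if state == "good" else e_bad
        errors.append(1 if rng.random() < err_rate else 0)
        # State transition of the underlying Markov chain.
        if state == "good" and rng.random() < p_gb:
            state = "bad"
        elif state == "bad" and rng.random() < p_bg:
            state = "good"
    return errors

seq = gilbert_elliott()
```

The gap-recording system described in the abstract measures exactly the statistic such a model is fitted to: the distribution of run lengths of zeros between successive error bits.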
|
88 |
Space localisation of the Feller property with application to Lévy-type processes / Haugomat, Tristan 11 July 2018 (has links)
In this PhD thesis, we give a space localisation for the theory of Feller processes. A first objective is to obtain simple and precise results on the convergence of Markov processes. A second objective is to study the link between the notions of Feller property, martingale problem and Skorokhod topology. First, we give a localised version of the Skorokhod topology and study the notions of compactness and tightness for it. We make the connection between the localised and unlocalised Skorokhod topologies by using the notion of time change. In a second step, using the localised Skorokhod topology and the time change, we study martingale problems. We show that, for a process, the following are equivalent: being the solution of a well-posed martingale problem; satisfying a localised version of the Feller property; and being a Markov process weakly continuous with respect to its initial condition. We characterise the weak convergence of solutions of martingale problems in terms of convergence of the associated operators, and give a similar result for discrete-time approximations. Finally, we apply the theory of locally Feller processes to two examples. We first apply it to Lévy-type processes and obtain convergence results for discrete- and continuous-time processes, including simulation methods and Euler schemes. We then apply the same theory to one-dimensional diffusions in a potential and obtain convergence results for diffusions or random walks towards singular diffusions. As a consequence, we deduce the convergence of random walks in random environment towards diffusions in random potential.
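The Euler schemes and diffusions in a potential mentioned above can be illustrated with a standard Euler-Maruyama sketch for dX_t = -V'(X_t) dt + dB_t. The quadratic potential, step size, and horizon are illustrative assumptions and not taken from the thesis, which treats far more singular potentials.

```python
import math
import random

def euler_diffusion(potential_grad, x0=0.0, dt=1e-3, steps=1000, seed=3):
    """Euler-Maruyama scheme for the SDE dX_t = -V'(X_t) dt + dB_t.

    potential_grad is the derivative V' of the potential; each step adds
    the drift -V'(x) dt and a Gaussian increment of variance dt.
    """
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        x += -potential_grad(x) * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x

# Diffusion in the quadratic potential V(x) = x**2 / 2
# (an Ornstein-Uhlenbeck process), run up to time steps * dt = 1.
x_final = euler_diffusion(lambda x: x)
```

The convergence results of the thesis concern exactly when such discrete schemes converge in law to the limiting (possibly singular) diffusion as dt tends to zero.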
|
89 |
State-similarity metrics for continuous Markov decision processes / Ferns, Norman Francis January 2007 (has links)
No description available.
|
90 |
Asymptotic behavior of stochastic systems possessing Markovian realizations / Meyn, S. P. (Sean P.) January 1987 (has links)
No description available.
|