21

On some special-purpose hidden Markov models / Einige Erweiterungen von Hidden Markov Modellen für spezielle Zwecke

Langrock, Roland 28 April 2011 (has links)
No description available.
22

Automatische Erkennung von Zuständen in Anthropomatiksystemen / Automatic recognition of states in anthropomatics systems

Moldenhauer, Jörg January 2005 (has links)
Also published as: doctoral dissertation, Universität Karlsruhe, 2005.
23

Automatische Erkennung von Zuständen in Anthropomatiksystemen / Automatic recognition of states in anthropomatics systems

Moldenhauer, Jörg. January 2006 (has links)
Doctoral dissertation, Universität Karlsruhe, 2005.
24

Training of Hidden Markov models as an instance of the expectation maximization algorithm

Majewsky, Stefan 27 July 2017 (has links) (PDF)
In Natural Language Processing (NLP), speech and text are parsed and generated with language models and parser models, and translated with translation models. Each model contains a set of numerical parameters that are found by applying a suitable training algorithm to a set of training data. Many such training algorithms are instances of the Expectation-Maximization (EM) algorithm. In [BSV15], a generic EM algorithm for NLP is described. This work presents a particular speech model, the Hidden Markov model, and its standard training algorithm, the Baum-Welch algorithm. It is then shown that the Baum-Welch algorithm is an instance of the generic EM algorithm introduced in [BSV15], from which it follows that all statements about the generic EM algorithm also apply to the Baum-Welch algorithm, in particular its correctness and convergence properties.
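For readers unfamiliar with the algorithm discussed in this abstract, a minimal sketch of Baum-Welch training for a discrete hidden Markov model is given below, showing the E-step (forward-backward) and M-step (re-estimation) structure typical of EM. The variable names, initialization, and toy data are illustrative assumptions and are not taken from the thesis or from [BSV15].

```python
# Minimal Baum-Welch (EM) sketch for a discrete HMM; illustrative only.
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    # Random initialization of initial, transition, and emission distributions.
    pi = rng.dirichlet(np.ones(n_states))
    A = rng.dirichlet(np.ones(n_states), size=n_states)    # A[i, j] = P(state j | state i)
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)   # B[i, k] = P(symbol k | state i)
    T = len(obs)

    for _ in range(n_iter):
        # E-step: scaled forward-backward recursions.
        alpha = np.zeros((T, n_states))
        beta = np.zeros((T, n_states))
        scale = np.zeros(T)
        alpha[0] = pi * B[:, obs[0]]
        scale[0] = alpha[0].sum()
        alpha[0] /= scale[0]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
            scale[t] = alpha[t].sum()
            alpha[t] /= scale[t]
        beta[T - 1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

        gamma = alpha * beta                        # posterior state probabilities
        gamma /= gamma.sum(axis=1, keepdims=True)
        xi = np.zeros((n_states, n_states))         # expected transition counts
        for t in range(T - 1):
            num = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
            xi += num / num.sum()

        # M-step: re-estimate parameters from expected counts.
        pi = gamma[0]
        A = xi / xi.sum(axis=1, keepdims=True)
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= B.sum(axis=1, keepdims=True)
    return pi, A, B

# Toy usage: fit a 2-state HMM to a short binary observation sequence.
obs = np.array([0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0])
pi, A, B = baum_welch(obs, n_states=2, n_symbols=2)
```

Each iteration computes expected state-occupation and transition counts under the current parameters (E-step) and then re-normalizes those counts into new parameters (M-step), which is the sense in which Baum-Welch instantiates the EM scheme.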
25

Improvement of the jpHMM approach to recombination detection in viral genomes and its application to HIV and HBV / Verbesserung des jpHMM-Ansatzes zur Rekombinationsvorhersage in viralen Genomen und dessen Anwendung auf HIV und HBV

Schultz, Anne-Kathrin 27 April 2011 (has links)
No description available.
26

Application of Hidden Markov and Hidden Semi-Markov Models to Financial Time Series

Bulla, Jan 06 July 2006 (has links)
No description available.
27

Models of Discrete-Time Stochastic Processes and Associated Complexity Measures / Modelle stochastischer Prozesse in diskreter Zeit und zugehörige Komplexitätsmaße

Löhr, Wolfgang 24 June 2010 (has links) (PDF)
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
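As background for this abstract, the standard definition of statistical complexity from computational mechanics is sketched below in the usual causal-state notation; this is a generally accepted textbook formulation and is not quoted from the thesis itself.

```latex
% Causal equivalence of pasts: two histories are equivalent iff they induce
% the same conditional distribution over futures.
\[
  \overleftarrow{x} \sim_{\varepsilon} \overleftarrow{x}'
  \quad\Longleftrightarrow\quad
  \Pr\!\left(\overrightarrow{X} \,\middle|\, \overleftarrow{X} = \overleftarrow{x}\right)
  = \Pr\!\left(\overrightarrow{X} \,\middle|\, \overleftarrow{X} = \overleftarrow{x}'\right)
\]
% Statistical complexity: Shannon entropy of the stationary distribution
% over causal states (the equivalence classes of pasts).
\[
  C_{\mu} \;=\; H[\mathcal{S}]
  \;=\; -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_{2} \Pr(\sigma)
\]
```

Here $\mathcal{S}$ denotes the set of causal states and $H$ the Shannon entropy; the thesis studies properties of $C_{\mu}$, such as weak-* lower semi-continuity and concavity, as a function on the space of processes.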
28

Nichtlineare Regimewechselmodelle: theoretische und empirische Evidenz am deutschen Kapitalmarkt / Nonlinear regime-switching models: theoretical and empirical evidence from the German capital market

Brannolte, Cord. January 2002 (has links) (PDF)
Doctoral dissertation, Universität Kiel, 2001.
29

Uncertainty and the stability of financial markets in open economies: empirical evidence from regime-switching models

Tillmann, Peter. January 2003 (has links) (PDF)
Doctoral dissertation, Universität Köln, 2003.
30

Dynamical characterization of Markov processes with varying order

Bauer, Michael 01 July 2008 (has links)
Time-delayed actions are an essential component of numerous systems, especially evolution processes, natural phenomena, and particular technical applications, and are associated with the existence of a memory. Under common conditions, external forces or state-dependent parameters modify the length of the delay over time. Consequently, an altered dynamical behavior emerges, whose characterization is necessary for a deeper understanding of these processes. In this thesis, the well-investigated class of time-homogeneous finite-state Markov processes is used to establish a variation of memory length by combining a first-order Markov chain with a memoryless Markov chain of order zero. The fluctuations induce a non-stationary process, which is realized for two special cases: a periodic and a random selection of the available Markov chains. For both cases, the Kolmogorov-Sinai entropy is derived analytically as a characteristic property and compared to numerical approximations of the entropy rate of the related symbolic dynamics. The convergence of per-symbol and conditional entropies is examined in order to understand their behavior when identifying unknown processes. Additionally, the connection between Markov processes with varying memory length and hidden Markov models is illustrated, enabling further analysis. On that basis, the Kolmogorov-Sinai entropy of hidden Markov chains is calculated by means of Blackwell's entropy rate, which involves Blackwell's measure. These results are used to verify the previous computations.
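A toy illustration of the kind of construction described in this abstract is sketched below: a binary sequence is generated by periodically switching between a first-order Markov chain and a memoryless (order-zero) chain, and the entropy rate is then approximated numerically by conditional block entropies h_L = H(L) - H(L-1). The specific transition matrices and the switching period are illustrative assumptions, not parameters taken from the thesis.

```python
# Varying-order Markov toy model: periodic switching between a first-order
# chain and an i.i.d. (order-zero) chain, with an entropy-rate estimate
# from conditional block entropies. Illustrative sketch only.
from collections import Counter
import numpy as np

rng = np.random.default_rng(1)

A1 = np.array([[0.9, 0.1],        # first-order chain: strongly persistent
               [0.2, 0.8]])
p0 = np.array([0.5, 0.5])         # order-zero chain: fair coin

def generate(n, period=4):
    """Emit n binary symbols, alternating regimes every `period` steps."""
    x = [0]
    for t in range(1, n):
        if (t // period) % 2 == 0:            # first-order regime
            probs = A1[x[-1]]
        else:                                  # memoryless regime
            probs = p0
        x.append(rng.choice(2, p=probs))
    return np.array(x)

def block_entropy(x, L):
    """Shannon entropy (bits) of the empirical distribution of L-blocks."""
    blocks = Counter(tuple(x[i:i + L]) for i in range(len(x) - L + 1))
    counts = np.array(list(blocks.values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

x = generate(200_000)
H = [0.0] + [block_entropy(x, L) for L in range(1, 9)]
h = [H[L] - H[L - 1] for L in range(1, 9)]    # conditional entropies h_L
print("conditional entropies h_L:", np.round(h, 4))
# h_L decreases toward an estimate of the entropy rate as the block length grows.
```

The decay of h_L with increasing block length L mirrors the convergence behavior of per-symbol and conditional entropies that the abstract refers to; the analytic Kolmogorov-Sinai entropy discussed in the thesis is the limit such numerical estimates approximate.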
