601 |
Improved State Estimation For Jump Markov Linear Systems / Orguner, Umut, 01 December 2006 (has links) (PDF)
This thesis presents a comprehensive framework of examples showing how current multiple model state estimation algorithms for jump Markov linear systems can be improved. The possible improvements are categorized as:
-Design of multiple model state estimation algorithms using
new criteria.
-Improvements obtained using existing multiple model state
estimation algorithms.
In the first category, risk-sensitive estimation is proposed for
jump Markov linear systems. Two types of cost functions related to risk-sensitive estimation, namely the instantaneous and the cumulative cost functions, are examined, and for each the corresponding multiple model state estimation algorithm is derived.
For the cumulative cost function, the derivation uses the reference probability method, in which one defines and works under a new probability measure under which the processes involved have independence properties. The performance of the proposed risk-sensitive filters is illustrated and compared with that of conventional algorithms in simulations.
The thesis addresses the second category of improvements by
proposing
-Two new online transition probability estimation schemes for
jump Markov linear systems.
-A mixed multiple model state estimation scheme which combines
desirable properties of two different multiple model state
estimation methods.
The two online transition probability estimators proposed use the recursive Kullback-Leibler (RKL) procedure and the maximum likelihood (ML) criterion, respectively, to derive the corresponding identification schemes. When used in state estimation, these methods yield a decrease in the average root mean square (RMS) state estimation error, as demonstrated in simulation studies.
The mixed multiple model estimation procedure, which builds on an analysis of the single-Gaussian approximation of Gaussian mixtures in Bayesian filtering, efficiently combines the IMM (Interacting Multiple Model) filter and the GPB2 (second-order Generalized Pseudo-Bayesian) filter. The resulting algorithm reaches the performance of GPB2 with fewer Kalman filters.
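To make the multiple model machinery concrete, the following is a minimal sketch of one cycle of the standard IMM filter for a jump Markov linear system, written in Python with NumPy. It shows only the conventional IMM recursion under assumed known mode matrices and transition probabilities; it is not the thesis's mixed IMM/GPB2 algorithm, its risk-sensitive filters, or its online transition probability estimators, and all variable names are illustrative.

```python
# Minimal sketch of one Interacting Multiple Model (IMM) cycle for a jump
# Markov linear system, assuming known mode dynamics (A, C, Q, R) and a fixed
# transition probability matrix Pi.
import numpy as np

def imm_step(x, P, mu, Pi, A, C, Q, R, y):
    """One IMM cycle.

    x  : list of per-mode state means (n,)
    P  : list of per-mode state covariances (n, n)
    mu : mode probabilities (M,)
    Pi : mode transition matrix, Pi[i, j] = P(mode j | mode i)
    A, C, Q, R : lists of per-mode system matrices
    y  : current measurement (m,)
    """
    M = len(mu)
    # 1. Mixing: predicted mode probabilities and mixing weights.
    c = Pi.T @ mu                                    # c[j] = sum_i Pi[i, j] mu[i]
    w = (Pi * mu[:, None]) / c[None, :]              # w[i, j] = P(mode i | mode j next)
    x_mix = [sum(w[i, j] * x[i] for i in range(M)) for j in range(M)]
    P_mix = [sum(w[i, j] * (P[i] + np.outer(x[i] - x_mix[j], x[i] - x_mix[j]))
                 for i in range(M)) for j in range(M)]
    # 2. Mode-matched Kalman filtering.
    x_new, P_new, lik = [], [], np.zeros(M)
    for j in range(M):
        xp = A[j] @ x_mix[j]
        Pp = A[j] @ P_mix[j] @ A[j].T + Q[j]
        S = C[j] @ Pp @ C[j].T + R[j]
        K = Pp @ C[j].T @ np.linalg.inv(S)
        r = y - C[j] @ xp                            # innovation
        x_new.append(xp + K @ r)
        P_new.append((np.eye(len(xp)) - K @ C[j]) @ Pp)
        lik[j] = np.exp(-0.5 * r @ np.linalg.solve(S, r)) / \
                 np.sqrt(np.linalg.det(2 * np.pi * S))
    # 3. Mode probability update and output combination.
    mu_new = lik * c
    mu_new /= mu_new.sum()
    x_out = sum(mu_new[j] * x_new[j] for j in range(M))
    P_out = sum(mu_new[j] * (P_new[j] + np.outer(x_new[j] - x_out, x_new[j] - x_out))
                for j in range(M))
    return x_new, P_new, mu_new, x_out, P_out
```

GPB2 would instead run one Kalman filter per mode pair (M squared filters per step), which is why a scheme reaching GPB2 performance with fewer filters is attractive.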
|
602 |
Hydro-climatic forecasting using sea surface temperatures / Chen, Chia-Jeng, 20 June 2012 (has links)
A key determinant of atmospheric circulation patterns and regional climatic conditions is sea surface temperature (SST). This has been the motivation for the development of various teleconnection methods aiming to forecast hydro-climatic variables. Among such methods are linear projections based on teleconnection gross indices (such as the ENSO, IOD, and NAO) or leading empirical orthogonal functions (EOFs). However, these methods deteriorate drastically if the predefined indices or EOFs cannot account for climatic variability in the region of interest. This study introduces a new hydro-climatic forecasting method that identifies SST predictors in the form of dipole structures. An SST dipole that mimics major teleconnection patterns is defined as a function of average SST anomalies over two oceanic areas of appropriate sizes and geographic locations. The screening process of SST-dipole predictors is based on an optimization algorithm that sifts through all possible dipole configurations (with progressively refined data resolutions) and identifies dipoles with the strongest teleconnection to the external hydro-climatic series. The strength of the teleconnection is measured by the Gerrity Skill Score. The significant dipoles are cross-validated and used to generate ensemble hydro-climatic forecasts. The dipole teleconnection method is applied to the forecasting of seasonal precipitation over the southeastern US and East Africa, and the forecasting of streamflow-related variables in the Yangtze and Congo Rivers. These studies show that the new method is indeed able to identify dipoles related to well-known patterns (e.g., ENSO and IOD) as well as to quantify more prominent predictor-predictand relationships at different lead times. Furthermore, the dipole method compares favorably with existing statistical forecasting schemes. An operational forecasting framework to support better water resources management through coupling with detailed hydrologic and water resources models is also demonstrated.
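As a rough illustration of the dipole idea, the sketch below (Python/NumPy) forms a dipole index as the difference of area-averaged SST anomalies over two boxes and ranks candidate box pairs against a target hydro-climatic series. For brevity it uses a simple lagged correlation as the screening measure instead of the Gerrity Skill Score, and it omits the progressive data-resolution refinement and cross-validation of the actual method; the array shapes, box definitions, and brute-force loop are assumptions made for the example.

```python
# A minimal sketch of forming and screening SST-dipole predictors.
import numpy as np

def dipole_index(sst_anom, box1, box2):
    """sst_anom: (time, lat, lon) anomalies; box = (lat0, lat1, lon0, lon1) index bounds."""
    a = sst_anom[:, box1[0]:box1[1], box1[2]:box1[3]].mean(axis=(1, 2))
    b = sst_anom[:, box2[0]:box2[1], box2[2]:box2[3]].mean(axis=(1, 2))
    return a - b                                  # difference of the two area averages

def screen_dipoles(sst_anom, target, candidate_boxes, lead=1):
    """Rank candidate (box1, box2) pairs by lagged correlation with the target series."""
    scores = []
    for box1, box2 in candidate_boxes:
        idx = dipole_index(sst_anom, box1, box2)
        # predictor leads the target by `lead` time steps
        r = np.corrcoef(idx[:-lead], target[lead:])[0, 1]
        scores.append((abs(r), box1, box2))
    return sorted(scores, reverse=True)           # strongest teleconnections first
```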
|
603 |
The development and assessment of techniques for daily rainfall disaggregation in South Africa / Knoesen, Darryn Marc, January 2005 (has links)
The temporal distribution of rainfall, viz. the distribution of rainfall intensity during a storm, is an important factor affecting the timing and magnitude of peak flow from a catchment and hence the flood-generating potential of rainfall events. It is also one of the primary inputs into hydrological models used for hydraulic design purposes. The use of short duration rainfall data inherently accounts for the temporal distribution of rainfall; however, there is a relative paucity of short duration data when compared to the more abundantly available daily data. One method of overcoming this is to disaggregate coarser-scale data to a finer resolution, e.g. daily to hourly. A daily to hourly rainfall disaggregation model developed by Boughton (2000b) in Australia has been modified and applied in South Africa. The primary part of the model is the distribution of R, which is the fraction of the daily total that occurs in the hour of maximum rainfall. A random number is used to sample from the distribution of R at the site of interest. The sampled value of R determines the other 23 values, which then undergo a clustering procedure. This clustered sequence is then arranged into 1 of 24 possible temporal arrangements, depending on the hour in which the maximum rainfall occurs. The structure of the model allows for the production of 480 different temporal distributions, with variation between uniform and non-uniform rainfall. The model was then regionalised to allow for application at sites where daily rainfall data, but no short duration data, were available. The model was evaluated at 15 locations in differing climatic regions of South Africa. At each location, observed hourly rainfall data were aggregated to yield 24-hour values, and these were then disaggregated using the methodology. Results show that the model is able to retain the daily total and most of the characteristics of the hourly rainfall at the site, both when at-site and when regional information are used. The model is, however, less capable of simulating statistics related to the sequencing of hourly rainfalls, e.g. autocorrelations. The model also tends to over-estimate design rainfalls, particularly for the shorter durations. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2005.
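The following is a much-simplified sketch (Python/NumPy) of the disaggregation step described above: a daily total is split into 24 hourly depths by sampling R, spreading the remaining fraction over the other 23 hours, and placing the wettest hour at a sampled position. The geometric decay and the uniform choice of peak hour are placeholders standing in for the model's clustering procedure and its 24 temporal arrangements; this is not the Boughton (2000b) formulation itself.

```python
# Simplified daily-to-hourly disaggregation: preserve the daily total, put a
# sampled fraction R in the wettest hour, and distribute the rest.
import numpy as np

rng = np.random.default_rng(42)

def disaggregate_day(daily_total, r_sample, decay=0.7):
    """Return 24 hourly depths whose sum equals daily_total."""
    # fractions for the 23 non-peak hours, decaying away from the peak
    tail = decay ** np.arange(1, 24)
    tail = (1.0 - r_sample) * tail / tail.sum()
    fractions = np.concatenate(([r_sample], tail))   # peak hour placed first
    peak_hour = rng.integers(0, 24)                   # 1 of 24 possible arrangements
    fractions = np.roll(fractions, peak_hour)         # rotate the peak into position
    return daily_total * fractions

hourly = disaggregate_day(daily_total=36.0, r_sample=0.45)
assert abs(hourly.sum() - 36.0) < 1e-9               # the daily total is retained
```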
|
604 |
Linear and non-linear boundary crossing probabilities for Brownian motion and related processes / Wu, Tung-Lung Jr, 12 1900 (has links)
We propose a simple and general method to obtain the boundary crossing probability for Brownian motion. This method can be easily extended to higher-dimensional Brownian motion, and it also covers certain classes of stochastic processes associated with Brownian motion. The basic idea of the method is to construct a finite Markov chain such that the boundary crossing probability of Brownian motion is obtained as the limiting probability of the finite Markov chain entering a set of absorbing states induced by the boundary. Numerical results are
given to illustrate our method.
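A rough numerical illustration of this construction is sketched below in Python/NumPy: Brownian motion is approximated by a simple random walk on a space-time lattice, lattice states on or beyond the boundary are made absorbing, and the absorbed probability mass is accumulated. The careful chain construction and limit arguments of the thesis are not reproduced; the linear boundary, grid sizes, and step counts are illustrative choices only.

```python
# Finite-Markov-chain approximation of a one-sided linear boundary crossing
# probability for standard Brownian motion, P( W_t >= a + c*t for some t <= T ).
import numpy as np

def crossing_probability(a, c, T=1.0, n_steps=4000):
    """Crude lattice estimate of the crossing probability; assumes a > 0."""
    dt = T / n_steps
    dx = np.sqrt(dt)                              # walk steps +-dx with prob 1/2 each
    n_up = int(np.ceil((a + max(c, 0.0) * T) / dx)) + 10   # a little past the boundary
    n_down = int(np.ceil(6 * np.sqrt(T) / dx))    # about 6 standard deviations below 0
    grid = np.arange(-n_down, n_up + 1) * dx
    p = np.zeros(len(grid))
    p[n_down] = 1.0                               # start at x = 0
    absorbed = 0.0
    for k in range(1, n_steps + 1):
        p = 0.5 * (np.roll(p, 1) + np.roll(p, -1))        # one random-walk step
        p[0] = p[-1] = 0.0                                # discard wrap-around mass
        hit = grid >= a + c * k * dt                      # states past the boundary
        absorbed += p[hit].sum()                          # enter the absorbing set
        p[hit] = 0.0
    return absorbed

# Sanity check against the reflection principle for c = 0, a = 1, T = 1:
# exact value 2 * (1 - Phi(1)) ~ 0.3173; the lattice estimate sits slightly below.
print(crossing_probability(a=1.0, c=0.0))
```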
|
605 |
Stable iterated function systems / Gadde, Erland, January 1992 (has links)
The purpose of this thesis is to generalize the growing theory of iterated function systems (IFSs). Earlier, hyperbolic IFSs with finitely many functions have been studied extensively, and hyperbolic IFSs with infinitely many functions have also been studied. In this thesis, more general IFSs are studied. The Hausdorff pseudometric, a generalization of the Hausdorff metric, is studied. Wide and narrow limit sets are studied; these are two types of limits of sequences of sets in a complete pseudometric space. Stable iterated function systems, a kind of generalization of hyperbolic IFSs, are defined, and some different but closely related types of stability for the IFSs are considered. It is proved that the IFSs with the most general type of stability have unique attractors. Invariant sets, addressing, and periodic points for stable IFSs are also studied. Hutchinson's metric (also called Vaserstein's metric) is generalized from being defined on a space of probability measures into a class of norms, the £-norms, on a space of real measures (on certain metric spaces). Under rather general conditions, it is proved that these norms, when restricted to positive measures, give rise to complete metric spaces whose metric topology coincides with the weak*-topology. Then IFSs with probabilities (IFSPs) are studied, in particular stable IFSPs. The £-norm results are used to prove that, as in the case of hyperbolic IFSPs, IFSPs with the most general kind of stability have unique invariant measures, and these measures are "attractive". An invariant measure is also constructed by first "lifting" the IFSP to the code space. Finally, it is proved that the Random Iteration Algorithm will, in a sense, "work" for some stable IFSPs. / Diss. Umeå : Umeå universitet, 1992
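As a concrete illustration of the Random Iteration Algorithm mentioned above, the sketch below (Python/NumPy) runs the "chaos game" for a classical hyperbolic IFS with probabilities, the Sierpinski-triangle maps. The stable, possibly non-hyperbolic systems treated in the thesis are more general; the maps, probabilities, and burn-in length here are standard textbook choices, not taken from the thesis.

```python
# Random Iteration Algorithm for an IFS with probabilities (IFSP): iterate a
# randomly chosen contraction and record the orbit; the empirical distribution
# of the recorded points approximates the invariant ("attractive") measure.
import numpy as np

rng = np.random.default_rng(0)

# Three affine contractions f_i(x) = 0.5 * (x + v_i) and their probabilities.
vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
probs = np.array([1 / 3, 1 / 3, 1 / 3])

def random_iteration(n_points=100_000, burn_in=100):
    x = np.array([0.3, 0.3])          # arbitrary starting point
    points = np.empty((n_points, 2))
    for k in range(burn_in + n_points):
        i = rng.choice(3, p=probs)    # pick a map according to the IFSP weights
        x = 0.5 * (x + vertices[i])   # apply the chosen contraction
        if k >= burn_in:
            points[k - burn_in] = x
    return points

pts = random_iteration()              # scatter of `pts` traces the attractor
```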
|
607 |
Analysis Of Stochastic And Non-stochastic Volatility Models / Ozkan, Pelin, 01 September 2004 (has links) (PDF)
Changes in variance, or volatility, over time can be modeled as deterministic by using autoregressive conditional heteroscedastic (ARCH) type models, or as stochastic by using stochastic volatility (SV) models. This study compares these two kinds of models, which are estimated on Turkish/USA exchange rate data. First, a GARCH(1,1) model is fitted to the data using the package EViews, and then a Bayesian estimation procedure is used to estimate an appropriate SV model with the help of Ox code. In order to compare these models, the LR test statistic for non-nested hypotheses is calculated.
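For illustration, the sketch below fits a GARCH(1,1) model to a return series by quasi-maximum likelihood in Python (NumPy/SciPy), as a stand-in for the EViews estimation step; the Bayesian estimation of the SV model in Ox and the non-nested LR comparison are not reproduced. Starting values, the optimizer choice, and the simulated data are assumptions of the example.

```python
# Quasi-maximum-likelihood fit of a GARCH(1,1) model:
#   h[t] = omega + alpha * r[t-1]^2 + beta * h[t-1],  r[t] ~ N(0, h[t]).
import numpy as np
from scipy.optimize import minimize

def garch11_negloglik(params, r):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                      # enforce positivity / covariance stationarity
    h = np.empty_like(r)
    h[0] = r.var()                         # initialize with the sample variance
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

def fit_garch11(returns):
    x0 = np.array([0.1 * returns.var(), 0.05, 0.90])
    res = minimize(garch11_negloglik, x0, args=(returns,), method="Nelder-Mead")
    return res.x                           # (omega, alpha, beta)

# Example on simulated data (a real application would use exchange-rate returns):
rng = np.random.default_rng(1)
r = rng.standard_normal(2000) * 0.01
print(fit_garch11(r))
```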
|
608 |
封閉式等候網路機率分配之估計與分析 / Estimation of Probability Distributions on Closed Queueing Networks / 莊依文, Unknown Date (has links)
In this thesis we discuss a two-stage closed queueing network in which the service time distributions are all of phase type. We conjecture that the service time distributions and the inter-departure time distributions satisfy a system of equations. We then derive that the stationary probabilities of the non-boundary states can be expressed as a linear combination of product-forms, where each product-form can be constructed from the roots of the system of equations. Using the stationary probabilities of the non-boundary states, the probabilities of the boundary states can be obtained. Finally, we establish an algorithm for computing the stationary probabilities; using this algorithm, the complexity of the computation is reduced. / In this thesis, we are concerned with the properties of a two-stage closed system in which the service times are identically of phase type. We first conjecture that the Laplace-Stieltjes transforms (LSTs) of the service time distributions satisfy a system of equations. We then show that the stationary probabilities of the non-boundary states can be written as a linear combination of product-forms, where each component of these products can be expressed in terms of roots of the system of equations. Finally, we establish an algorithm to obtain all the stationary probabilities. The algorithm is expected to work well even for a relatively large number of customers in the system.
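As a small numerical point of reference (not the product-form algorithm of the thesis), the sketch below builds the generator of a two-stage closed network with exponential servers and N circulating customers and solves pi Q = 0 directly in Python/NumPy. The service rates and population size are arbitrary; with exponential service the stationary distribution reduces to a single geometric product form, which the linear combinations of product-forms described above generalize to phase-type service.

```python
# Direct solution of the stationary distribution of a two-stage closed
# queueing network with exponential servers (state k = customers at station 1).
import numpy as np

def stationary_two_stage(N=10, mu1=1.0, mu2=1.5):
    """Return pi over k = 0..N, where N - k customers sit at station 2."""
    Q = np.zeros((N + 1, N + 1))
    for k in range(N + 1):
        if k > 0:                 # station 1 completes service: one customer moves 1 -> 2
            Q[k, k - 1] = mu1
        if k < N:                 # station 2 completes service: one customer moves 2 -> 1
            Q[k, k + 1] = mu2
        Q[k, k] = -Q[k].sum()
    # Solve pi Q = 0 subject to sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(N + 1)])
    b = np.zeros(N + 2)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi                      # proportional to (mu2 / mu1)**k, a geometric product form

print(stationary_two_stage())
```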
|
609 |
Modélisation hiérarchique bayésienne des amas stellaires jeunes / Bayesian hierarchical modelling of young stellar clusters / Olivares Romero, Javier, 19 October 2017 (has links)
It now seems established that the majority of stars form in clusters (Carpenter 2000; Porras et al. 2003; Lada & Lada 2003). Understanding the origin and evolution of stellar populations is therefore one of the greatest challenges of modern astrophysics. Unfortunately, fewer than one tenth of these clusters remain gravitationally bound beyond a few hundred million years (Lada & Lada 2003). The study of stellar clusters must therefore take place before their dissolution into the Galaxy. The Dynamical Analysis of Nearby Clusters project (DANCe, Bouy et al. 2013), of which this work is part, provides the scientific framework for the analysis of nearby young clusters (NYCs) in the solar neighbourhood. The DANCe observations of the Pleiades open cluster offer a perfect opportunity for developing statistical tools aimed at analysing the early phases of cluster evolution. The statistical tool developed here is a probabilistic intelligent system that performs Bayesian inference of the parameters governing the probability density functions (PDFs) of the cluster population (PDFCPs). It has been tested with the Pleiades photometric and astrometric data of the DANCe survey. To avoid the subjectivity of the choice of priors, the intelligent system establishes them using the Bayesian hierarchical modelling (BHM) approach, in which the parameters of the prior distributions, which are also inferred from the data, are drawn from other distributions in a hierarchical way. In this BHM intelligent system, the true values of the PDFCPs are specified by stochastic and deterministic relations representing our knowledge of the physical parameters of the cluster. To perform the parametric inference, the likelihood, given these true values, accounts for the properties of the data set, in particular its heteroscedasticity and objects with missing values. The BHM obtains the posterior PDFs of the parameters of the PDFCPs, in particular those of the spatial, proper-motion, and luminosity distributions, which are the final scientific objectives of the DANCe project. In the BHM, each star in the catalogue contributes to the PDFs of the cluster parameters in proportion to its membership probability; the PDFCPs are therefore free of the sampling biases that result from selections truncated above a more or less arbitrarily defined probability threshold. As a by-product, the BHM also provides the PDFs of the cluster membership probability of each star in the input catalogue, which make it possible to identify probable cluster members and probable field contaminants. The method was successfully tested on synthetic data sets (with an area under the ROC curve of 0.99), from which a contamination rate of only 5.8% was estimated for the PDFCPs. These new methods yield and/or confirm important results on the astrophysical properties of the Pleiades cluster. First, the BHM discovered 200 new candidate members, representing 10% of the total cluster population. The results are in excellent agreement (99.6% of the 100,000 objects in the data set) with previous results from the literature, which provides an important external validation of the method. Finally, the derived present-day system mass distribution (PDSMD) is in general agreement with the previous results of Bouy et al. (2015), but has the invaluable advantage of much more robust uncertainties than those of previous methods. Thus, by better modelling the data set and eliminating unnecessary restrictions and simplifying assumptions, the new intelligent system, developed and tested in the present work, represents the state of the art for the statistical analysis of NYC populations. / The origin and evolution of stellar populations is one of the greatest challenges in modern astrophysics. It is known that the majority of stars have their origin in stellar clusters (Carpenter 2000; Porras et al. 2003; Lada & Lada 2003). However, less than one tenth of these clusters remain bound after the first few hundred million years (Lada & Lada 2003). Ergo, the understanding of the origin and evolution of stars demands meticulous analyses of stellar clusters at these crucial ages. The project Dynamical Analysis of Nearby Clusters (DANCe, Bouy et al. 2013), of which the present work is part, provides the scientific framework for the analysis of Nearby Young Clusters (NYCs) in the solar neighbourhood (< 500 pc). The carefully designed DANCe observations of the well-known Pleiades cluster provide the perfect case study for the development and testing of statistical tools aiming at the analysis of the early phases of cluster evolution. The statistical tool developed here is a probabilistic intelligent system that performs Bayesian inference for the parameters governing the probability density functions (PDFs) of the cluster population (PDFCPs). It has been benchmarked with the Pleiades photometric and astrometric data of the DANCe survey. As in any Bayesian framework, it requires the setting up of priors. To avoid the subjectivity of these, the intelligent system establishes them using the Bayesian hierarchical model (BHM) approach, in which the parameters of the prior distributions, which are also inferred from the data, are drawn from other distributions in a hierarchical way. In this BHM intelligent system, the true values of the PDFCPs are specified by stochastic and deterministic relations representing the state of knowledge of the NYC. To perform the parametric inference, the likelihood of the data, given these true values, accounts for the properties of the data set, especially its heteroscedasticity and missing-value objects. By properly accounting for these properties, the intelligent system: i) increases the size of the usable data set, with respect to previous studies working exclusively on fully observed objects, and ii) avoids biases associated with fully observed data sets and with restrictions to low-uncertainty objects (sigma-clipping procedures). The BHM returns the posterior PDFs of the parameters of the PDFCPs, particularly of the spatial, proper-motion, and luminosity distributions. In the BHM, each object in the data set contributes to the PDFs of the parameters proportionally to its likelihood. Thus, the PDFCPs are free of biases resulting from typical high-membership-probability selections (sampling bias). As a by-product, the BHM also gives the PDFs of the cluster membership probability for each object in the data set. These PDFs, together with an optimal probability classification threshold obtained from synthetic data sets, allow the classification of objects into cluster and field populations. This by-product classifier shows excellent results when applied to synthetic data sets (with an area under the ROC curve of 0.99). From the analysis of synthetic data sets, the expected value of the contamination rate for the PDFCPs is 5.8 ± 0.2%. The following are the most important astrophysical results of the BHM applied to the Pleiades cluster. First, used as a classifier, it finds ∼200 new candidate members, representing 10% new discoveries. Nevertheless, it shows outstanding agreement (99.6% of the 10^5 objects in the data set) with previous results from the literature. Second, the derived present-day system mass distribution (PDSMD) is in general agreement with the previous results of Bouy et al. (2015). Thus, by better modelling the data set and eliminating unnecessary restrictions on it, the new intelligent system, developed and tested in the present work, represents the state of the art for the statistical analysis of NYC populations.
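To illustrate how membership probabilities can weight each star's contribution to the cluster parameters, the sketch below computes cluster-versus-field membership probabilities for a one-dimensional, two-component Gaussian mixture with per-star measurement uncertainties (Python/NumPy/SciPy). The real BHM is multidimensional, handles missing values, and infers all hyper-parameters jointly; the component shapes, numbers, and simulated data here are placeholders, not the thesis's model.

```python
# Schematic cluster/field membership probabilities and a membership-weighted
# parameter estimate (no hard probability cut).
import numpy as np
from scipy.stats import norm

def membership_probabilities(x, sigma_x, amp, mu_c, sd_c, mu_f, sd_f):
    """x, sigma_x: observed values and their (heteroscedastic) uncertainties."""
    # Convolve each mixture component with the per-star measurement uncertainty.
    like_c = norm.pdf(x, loc=mu_c, scale=np.sqrt(sd_c ** 2 + sigma_x ** 2))
    like_f = norm.pdf(x, loc=mu_f, scale=np.sqrt(sd_f ** 2 + sigma_x ** 2))
    return amp * like_c / (amp * like_c + (1.0 - amp) * like_f)

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(5.0, 3.0, 1800)])
sigma_x = np.full_like(x, 0.3)
p = membership_probabilities(x, sigma_x, amp=0.1, mu_c=0.0, sd_c=1.0, mu_f=5.0, sd_f=3.0)
# Each star contributes in proportion to its membership probability:
mu_c_hat = np.sum(p * x) / np.sum(p)
```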
|
610 |
Nuclear reactions inside the water molecule / Dicks, Jesse, 30 June 2005 (has links)
A scheme, analogous to the linear combination of atomic orbitals (LCAO), is used to calculate rates of reactions for the fusion of nuclei confined in molecules. As an example, the possibility of nuclear fusion in rotationally excited H2O molecules of angular momentum 1⁻ is estimated for the p + p + 16O → 18Ne*(4.522; 1⁻) nuclear transition. Due to a practically exact agreement of the energy of the Ne resonance and of the p + p + 16O threshold, the possibility of an enhanced transition probability is investigated. / Physics / M.Sc.
|