11

A Bayesian approach to initial model inference in cryo-electron microscopy

Joubert, Paul 04 March 2016 (has links)
A central application of single-particle analysis in cryo-electron microscopy is the characterization of the three-dimensional structure of macromolecular complexes. To this end, tens of thousands of images are used, each showing a noisy two-dimensional projection of the particle. In a first step, a low-resolution initial model is reconstructed and the unknown image orientations are estimated. This is a difficult inverse problem with many unknowns, including an unknown orientation for each projection image. A good initial model is crucial for the success of the subsequent refinement step. My thesis introduces two new algorithms for reconstructing an initial model in cryo-electron microscopy, both based on a coarse representation of the electron density. The two main contributions of my work are, first, the model used to represent the electron density and, second, the new reconstruction algorithms. The first main contribution is the use of Gaussian mixture models to represent electron densities in the reconstruction step. I use spherical mixture components with unknown positions, extents and weights. This representation has many advantages over the grid-based electron densities commonly used by other reconstruction algorithms. For example, it requires far fewer parameters, which leads to faster and more robust algorithms. The second main contribution is the development of Markov chain Monte Carlo methods within a Bayesian framework for estimating the model parameters. The first algorithm can be derived from the Gibbs sampler that fits Gaussian mixture models to point clouds; it is extended here so that it also works with images, projections, and unknown rotations and translations. The second algorithm takes a different approach: the forward model now assumes Gaussian errors, and sampling algorithms such as Hamiltonian Monte Carlo (HMC) make it possible to estimate the positions of the mixture components and the image orientations. The thesis presents extensive numerical experiments with simulated and real data that test the proposed algorithms in practice and compare them with other reconstruction methods.
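One reason the mixture representation is convenient: the projection of a spherical 3D Gaussian along any viewing direction is again a spherical 2D Gaussian, so a noiseless projection image of the whole model can be evaluated in closed form. The following minimal numpy sketch illustrates this; it is not the author's code, and `project_mixture` and the toy particle are invented for illustration.

```python
import numpy as np

def project_mixture(centers, sigmas, weights, R, grid):
    """Project a 3D mixture of spherical Gaussians onto the xy-plane after
    rotating by R: each 3D component projects to a 2D Gaussian with the
    same width, centred at the rotated centre's xy-coordinates."""
    rotated = centers @ R.T                       # rotate component centres
    image = np.zeros(grid.shape[:2])
    for mu, s, w in zip(rotated, sigmas, weights):
        d2 = ((grid - mu[:2]) ** 2).sum(axis=-1)  # squared distance in image plane
        image += w * np.exp(-0.5 * d2 / s**2) / (2 * np.pi * s**2)
    return image

# Toy "particle" of three components, viewed after a rotation about z
rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 3))
xs = np.linspace(-3, 3, 64)
grid = np.stack(np.meshgrid(xs, xs, indexing="ij"), axis=-1)  # (64, 64, 2)
theta = rng.uniform(0, 2 * np.pi)
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
img = project_mixture(centers, np.full(3, 0.5), np.full(3, 1/3), R, grid)
```

Because only component positions, widths and weights enter the computation, the parameter count is independent of any voxel-grid resolution, which is the efficiency advantage the abstract refers to.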
12

An application of Bayesian Hidden Markov Models to explore traffic flow conditions in an urban area

Andersson, Lovisa January 2019 (has links)
This study employs Bayesian Hidden Markov Models to explore vehicle traffic flow conditions in an urban area in Stockholm, based on sensor data from separate road positions. Inter-arrival times are used as the observed sequences and are assumed to be generated from the distributions of four different (and hidden) traffic flow states: nightly free flow, free flow, mixture and congestion. The filtered and smoothed probability distributions of the hidden states and the most probable state sequences are obtained using the forward, forward-backward and Viterbi algorithms. The No-U-Turn sampler is used to sample from the posterior distributions of all unknown parameters. The results show satisfactorily that Hidden Markov Models can detect different traffic flow conditions. Some of the models had problems with divergence but still produced satisfactory results; in fact, two of the models that converged seemed to overestimate the presence of congested traffic, while the models that did not converge gave adequate estimates of the probability of being in a congested state. Since the interest of this study lies in estimating the current traffic flow condition, and not in parameter inference, the choice of Bayesian Hidden Markov Models is satisfactory. Due to the unsupervised nature of the problem, it is difficult to evaluate the accuracy of the results. However, a model with simulated data and known states was also implemented, which resulted in high classification accuracy. This indicates that Hidden Markov Models are a good choice for estimating traffic flow conditions.
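For readers unfamiliar with the filtering step mentioned above, the sketch below illustrates the forward algorithm on a toy version of the setting: exponential emission densities for inter-arrival times and four hidden states. The rates and transition probabilities are made up for illustration; the thesis's actual models and priors are not reproduced here.

```python
import numpy as np
from scipy.special import logsumexp

def forward_filter(log_lik, log_A, log_pi):
    """Forward algorithm: filtered probabilities P(state_t | obs_1..t).
    log_lik[t, k] = log p(obs_t | state k); log_A = log transition matrix;
    log_pi = log initial state distribution."""
    T, K = log_lik.shape
    log_alpha = np.empty((T, K))
    log_alpha[0] = log_pi + log_lik[0]
    for t in range(1, T):
        # log-sum-exp over the previous state keeps the recursion stable
        log_alpha[t] = log_lik[t] + logsumexp(
            log_alpha[t - 1][:, None] + log_A, axis=0)
    return np.exp(log_alpha - logsumexp(log_alpha, axis=1, keepdims=True))

# Toy example: Exp(rate) inter-arrival times under four hypothetical states
rates = np.array([1.0, 0.2, 0.5, 0.1])          # invented state-specific rates
obs = np.array([0.8, 1.1, 6.0, 9.5, 0.4])       # observed inter-arrival times
log_lik = np.log(rates) - np.outer(obs, rates)  # Exp(rate) log-density
log_A = np.log(np.full((4, 4), 0.1) + 0.6 * np.eye(4))  # sticky transitions
log_pi = np.log(np.full(4, 0.25))
filtered = forward_filter(log_lik, log_A, log_pi)
```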
13

Bayesian inference for compact binary sources of gravitational waves

Bouffanais, Yann 11 October 2017 (has links)
The first detection of gravitational waves in 2015 opened a new window for the study of the astrophysics of compact binaries. Thanks to the data taken by the ground-based detectors Advanced LIGO and Advanced Virgo, it is now possible to constrain the physical parameters of compact binaries using a full Bayesian analysis and thereby increase our physical knowledge of these systems. However, such analyses require efficient algorithms, both to search for the signals and to estimate their parameters. The main part of this thesis was dedicated to the implementation of a Hamiltonian Monte Carlo (HMC) algorithm suited to parameter estimation of gravitational waves emitted by compact binaries composed of neutron stars. The algorithm was tested on a selection of sources and produced better performance than other MCMC methods such as Metropolis-Hastings and Differential Evolution Monte Carlo. The implementation of the HMC algorithm in the data analysis pipelines of the LIGO/Virgo collaboration could greatly increase the efficiency of parameter estimation. It could also drastically reduce the associated computation time, which will be of particular interest in the near future when many detections are expected from the ground-based network of gravitational wave detectors. Another aspect of this work was dedicated to a search algorithm for gravitational wave signals emitted by monochromatic compact binaries as observed by the future space-based detector LISA. The developed algorithm is a mixture of several evolutionary algorithms, including Particle Swarm Optimisation. It was tested on several test cases and was able to find all the sources buried in a given signal. Furthermore, it was able to find sources over a frequency band as large as 1 mHz, which had not been achieved at the time of this study.
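The abstract does not spell out the sampler's internals, but a generic HMC transition with a leapfrog integrator, the basic building block of any such algorithm, can be sketched as follows. This uses a stand-in Gaussian target; the step size, trajectory length and waveform likelihood used in the thesis are not reproduced here.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, eps=0.1, n_leapfrog=20, rng=np.random):
    """One Hamiltonian Monte Carlo transition using a leapfrog integrator."""
    p = rng.standard_normal(q.shape)               # resample momentum
    q_new, p_new = q.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_prob(q_new)      # initial half step (momentum)
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new                       # full step (position)
        p_new += eps * grad_log_prob(q_new)        # full step (momentum)
    q_new += eps * p_new
    p_new += 0.5 * eps * grad_log_prob(q_new)      # final half step (momentum)
    # Metropolis accept/reject on the joint (position, momentum) energy
    dH = (log_prob(q_new) - 0.5 * p_new @ p_new) - (log_prob(q) - 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < dH else q

# Stand-in target: a 2D standard normal
log_prob = lambda q: -0.5 * q @ q
grad_log_prob = lambda q: -q
q = np.zeros(2)
samples = []
for _ in range(1000):
    q = hmc_step(q, log_prob, grad_log_prob)
    samples.append(q)
```

The gradient information is what lets HMC make long, informed moves through high-dimensional posteriors, which is why it can outperform random-walk Metropolis-Hastings on problems like this one.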
14

Detecting influential observations in spatial models using Bregman divergence

Danilevicz, Ian Meneghel 26 February 2018 (has links)
How can one evaluate whether a spatial model is well adjusted to a problem? How can one choose the best model within the class of conditional autoregressive (CAR) and simultaneous autoregressive (SAR) models, in both homoscedastic and heteroscedastic cases? To answer these questions within the Bayesian framework, we propose new ways to apply the Bregman divergence, as well as recent information criteria such as the widely applicable information criterion (WAIC) and leave-one-out cross-validation (LOO). The functional Bregman divergence is a generalization of the well-known Kullback-Leibler (KL) divergence, and many of its special cases can be used to identify influential points. All posterior distributions presented in this text were estimated by Hamiltonian Monte Carlo (HMC), an optimized version of the Metropolis-Hastings algorithm. All the ideas presented here were evaluated by both simulation and real data.
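To make the connection between the Bregman divergence and KL concrete, the following illustrative sketch checks numerically that the Bregman divergence generated by the negative entropy reduces to the KL divergence between probability vectors. It is a generic textbook identity, not code from the thesis.

```python
import numpy as np

def bregman(F, grad_F, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - grad_F(q) @ (p - q)

# With the negative-entropy generator F(p) = sum_i p_i log p_i, the Bregman
# divergence between probability vectors equals the KL divergence.
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])
kl = np.sum(p * np.log(p / q))
assert np.isclose(bregman(neg_entropy, grad_neg_entropy, p, q), kl)
```

Other choices of generator F yield other divergences (for example, the squared Euclidean distance from F(p) = ||p||²), which is what makes the family useful for probing influence from several angles.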
16

Evaluation of Probabilistic Programming Frameworks

Munkby, Carl January 2022 (has links)
In recent years significant progress has been made in the area of probabilistic programming, contributing to a considerably easier workflow for quantitative research in many fields. However, as new Probabilistic Programming Frameworks (PPFs) are continuously being created and developed, there is a need for ways of evaluating and benchmarking these frameworks. To this end, this thesis explored a range of evaluation measures to assess and better understand the performance of three PPFs: Stan, NumPyro and TensorFlow Probability (TFP). Their respective Hamiltonian Monte Carlo (HMC) samplers were benchmarked on three different hierarchical models using both centered and non-centered parametrizations. The results showed that even when the same inference algorithms were used, the PPFs' samplers still exhibited different behaviours, which consequently led to non-negligible differences in their statistical efficiency. The sampling behaviour of the PPFs indicated that the observed differences can possibly be attributed to how the warm-up phase used in HMC sampling is constructed. Finally, the study concludes that the computational speed of the underlying numerical library was the primary deciding factor of performance in this benchmark, demonstrated by NumPyro's superior computational speed, which yielded up to 10x higher ESSmin/s than Stan and 4x higher ESSmin/s than TFP.
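The benchmark metric ESSmin/s (minimum effective sample size across parameters, per second of sampling) can be sketched as below. This uses a simple autocorrelation-based ESS estimator for illustration; it is not the estimator used in the thesis.

```python
import numpy as np

def ess(x):
    """Effective sample size of a 1D chain, truncating the autocorrelation
    sum at the first negative lag (a common, simple estimator)."""
    n = len(x)
    x = x - x.mean()
    # autocorrelation at lags 0..n-1, each normalised by its term count
    acf = np.correlate(x, x, mode="full")[n - 1:] / (np.arange(n, 0, -1) * x.var())
    tau = 1.0                        # integrated autocorrelation time
    for k in range(1, n):
        if acf[k] < 0:
            break
        tau += 2 * acf[k]
    return n / tau

def ess_min_per_second(chains, seconds):
    """ESSmin/s: the smallest per-parameter ESS divided by wall-clock time."""
    return min(ess(c) for c in chains) / seconds
```

Normalising by wall time is what lets the metric compare frameworks fairly: a sampler that mixes slightly worse per iteration can still win if its numerical backend evaluates the model much faster.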
17

Parameter Recovery for the Four-Parameter Unidimensional Binary IRT Model: A Comparison of Marginal Maximum Likelihood and Markov Chain Monte Carlo Approaches

Do, Hoan 26 May 2021 (has links)
No description available.
18

Constrained Gaussian Process Regression Applied to the Swaption Cube

Deleplace, Adrien January 2021 (has links)
This document is a Master's thesis report in financial mathematics at KTH, the product of an internship conducted at Nexialog Consulting in Paris. It describes the use of constrained Gaussian process regression to build an arbitrage-free swaption cube. The methodology introduced in the document is applied to a data set of out-of-the-money European swaptions.
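As a rough illustration of the unconstrained baseline on which such a method builds, a plain GP regression posterior with an RBF kernel can be written as follows. The no-arbitrage constraints (for example, monotonicity and convexity of prices in strike) that distinguish the constrained approach are beyond this sketch, and all kernel hyperparameters are placeholders.

```python
import numpy as np

def gp_posterior(X, y, Xs, lengthscale=1.0, sigma_f=1.0, sigma_n=1e-3):
    """Posterior mean and pointwise variance of GP regression with an RBF
    kernel; an unconstrained baseline, before any no-arbitrage constraints."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sigma_f**2 * np.exp(-0.5 * d2 / lengthscale**2)

    K = k(X, X) + sigma_n**2 * np.eye(len(X))   # noisy training covariance
    Ks = k(Xs, X)                               # test/train cross-covariance
    mean = Ks @ np.linalg.solve(K, y)
    cov = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Hypothetical usage: X holds (strike, maturity, tenor) triples of quoted
# swaptions, y their implied volatilities, Xs the grid points to fill in.
```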
19

Hamiltonian Monte Carlo methods in nonparametric Bayesian inference of extreme values

Hartmann, Marcelo 09 March 2015 (has links)
In this work we propose a Bayesian nonparametric approach for modeling extreme value data. We treat the location parameter μ of the generalized extreme value distribution as a random function following a Gaussian process model (Rasmussen & Williams 2006). This configuration leads to no closed-form expression for the high-dimensional posterior distribution. To tackle this problem we use the Riemannian Manifold Hamiltonian Monte Carlo algorithm, which allows sampling from posterior distributions with complex form and unusual correlation structure (Calderhead & Girolami 2011). Moreover, we propose an autoregressive time series model of order p, assuming the generalized extreme value distribution for the noise, and derive its Fisher information matrix. Throughout this work we employ computational simulation studies to assess the performance of the algorithm in its variants and show many examples with simulated and real data sets.
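The building block of the posterior described above is the GEV log-likelihood with an observation-specific location parameter. A minimal sketch follows; the GP prior on μ and the Riemannian-manifold HMC updates are not reproduced, and the function name is illustrative.

```python
import numpy as np

def gev_loglik(x, mu, sigma, xi):
    """Log-likelihood of the generalized extreme value distribution.
    mu may be a vector (one location per observation), matching a
    GP-distributed location parameter."""
    z = (x - mu) / sigma
    if abs(xi) < 1e-8:                       # Gumbel limit as xi -> 0
        return np.sum(-np.log(sigma) - z - np.exp(-z))
    t = 1.0 + xi * z
    if np.any(t <= 0):                       # observation outside the support
        return -np.inf
    return np.sum(-np.log(sigma) - (1.0 + 1.0 / xi) * np.log(t) - t ** (-1.0 / xi))
```

Placing a GP prior on the vector μ couples the observations, which is what produces the high-dimensional, strongly correlated posterior that motivates the Riemannian-manifold sampler.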
