1

A Bayesian approach to identifying and interpreting regional convergence clubs in Europe

Fischer, Manfred M.; LeSage, James P. (PDF)
This study suggests a two-step approach to identifying and interpreting regional convergence clubs in Europe. The first step involves identifying the number and composition of clubs using a space-time panel data model for annual income growth rates in conjunction with Bayesian model comparison methods. A second step uses a Bayesian space-time panel data model to assess how changes in the initial endowments of variables (that explain growth) impact regional income levels over time. These dynamic trajectories of changes in regional income levels over time allow us to draw inferences regarding the timing and magnitude of regional income responses to changes in the initial conditions for the clubs that have been identified in the first step. This is in contrast to conventional practice that involves setting the number of clubs ex ante, selecting the composition of the potential convergence clubs according to some a priori criterion (such as initial per capita income thresholds for example), and using cross-sectional growth regressions for estimation and interpretation purposes. (authors' abstract)
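The first step of the procedure above hinges on Bayesian model comparison: each candidate number of clubs defines a model, and posterior model probabilities computed from log marginal likelihoods select among them. A minimal sketch of that selection step, with hypothetical log-marginal values standing in for the output of the authors' space-time panel model:

```python
import numpy as np

def model_posteriors(log_marginals, prior=None):
    """Turn log marginal likelihoods of candidate models into
    posterior model probabilities (uniform model prior by default)."""
    log_m = np.asarray(log_marginals, dtype=float)
    if prior is None:
        prior = np.full(log_m.shape, 1.0 / log_m.size)
    log_post = log_m + np.log(prior)
    log_post -= log_post.max()      # stabilise the exponentials
    post = np.exp(log_post)
    return post / post.sum()

# Hypothetical log marginal likelihoods for models with 1..4 clubs.
probs = model_posteriors([-1520.3, -1498.7, -1499.1, -1505.0])
# The 2-club model receives the highest posterior probability here.
```

The subtraction of the maximum before exponentiating is the standard log-sum-exp trick; without it, log marginals of this magnitude would underflow to zero.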
2

Detection and characterisation of exoplanets: development and operation of the nulling interferometry testbed Nulltimate and design of automated software for the ranking of transit candidates detected by CoRoT

Demangeon, Olivier. 28 June 2013
Of all exoplanet detection methods, transit photometry has grown fastest in recent years thanks to the space telescopes CoRoT (launched in 2006) and Kepler (2009). These two satellites have detected thousands of potentially planetary transits. Given their number and the effort required to confirm their nature, it is essential to rank them from the photometric data alone, efficiently identifying the most promising candidates within a reasonable time. For my thesis, I developed a fast, automated software tool called BART (Bayesian Analysis for the Ranking of Transits) that produces such a ranking by estimating the probability that each transit is of planetary nature. To do so, the tool relies on the Bayesian probabilistic framework and explores the free-parameter space with Markov chain Monte Carlo (MCMC) methods.

Once exoplanets have been detected, the next step is to characterise them. The study of the solar system has demonstrated, if demonstration were needed, that spectral information is key to understanding the physics and history of a planet. Nulling interferometry is a very promising technique that could make such characterisation possible. For my thesis, I worked on the Nulltimate optical bench to study the feasibility of some of the technological requirements associated with this technique. Beyond achieving a nulling ratio of 3.7×10⁻⁵ in monochromatic light and 6.3×10⁻⁴ in polychromatic light in the near infrared, as well as a stability of σ_N(30 ms) = 3.7×10⁻⁵ estimated over 1 hour, my work clarified the situation through a detailed error budget, a Gaussian-beam-optics simulation of the bench's transmission, and a complete overhaul of the computer control system. All of this finally allowed the weaknesses of Nulltimate to be identified.
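The core of such a ranking tool is posterior exploration by MCMC. The following is a minimal, hypothetical sketch (not BART itself) of a Metropolis sampler estimating a single transit depth from a toy box-shaped light curve with Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy light curve: flux 1.0 out of transit, dip of depth 0.01 in transit.
n, depth_true, sigma = 200, 0.01, 0.002
in_transit = np.zeros(n, dtype=bool)
in_transit[80:120] = True
flux = 1.0 - depth_true * in_transit + rng.normal(0.0, sigma, n)

def log_post(depth):
    """Gaussian log-likelihood plus a flat prior restricted to depth >= 0."""
    if depth < 0:
        return -np.inf
    model = 1.0 - depth * in_transit
    return -0.5 * np.sum((flux - model) ** 2) / sigma**2

# Minimal Metropolis sampler over the one free parameter.
samples, depth = [], 0.0
lp = log_post(depth)
for _ in range(5000):
    prop = depth + rng.normal(0.0, 0.001)      # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept with min(1, ratio)
        depth, lp = prop, lp_prop
    samples.append(depth)
post = np.array(samples[1000:])                # discard burn-in
```

A real tool like BART would sample many correlated parameters (epoch, duration, limb darkening, ...) and compare planetary against false-positive models, but the accept/reject core is the same.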
3

Comparative Analysis of Behavioral Models for Adaptive Learning in Changing Environments

Marković, Dimitrije; Kiebel, Stefan J. 16 January 2017 (PDF)
Probabilistic models of decision making under various forms of uncertainty have been applied in recent years to numerous behavioral and model-based fMRI studies. These studies were highly successful in enabling a better understanding of behavior and delineating the functional properties of brain areas involved in decision making under uncertainty. However, as different studies considered different models of decision making under uncertainty, it is unclear which of these computational models provides the best account of the observed behavioral and neuroimaging data. This is an important issue, as not performing model comparison may tempt researchers to over-interpret results based on a single model. Here we describe how in practice one can compare different behavioral models and test the accuracy of model comparison and parameter estimation of Bayesian and maximum-likelihood-based methods. We focus our analysis on two well-established hierarchical probabilistic models that aim at capturing the evolution of beliefs in changing environments: Hierarchical Gaussian Filters and Change Point Models. To our knowledge, these two well-established models have never been compared on the same data. We demonstrate, using simulated behavioral experiments, that one can accurately disambiguate between these two models, and accurately infer free model parameters and hidden belief trajectories (e.g., posterior expectations, posterior uncertainties, and prediction errors) even when using noisy and highly correlated behavioral measurements. Importantly, we found several advantages of Bayesian inference and Bayesian model comparison compared to often-used maximum-likelihood schemes combined with the Bayesian Information Criterion. These results stress the relevance of Bayesian data analysis for model-based neuroimaging studies that investigate human decision making under uncertainty.
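The maximum-likelihood-plus-BIC scheme that the abstract contrasts with full Bayesian model comparison can be sketched in a few lines. The fitted log-likelihoods and parameter counts below are hypothetical, with model names borrowed from the abstract:

```python
import numpy as np

def bic(max_loglik, n_params, n_obs):
    """Bayesian Information Criterion: lower values indicate a better
    trade-off between fit and model complexity."""
    return n_params * np.log(n_obs) - 2.0 * max_loglik

# Hypothetical fits of two candidate models to the same n_obs trials.
n_obs = 400
bic_hgf = bic(max_loglik=-210.0, n_params=3, n_obs=n_obs)  # Hierarchical Gaussian Filter
bic_cpm = bic(max_loglik=-214.5, n_params=2, n_obs=n_obs)  # Change Point Model
preferred = "HGF" if bic_hgf < bic_cpm else "CPM"
```

BIC penalizes each extra parameter by log(n_obs), which only approximates the marginal likelihood; the abstract's point is that exact Bayesian model evidence can discriminate more reliably, especially with correlated, noisy measurements.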
