41

General Adaptive Monte Carlo Bayesian Image Denoising

Zhang, Wen January 2010 (has links)
Image noise reduction, or denoising, is an active area of research, although many of the techniques cited in the literature mainly target additive white noise. With an emphasis on signal-dependent noise, this thesis presents the General Adaptive Monte Carlo Bayesian Image Denoising (GAMBID) algorithm, a model-free approach based on random sampling. Testing is conducted on synthetic images with two different signal-dependent noise types as well as on real synthetic aperture radar and ultrasound images. Results show that GAMBID can achieve state-of-the-art performance, but it suffers from some limitations in dealing with textures and fine low-contrast features. These aspects can be addressed in future iterations when GAMBID is expanded to become a versatile denoising framework.
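As a rough sketch of the random-sampling Bayesian estimation that GAMBID builds on (not the GAMBID algorithm itself), the snippet below estimates one clean pixel value by importance sampling under an assumed multiplicative, signal-dependent noise model; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_posterior_mean(y, prior_samples, noise_sigma=0.1):
    """Monte Carlo posterior-mean estimate of a clean pixel value x from a
    noisy observation y, under an assumed multiplicative (speckle-like)
    noise model: y = x * (1 + noise_sigma * eps), eps ~ N(0, 1).

    prior_samples: candidate draws of x (e.g., intensities gathered from
    similar regions elsewhere in the image)."""
    # Gaussian likelihood of y given each candidate x; the noise standard
    # deviation scales with the signal level x.
    std = noise_sigma * np.maximum(prior_samples, 1e-8)
    lik = np.exp(-0.5 * ((y - prior_samples) / std) ** 2) / std
    w = lik / lik.sum()                      # normalized importance weights
    return float(np.sum(w * prior_samples))  # posterior mean E[x | y]

# Toy usage: a flat prior over intensities and one noisy observation.
x_true = 0.8
y_obs = x_true * (1 + 0.1 * rng.standard_normal())
candidates = rng.uniform(0.0, 1.0, size=5000)
print(mc_posterior_mean(y_obs, candidates))
```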
42

Empirical Evaluation of DSGE Models for Emerging Countries

Garcia Cicco, Javier January 2009 (has links)
This dissertation is a collection of three essays that evaluate the empirical performance of dynamic stochastic general equilibrium (DSGE) models in explaining macroeconomic dynamics in emerging countries.

Chapter 1, which is joint work with M. Uribe and R. Pancrazzi, investigates the hypothesis that a real business cycle model driven by permanent and transitory productivity shocks can explain observed business-cycle fluctuations in emerging countries well. The model is estimated using more than a century of Argentine data.

In Chapter 2, a comprehensive real DSGE model of an emerging country is estimated using Bayesian techniques, expanding the data set used in Chapter 1. The goal is to characterize the relative relevance of ten business-cycle drivers: three sectoral technology shocks, embodied and disembodied non-stationary technology, terms of trade, the world interest rate, trade policy, government expenditures, and the country premium.

Finally, Chapter 3 estimates, using Mexican data, a DSGE model of an emerging country containing many frictions that, as has recently been argued, impose non-trivial constraints on monetary-policy design. In particular, the framework features a sectoral decomposition of the productive sector, intermediate inputs, imperfect pass-through, an endogenous premium on financing capital accumulation, a liability-dollarization problem, currency substitution, price and wage rigidities, and dynamics driven by eleven shocks. / Dissertation
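The Bayesian estimation running through these chapters typically relies on MCMC sampling of a posterior. As a minimal, self-contained illustration (not the dissertation's model), the sketch below estimates the persistence of a transitory AR(1) productivity process by random-walk Metropolis with a flat prior; all values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a transitory productivity process a_t = rho * a_{t-1} + eps_t,
# a stand-in for the shock processes estimated in the dissertation.
rho_true, sigma, T = 0.9, 0.02, 200
a = np.zeros(T)
for t in range(1, T):
    a[t] = rho_true * a[t - 1] + sigma * rng.standard_normal()

def log_posterior(rho):
    if not (-0.999 < rho < 0.999):   # flat prior on the stationary region
        return -np.inf
    resid = a[1:] - rho * a[:-1]
    return -0.5 * np.sum(resid ** 2) / sigma ** 2  # Gaussian log-likelihood

# Random-walk Metropolis sampler.
draws, rho_cur, lp_cur = [], 0.5, log_posterior(0.5)
for _ in range(20000):
    rho_prop = rho_cur + 0.02 * rng.standard_normal()
    lp_prop = log_posterior(rho_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:   # accept/reject step
        rho_cur, lp_cur = rho_prop, lp_prop
    draws.append(rho_cur)

print("posterior mean of rho:", np.mean(draws[5000:]))  # discard burn-in
```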
43

Bayesian Inference in ANOVA Models

Ozbozkurt, Pelin 01 January 2010 (has links) (PDF)
Estimation of location and scale parameters from a random sample of size n is of paramount importance in statistics. An estimator is called fully efficient if it attains the Cramer-Rao minimum variance bound besides being unbiased. The method that yields such estimators, at any rate for large n, is the method of modified maximum likelihood estimation. Apparently, such estimators cannot be made more efficient by using sample-based classical methods. That makes room for the Bayesian method of estimation, which engages prior distributions and likelihood functions. A formal combination of the prior knowledge and the sample information is called the posterior distribution. The posterior distribution is maximized with respect to the unknown parameter(s); that gives the HPD (highest probability density) estimator(s). Locating the maximum of the posterior distribution is, however, enormously difficult (computationally and analytically) in most situations. To alleviate these difficulties, we use the modified likelihood function in the posterior distribution instead of the likelihood function. We derive the HPD estimators of the location and scale parameters of distributions in the generalized logistic family. We extend the work to experimental design, namely one-way ANOVA, and obtain the HPD estimators of the block effects and of the scale parameter (in the distribution of errors); they have beautiful algebraic forms, and we show that they are highly efficient. We give real-life examples to illustrate the usefulness of our results. Thus, the enormous computational and analytical difficulties with the traditional Bayesian method of estimation are circumvented, at any rate in the context of experimental design.
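A minimal numerical illustration of HPD estimation (using the full likelihood rather than the modified likelihood the thesis develops): the posterior for the location parameter of a logistic sample, a member of the generalized logistic family, is maximized directly. The prior and all values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Sample from a standard logistic distribution (generalized logistic
# with shape parameter b = 1), true location 3.0, scale fixed at 1.
x = rng.logistic(loc=3.0, scale=1.0, size=50)

def neg_log_posterior(mu):
    z = x - mu
    # Logistic log-likelihood: log f(z) = -z - 2*log(1 + exp(-z))
    loglik = np.sum(-z - 2 * np.log1p(np.exp(-z)))
    # Weakly informative normal prior on the location, N(0, 10^2).
    logprior = -0.5 * (mu / 10.0) ** 2
    return -(loglik + logprior)

# The HPD estimate is the maximizer of the posterior density.
res = minimize_scalar(neg_log_posterior, bounds=(-10, 20), method="bounded")
print("HPD estimate of location:", res.x)
```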
44

Role of Investment Shocks in Explaining Business Cycles in Turkey

Yuksel, Canan 01 February 2012 (has links) (PDF)
This thesis aims to understand the sources of business cycles observed in the Turkish economy. In particular, it investigates the role of investment shocks in explaining fluctuations in output. For this purpose, a small open economy DSGE model is estimated on Turkish data for the 2002-2011 period by Bayesian methods. Variance decomposition analysis shows that the permanent technology shock is the key driving force of business cycles in the Turkish economy, while the role of investment shocks is less pronounced.
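As a toy illustration of variance decomposition (unrelated to the estimated model's actual shares), the sketch below splits the variance of simulated output growth between a persistent technology component and an i.i.d. investment-shock component; all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Output growth g_t driven by a persistent technology component z_t
# (AR(1)) and an i.i.d. investment-shock component v_t.
rho, sig_z, sig_v = 0.8, 0.01, 0.02
T = 100_000

z = np.zeros(T)
for t in range(1, T):
    z[t] = rho * z[t - 1] + sig_z * rng.standard_normal()
v = sig_v * rng.standard_normal(T)
g = z + v  # output growth: sum of the two components

# Simulated variance shares; analytically var(z) = sig_z**2 / (1 - rho**2).
var_z, var_v = np.var(z), np.var(v)
print("technology share :", var_z / (var_z + var_v))
print("investment share :", var_v / (var_z + var_v))
```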
45

Three Essays on Credit Risk Models and Their Bayesian Estimation

Kwon, Tae Yeon 24 July 2012 (has links)
This dissertation consists of three essays on credit risk models and their Bayesian estimation. In each essay, default or default-correlation models are built under one of two main streams.

In the first essay, sequential estimation of the hidden asset value and estimation of the model parameters are implemented under the Black-Cox model. To capture short-term autocorrelation in the stock market, we assume that market noise follows a mean-reverting process. Two Bayesian methods are applied: the particle filter algorithm for sequential estimation of the asset value, and the generalized Gibbs sampling method for model parameter estimation. The first simulation study shows that sequential hidden asset value estimation using both option prices and equity prices is more efficient and accurate than estimation using equity prices alone. The second simulation study shows that, by applying the generalized Gibbs sampling method, model parameters can be successfully estimated under a model setting with no closed-form solution. In an empirical analysis of eight companies, half of them Dow Jones 30 companies and the other half non-Dow Jones 30 companies, the stock market noise for the firms with more liquid stock is estimated to have smaller volatility in the market noise processes.

In the second essay, the frailty idea described in Duffie, Eckner, Horel, and Saita (2009) is expanded to industry-specific terms. The MCEM algorithm is used to estimate parameters and random-effect processes under the condition of unknown hidden paths and analytically difficult likelihood functions. The estimates used in the study are based on U.S. public firms between 1990 and 2008. By introducing industry-specific hidden factors and assuming that they are random effects, a comparison is made of the relative scale of within- and between-industry correlations. A comparison study is also developed among a without-hidden-factor model, a common-hidden-factor model, and our industry-specific common-factor model. The empirical results show that an industry-specific common factor is necessary for adjusting over- or under-estimation of default probabilities and of observed common-factor effects.

The third essay combines and extends the work of the first two by proposing a common model frame for both structural and intensity credit risk models. The common model frame combines the merits of several default-correlation studies that were independently developed under each model setting. Following the work of Duffie, Eckner, Horel, and Saita (2009), we apply not only observed common factors but also an unobserved hidden factor to explain correlated defaults. Bayesian techniques are used for estimation, and generalized Gibbs sampling and Metropolis-Hastings (MH) algorithms are developed. Going beyond a simple combination of the two model approaches (structural and intensity models), we relax the assumption made in previous studies of equal factor effects across all firms, instead adopting a random-coefficients model. A further novelty of the approach is that CDS and equity prices are used together for estimation. A simulation study shows that posterior convergence is improved by adding CDS prices to the estimation. Empirical results based on daily data for the 125 companies comprising CDS.NA.IG13 in 2009 support the necessity of such relaxations of the assumptions in previous studies.

To demonstrate potential practical applications of the proposed framework, we derive the posterior distribution of CDX tranche prices. Our correlated structural model successfully predicts all the CDX tranche prices, but the results of our correlated intensity model suggest the need for further modification of the model. / Statistics
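The particle filter mentioned in the first essay can be illustrated with a minimal bootstrap filter. The sketch below tracks a hidden random-walk log-asset value from noisy equity observations; it is a simplified stand-in for the Black-Cox setting (no barrier, no mean-reverting noise), with all dynamics and parameters assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Latent log-asset value follows a random walk with drift; the observed
# log-equity price is the latent value plus market noise.
T, N = 100, 2000
mu, sig_v, sig_obs = 0.0005, 0.01, 0.02

v = np.cumsum(mu + sig_v * rng.standard_normal(T))   # true hidden path
y = v + sig_obs * rng.standard_normal(T)             # observations

# Bootstrap particle filter: propagate, weight, estimate, resample.
particles = y[0] + 0.05 * rng.standard_normal(N)
est = []
for t in range(T):
    if t > 0:  # propagate through the state equation
        particles = particles + mu + sig_v * rng.standard_normal(N)
    # Weight by the observation likelihood, then resample.
    w = np.exp(-0.5 * ((y[t] - particles) / sig_obs) ** 2)
    w /= w.sum()
    est.append(np.sum(w * particles))                # filtered mean
    particles = rng.choice(particles, size=N, p=w)

rmse = np.sqrt(np.mean((np.array(est) - v) ** 2))
print("RMSE of filtered asset value:", rmse)
```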
46

Deriving Consensus Ratings of the Big Three Rating Agencies

Grün, Bettina, Hofmarcher, Paul, Hornik, Kurt, Leitner, Christoph, Pichler, Stefan January 2010 (has links) (PDF)
This paper introduces a model framework for dynamic credit rating processes. Our framework aggregates ordinal rating information stemming from a variety of rating sources. The dynamics of the consensus rating capture systematic as well as idiosyncratic changes. In addition, our framework allows us to validate the different rating sources by analyzing the mean/variance structure of the rating errors. In an empirical study of the iTraxx Europe companies rated by the big three external rating agencies, we use Bayesian techniques to estimate the consensus ratings for these companies. The advantages are illustrated by comparing our dynamic rating model to a benchmark model. (author's abstract) / Series: Research Report Series / Department of Statistics and Mathematics
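Under strong simplifying assumptions (a static latent score rather than the paper's dynamic process, known agency error variances, ratings already mapped to a numeric scale), a consensus rating reduces to a precision-weighted average; the sketch below shows this conjugate-normal special case with invented numbers.

```python
import numpy as np

# Each agency reports a noisy version of a latent creditworthiness score.
ratings = np.array([7.0, 8.0, 6.5])   # three agencies, numeric rating scale
tau2 = np.array([0.5, 1.0, 0.8])      # assumed per-agency error variances
mu0, s02 = 7.0, 4.0                   # normal prior mean/variance of score

# Posterior of the latent score: precision-weighted average of prior
# mean and agency reports.
prec = 1.0 / s02 + np.sum(1.0 / tau2)
post_mean = (mu0 / s02 + np.sum(ratings / tau2)) / prec
print("consensus score:", post_mean, "posterior variance:", 1.0 / prec)
```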
47

Does Elderly Employment have an Impact on Youth Employment? A General Equilibrium Approach

Stiassny, Alfred, Uhl, Christina 07 1900 (has links) (PDF)
Does an increase in elderly employment cause a decline in youth employment? A simplified view of a demand-driven economy would answer yes. Econometric studies based on a single-equation approach deliver little support for this belief; however, these studies typically suffer from identification problems that in most cases receive no attention. We therefore use a general equilibrium framework when trying to quantify these effects. Using yearly and quarterly Austrian labor and GDP data, we estimate two model variants by Bayesian methods: a) a standard equilibrium model in which the degree of complementarity between old, young, and prime-age labor is crucial for the sign and strength of the relevant effects, and b) a simple, solely demand-driven model, which always leads to a crowding out of the young through an increase in employment of the old. It turns out that the demand-driven model fits the data worse than the standard model. Further, the degree of complementarity is estimated to be strong enough to produce a small positive effect of elderly employment on youth employment. (authors' abstract) / Series: Department of Economics Working Paper Series
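A small sketch of the mechanism in variant a): in a CES aggregate of young, prime-age, and old labor, the substitution parameter governs how strongly an increase in old employment raises the marginal product of young labor. The functional form, weights, and numbers are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# CES production over young (Ly), prime-age (Lp), and old (Lo) labor:
#   Y = (a_y*Ly^r + a_p*Lp^r + a_o*Lo^r)^(1/r),
# with elasticity of substitution 1/(1 - r).
def ces(Ly, Lp, Lo, r, a=(0.2, 0.6, 0.2)):
    return (a[0] * Ly**r + a[1] * Lp**r + a[2] * Lo**r) ** (1.0 / r)

def mp_young(Ly, Lp, Lo, r, h=1e-6):
    # Marginal product of young labor by finite differences.
    return (ces(Ly + h, Lp, Lo, r) - ces(Ly, Lp, Lo, r)) / h

for r in (0.5, -2.0):  # weak vs. strong complementarity
    base = mp_young(1.0, 3.0, 1.0, r)
    more_old = mp_young(1.0, 3.0, 1.2, r)  # raise old employment by 20%
    print(f"r = {r}: marginal product of young labor changes "
          f"{100 * (more_old / base - 1):+.2f}%")
```

Running this shows a much larger boost to the marginal product of young labor under strong complementarity (r = -2) than under weak complementarity (r = 0.5), which is the channel through which elderly employment can raise rather than crowd out youth employment.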
48

Analysis of Gas Mixtures from Gas Chromatography Signals

Bertholon, Francois 23 September 2016 (has links)
Devices dedicated to gas analysis have many applications, notably quality control of air and of natural or industrial gases, and the study of respiratory gases. LETI is working in collaboration with the company APIX on a new generation of microsystem devices that combine sample preparation, separation of the gaseous components by microchromatography devices integrated on silicon, and transduction by NEMS (Nano Electro Mechanical Systems) detectors based on nanocantilevers used as so-called gravimetric detectors. These NEMS sensors consist of vibrating nanobeams whose resonance frequency depends on the mass of material deposited on the beam. The beams are functionalized with a material capable of adsorbing and then desorbing certain targeted components. When a pulse of material in the chromatographic column passes a NEMS, the signal, defined by the instantaneous resonance frequency as a function of time, varies. The peak observed in this signal reflects the peak of material in the column and allows the component's concentration to be estimated. Starting from the signals measured on all the sensors, the objective of the signal processing is to provide the molecular profile, that is, the concentration of each component, and to estimate the associated higher heating value (PCS). The challenge is to combine high sensitivity, to detect very small quantities of components, with efficient separation capabilities, to cope with the complexity of the mixture and identify the signatures of the targeted molecules. The objective of this signal-processing thesis is to formalize the signal-analysis problems at hand and to extend our analysis methodology, based on an inverse-problem approach combined with a hierarchical model of the analysis chain, to gas microchromatography devices integrating multiple NEMS sensors. The first targeted application is monitoring the heating value of a natural gas. The main innovations concern the decomposition of NEMS signals, multi-sensor fusion, self-calibration, and dynamic temporal tracking. The work includes experimental phases carried out in particular on the gas-analysis bench of the joint APIX-LETI laboratory.

Chromatography is a chemical technique for separating the components of a mixture. This thesis focuses on gaseous mixtures, and in particular on the signal processing of gas chromatography signals. Whatever sensor is used to acquire these signals, we obtain a succession of peaks, where each peak corresponds to a component present in the mixture. Our aim is then to analyze gaseous mixtures from the acquired signals by characterizing each peak. After a bibliographic survey of chromatography, we chose the Giddings and Eyring distribution to describe the peak shape. This distribution defines the probability that a molecule walking randomly through the chromatographic column exits at a given time. We then propose an analytical model of the chromatographic signal, which corresponds to a peak-shape mixture model; as a first approximation, this model is treated as a Gaussian mixture model. To process these signals, we studied two broad families of methods, applied to both real and simulated data. The first family consists of Bayesian estimation of the unknown parameters of our model. The order of the mixture model, which corresponds to the number of components in the gaseous mixture, can be included among the unknown parameters. To estimate these parameters, we use a Gibbs sampler with Markov chain Monte Carlo sampling, or a variational approach. The second family consists of a sparse representation of the signal over a dictionary containing a large set of peak shapes; sparsity is then characterized by the number of dictionary atoms needed to describe the signal. Finally, we propose a sparse Bayesian method.
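A minimal illustration of the sparse-dictionary idea, with Gaussian atoms as the stated first approximation to the Giddings-Eyring peak shape, and non-negative least squares standing in for the thesis's sparse Bayesian method; the signal, retention times, and threshold are invented.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)

# Simulated chromatogram: two Gaussian peaks plus noise.
t = np.linspace(0, 10, 500)
def peak(center, width=0.15):
    return np.exp(-0.5 * ((t - center) / width) ** 2)

signal = 1.0 * peak(3.0) + 0.6 * peak(6.5) \
         + 0.01 * rng.standard_normal(t.size)

# Dictionary of candidate peaks at many retention times; a non-negative
# least-squares fit concentrates weight on a sparse set of active atoms.
centers = np.linspace(0.5, 9.5, 91)
D = np.column_stack([peak(c) for c in centers])
coef, _ = nnls(D, signal)

active = coef > 0.05  # keep atoms with non-negligible amplitude
print("estimated retention times:", centers[active])
print("estimated amplitudes     :", coef[active])
```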
49

Informative Prior Distributions in Multilevel/Hierarchical Linear Growth Models: Demonstrating the Use of Bayesian Updating for Fixed Effects

Schaper, Andrew 29 September 2014 (has links)
This study demonstrates a fully Bayesian approach to multilevel/hierarchical linear growth modeling using freely available software. Further, the study incorporates informative prior distributions for fixed-effect estimates using an objective approach: previous sample results are used to form the prior distributions included in subsequent samples' analyses, a process referred to as Bayesian updating. A method for model checking is also outlined based on fit indices, including information criteria (i.e., Akaike information criterion, Bayesian information criterion, and deviance information criterion) and approximate Bayes factor calculations. For this demonstration, five distinct samples of schools in the process of implementing School-Wide Positive Behavior Interventions and Supports (SWPBIS), collected from 2008 to 2013, were used, with the school as the unit of analysis. First, within-year SWPBIS fidelity growth was modeled as a function of time measured in months from the initial measurement occasion. Uninformative priors were used to estimate growth parameters for the 2008-09 sample, and both uninformative priors and informative priors based on previous years' samples were used to model data from the 2009-10, 2010-11, 2011-12, and 2012-13 samples. Bayesian estimates were also compared to maximum likelihood (ML) estimates, and reliability information is provided. Second, three additional examples demonstrated how to include predictors in the growth model: (a) one school-level predictor (years implementing) of SWPBIS fidelity growth, (b) several school-level predictors (relative socio-economic status, size, and geographic location), and (c) school and district predictors (sustainability factors hypothesized to be related to implementation processes) in a three-level growth model. Interestingly, Bayesian models estimated with informative prior distributions in all cases resulted in better fit indices than models estimated with uninformative prior distributions.
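The Bayesian updating step can be made concrete with a conjugate toy model: the posterior for a fixed effect from one year's sample becomes the prior for the next year's analysis. This is a sketch of the general idea, not the study's multilevel model; the variance is assumed known and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

def update_normal(mu0, s02, y, sigma2):
    """Conjugate normal update for a mean parameter with known variance:
    combine the prior N(mu0, s02) with sample y to get the posterior."""
    n = len(y)
    post_var = 1.0 / (1.0 / s02 + n / sigma2)
    post_mean = post_var * (mu0 / s02 + np.sum(y) / sigma2)
    return post_mean, post_var

# Toy fixed effect: a monthly fidelity growth rate, five yearly cohorts.
true_slope, sigma2 = 2.0, 4.0
mu, s2 = 0.0, 100.0   # start from a deliberately uninformative prior
for year in range(5):
    y = true_slope + np.sqrt(sigma2) * rng.standard_normal(60)
    mu, s2 = update_normal(mu, s2, y, sigma2)  # posterior -> next prior
    print(f"cohort {year + 1}: posterior mean {mu:.3f}, "
          f"sd {np.sqrt(s2):.3f}")
```

The printout shows the posterior standard deviation shrinking with each cohort, which is exactly the benefit of carrying informative priors forward across samples.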
50

The Effects of News Shocks and Bounded Rationality on Macroeconomic Volatility

Dombeck, Brian 06 September 2017 (has links)
This dissertation studies the impact that embedding boundedly rational agents in real business cycle-type news-shock models may have on a variety of model predictions, from simulated moments to structural parameter estimates. In particular, I analyze the qualitative and quantitative effects of assuming agents are boundedly rational in a class of DSGE models that attempt to explain the observed volatility and comovements in key aggregate measures of U.S. economic performance as the result of endogenous responses to information in the form of "news shocks". The first chapter explores the theoretical feasibility of relaxing the rational expectations hypothesis in a three-sector real business cycle (RBC) model that generates boom-bust cycles as a result of periods of optimism and pessimism on the part of households. The second chapter determines whether agents forming linear forecasts of shadow prices in a nonlinear framework can behave approximately consistently with fully informed individuals in a one-sector real business cycle model. The third chapter analyzes whether empirical estimates of the relative importance of anticipated shocks may be biased by assuming rational expectations. By merging the two hitherto separate but complementary strands of literature on bounded rationality and news shocks, I am able to conduct an in-depth analysis of the importance of both the information agents have and what they choose to do with it. At its core, the study of news in macroeconomics is a study of the specific role alternative information sets play in generating macroeconomic volatility. Adaptive learning, on the other hand, is concerned with the behavior of agents given an information set. Taken together, these fields jointly describe the input and the "black box" that produce model predictions from DSGE models. While previous research has examined the effects of bounded rationality or news shocks in isolation, this dissertation marks the first set of research explicitly focused on the interaction of these two model features.
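Adaptive learning of the kind contrasted here with rational expectations is often implemented as constant-gain recursive least squares. The sketch below shows an agent learning the coefficient of a perceived law of motion from data generated by the true process; the process, gain, and initial beliefs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# The agent forecasts x_{t+1} with a perceived law of motion
# x_{t+1} = b * x_t and revises b by constant-gain recursive least
# squares. True data-generating process: x_{t+1} = 0.9 * x_t + noise.
rho_true, gain, T = 0.9, 0.02, 5000
b, R, x = 0.0, 0.01, 0.1   # initial belief, moment estimate, state

for _ in range(T):
    x_next = rho_true * x + 0.05 * rng.standard_normal()
    R += gain * (x * x - R)                 # track E[x^2] with constant gain
    b += gain * (x / R) * (x_next - b * x)  # move belief toward the data
    x = x_next

print("learned coefficient:", round(b, 3), "(true value 0.9)")
```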
