81. Tomography of the Earth by Geo-Neutrino Emission. Leonardo Estêvão Schendes Tavares, 05 August 2014.
Geo-neutrinos are electron anti-neutrinos originating from the beta decays of a few elements in the decay chains of $^{232}$Th and $^{238}$U present in the Earth's interior. Recent experimental measurements of these particles have generated great expectations for a new way of directly investigating the interior of the planet. It is a new multidisciplinary area, which may in the near future bring considerable clues about the Earth's thermal dynamics and formation processes. In this work, we construct an inferential model based on the multigrid priors method to deal, in a generic way, with the geo-neutrino source reconstruction problem. It is an inverse problem: given a region of space V and a finite, small number of measurements of the potential generated on the surface of V by some charge distribution $\rho$, we try to infer $\rho$. We present examples of applications and analyses of the method in two- and three-dimensional models, and we also comment on how other a priori information may be included. Furthermore, we indicate the steps for inferring the best locations for future detectors, with the objective of maximizing the amount of information obtainable from experimental measurements. To this end we resort to an entropic method of inference that may be applied directly to the results of the multigrid method.
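A minimal sketch of the inverse problem just described, on a toy 2D domain: a few noisy measurements of the potential on the boundary of V are used to sample a posterior over a discretized charge distribution. The grid geometry, the 1/r kernel, the exponential prior, and the plain Metropolis sampler are illustrative assumptions, not the thesis' multigrid-priors implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Coarse 4x4 grid of candidate source cells inside the unit square V.
n = 4
xs, ys = np.meshgrid(np.linspace(0.125, 0.875, n), np.linspace(0.125, 0.875, n))
cells = np.column_stack([xs.ravel(), ys.ravel()])

# A small number of detectors on the boundary of V.
t = np.linspace(0.0, 1.0, 8, endpoint=False)
dets = np.concatenate([np.column_stack([t, np.zeros(8)]),
                       np.column_stack([t, np.ones(8)])])

def potential(rho):
    """Potential at each detector from grid charges rho (1/r kernel)."""
    d = np.linalg.norm(dets[:, None, :] - cells[None, :, :], axis=2)
    return (rho[None, :] / d).sum(axis=1)

# Synthetic "true" distribution: one hot cell; noisy measurements.
rho_true = np.zeros(n * n)
rho_true[5] = 1.0
sigma = 0.05
v_obs = potential(rho_true) + sigma * rng.normal(size=len(dets))

def log_post(rho):
    if np.any(rho < 0):                 # prior: charges are non-negative
        return -np.inf
    resid = v_obs - potential(rho)
    # Gaussian likelihood plus an exponential (sparsity) prior on charges.
    return -0.5 * np.sum(resid**2) / sigma**2 - rho.sum()

# Random-walk Metropolis over the coarse grid.
rho = np.full(n * n, 0.1)
lp = log_post(rho)
samples = []
for it in range(20000):
    prop = rho + 0.02 * rng.normal(size=rho.size)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        rho, lp = prop, lp_prop
    if it > 5000:
        samples.append(rho.copy())

print("posterior mean charge per cell:")
print(np.mean(samples, axis=0).reshape(n, n).round(2))
```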
82. Active and passive monetary and fiscal policies: an analysis for Brazil after the adoption of inflation targeting. André Francisco Nunes de Nunes, January 2009.
The lack of coordination between fiscal and monetary policy in Brazil has often been singled out as a cause of the macroeconomic imbalances the economy has faced over the last three decades. In the more recent, post-inflation-targeting period, several authors have pointed to active fiscal policy as a factor restricting the performance of monetary policy: the fiscal authority disregards the interference of the fiscal side in the monetary side, which reduces the effectiveness of monetary policy. Only under a passively conducted fiscal policy could monetary policy be more efficient. To test the hypothesis of active and/or passive policies, a DSGE model with price rigidity and monopolistic competition, based on Woodford (2003), was estimated for the Brazilian economy by Bayesian methods. In this model, the primary surplus and the nominal interest rate are the instruments of economic policy. The estimates point to a regime in which both fiscal and monetary policy were active from 2000Q1 to 2002Q4; in the later period, 2003Q1 to 2008Q4, fiscal policy was passive and monetary policy active.
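For intuition, active and passive policy can be characterized in the sense of Leeper (1991): monetary policy is active when the interest rate responds more than one-for-one to inflation. The sketch below estimates a simple Taylor-type rule by Bayesian linear regression on synthetic data and reports the posterior probability of an active regime; the rule, the data, and the normal posterior approximation are stand-in assumptions, not the paper's DSGE estimation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic quarterly data standing in for inflation and the policy rate.
T = 36
pi = 4.0 + rng.normal(0, 1.0, T)              # inflation, percent
rate = 2.0 + 1.5 * pi + rng.normal(0, 0.5, T)  # policy rate, "active" by design

# Taylor-type rule: rate_t = a + b * pi_t + e_t.
X = np.column_stack([np.ones(T), pi])
beta_hat, *_ = np.linalg.lstsq(X, rate, rcond=None)
resid = rate - X @ beta_hat
s2 = resid @ resid / (T - 2)
cov = s2 * np.linalg.inv(X.T @ X)

# Posterior draws for the inflation coefficient b (normal approximation
# under a vague prior).
draws = rng.multivariate_normal(beta_hat, cov, size=20000)[:, 1]
print(f"posterior mean of b: {draws.mean():.2f}")
print(f"P(b > 1 | data), i.e. 'active' monetary policy: {(draws > 1).mean():.3f}")
```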
83. Financial intermediation and real cycles: a DSGE approach for the Brazilian economy. Julio Alberto Campa Vega Filho, January 2014.
This paper presents two Dynamic Stochastic General Equilibrium models, Curdia and Woodford (2009) and De Graeve (2007), that allow us to identify mechanisms through which financial frictions can influence business cycles and domestic monetary policy. We extend the basic New Keynesian model to consider the role of financial intermediation in credit markets. Introducing a credit spread allows for a time-varying wedge between the interest rate available to households on their savings and the interest rate at which it is possible to borrow. These spreads are not constant over time, especially in periods of financial stress. Variations in financial conditions, indicated by increases or decreases in the size of credit spreads, have consequences both for the equilibrium relation between the policy rate and aggregate expenditure and for the relation between real activity and inflation.
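A toy numeric illustration of the wedge described above, with assumed coefficients that are not taken from either cited model: aggregate demand responds to the policy rate plus a pass-through of the credit spread, so a widening spread tightens financial conditions even with the policy rate unchanged.

```python
# Log-linearized IS-type relation with a credit-spread term (all values assumed).
sigma, phi = 1.0, 0.5              # assumed elasticity and spread pass-through
policy_rate = 0.02
for spread in (0.00, 0.01, 0.03):  # calm vs stressed credit conditions
    effective_rate = policy_rate + phi * spread
    demand_gap = -sigma * effective_rate   # deviation from steady state
    print(f"spread={spread:.2%}  effective rate={effective_rate:.2%}  "
          f"demand gap={demand_gap:+.2%}")
```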
84. The role of greenhouse gases in past climatic variations: an approach based on accurate chronologies of deep polar ice cores. Jai Chowdhry Beeman, 21 October 2019.
Deep polar ice cores contain records of both past climate and trapped air that reflects past atmospheric compositions, notably of greenhouse gases. This record allows us to investigate the role of greenhouse gases in climate variations over eight glacial-interglacial cycles, the equivalent of more than 800,000 years. The ice core record, like all paleoclimate records, contains uncertainties associated both with the relationships between proxies and climate variables, and with the chronologies of the records contained in the ice and trapped air bubbles. In this thesis, we develop a framework, based on Bayesian inverse modeling and the evaluation of complex probability densities, to treat uncertainty in the ice core paleoclimate record accurately. Using this framework, we develop two studies: the first on Antarctic temperature and CO2 during the last deglaciation, and the second developing a Bayesian synchronization method for ice cores.
In the first study, we use inverse modeling to identify the probabilities of piecewise linear fits to CO2 and to a stack of Antarctic temperature records from five ice cores, along with the individual temperature records from each core, over the last deglacial warming, known as Termination 1. Using the nodes, or change points, in the piecewise linear fits accepted during stochastic sampling of the posterior probability density, we discuss the timings of millennial-scale changes in trend in the series and calculate the phasings between coherent changes. We find that the phasing between Antarctic temperature and CO2 likely varied, though the response times remain within a range of about 500 years of synchrony, both between events during the deglaciation and across the individual ice core records. This result indicates both regional-scale complexity and modulations or variations in the mechanisms linking Antarctic temperature and CO2 across the deglaciation. In the second study, we develop a Bayesian method to synchronize ice cores using corresponding time series in the IceChrono inverse chronological model. Tests show that this method can accurately synchronize CH4 series and can incorporate external chronological observations and prior information about the glaciological characteristics of the coring site. The method is continuous and objective, bringing a new degree of accuracy and precision to the use of synchronization in ice core chronologies.
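A minimal sketch of the change-point idea used in the first study, reduced to a single change point in the slope of one synthetic series (the thesis fits multi-node piecewise functions to real proxy records): random-walk Metropolis sampling of the node location yields a posterior interval for the timing of the change in trend.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "proxy" series: slope change at t = 60 (e.g., onset of warming).
t = np.arange(100.0)
y_true = np.where(t < 60, 0.02 * t, 0.02 * 60 + 0.10 * (t - 60))
y = y_true + rng.normal(0, 0.15, t.size)

def piecewise(params):
    a, b1, b2, tc = params
    return np.where(t < tc, a + b1 * t, a + b1 * tc + b2 * (t - tc))

def log_post(params):
    tc = params[3]
    if not (t[1] < tc < t[-2]):       # uniform prior on an interior change point
        return -np.inf
    resid = y - piecewise(params)
    return -0.5 * np.sum(resid**2) / 0.15**2

params = np.array([0.0, 0.01, 0.05, 50.0])
lp = log_post(params)
step = np.array([0.02, 0.002, 0.002, 2.0])
tc_samples = []
for it in range(40000):
    prop = params + step * rng.normal(size=4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        params, lp = prop, lp_prop
    if it > 10000:
        tc_samples.append(params[3])

q = np.percentile(tc_samples, [5, 50, 95])
print(f"change-point posterior: median={q[1]:.1f}, 90% interval=({q[0]:.1f}, {q[2]:.1f})")
```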
85. Forecasting Electric Load Demand through Advanced Statistical Techniques. Jesús Silva, Alexa Senior Naveda, Jesús García Guliany, William Niebles Núñez, Hugo Hernández Palma, 07 January 2020.
Traditional forecasting models have been widely used for decision-making in production, finance and energy. Such is the case of the ARIMA models, developed in the 1970s by George Box and Gwilym Jenkins [1], which model a series in terms of its own past, according to its autocorrelation. This work compares advanced statistical methods for determining the demand for electricity in Colombia, including SARIMA, econometric and Bayesian methods.
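As a hedged illustration of one of the methods named, the sketch below fits a seasonal ARIMA model to a synthetic monthly load series with statsmodels and produces a 12-step forecast; the series and the (1,1,1)x(1,1,1,12) order are assumptions for illustration, not the paper's specification.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)

# Synthetic monthly electric load: trend + annual seasonality + noise.
months = np.arange(120)
load = 100 + 0.3 * months + 10 * np.sin(2 * np.pi * months / 12) \
       + rng.normal(0, 2, months.size)

model = SARIMAX(load, order=(1, 1, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)

forecast = fit.forecast(steps=12)   # next 12 months
print("AIC:", round(fit.aic, 1))
print("12-month forecast:", np.round(forecast, 1))
```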
86. Toward Error-Statistical Principles of Evidence in Statistical Inference. Nicole Mee-Hyaang Jinn, 02 June 2014.
The context for this research is statistical inference, the process of making predictions or inferences about a population from observation and analysis of a sample. In this context, many researchers want to grasp what inferences can be made that are valid, in the sense of being able to be upheld or justified by argument or evidence. Another pressing question among users of statistical methods is: how can spurious relationships be distinguished from genuine ones? Underlying both of these issues is the concept of evidence. In response to these (and similar) questions, the two questions I work on in this essay are: (1) what is a genuine principle of evidence? and (2) do error probabilities have more than a long-run role? Concisely, I propose that felicitous genuine principles of evidence should provide concrete guidelines on precisely how to examine error probabilities, with respect to a test's aptitude for unmasking pertinent errors, which leads to establishing sound interpretations of results from statistical techniques. The starting point for my definition of genuine principles of evidence is Allan Birnbaum's confidence concept, an attempt to control misleading interpretations. However, Birnbaum's confidence concept is inadequate for interpreting statistical evidence, because using only pre-data error probabilities would not pick up on a test's ability to detect a discrepancy of interest (e.g., "even if the discrepancy exists") with respect to the actual outcome. Instead, I argue that Deborah Mayo's severity assessment is the most suitable characterization of evidence based on my definition of genuine principles of evidence. (Master of Arts)
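A small numeric illustration of Mayo's severity assessment for the standard one-sided Normal test (H0: mu <= mu0 vs H1: mu > mu0, sigma known): after observing a sample mean, the severity of the claim "mu > mu1" is the probability of a result less extreme than the one observed if mu were exactly mu1. The numbers are illustrative assumptions.

```python
from scipy.stats import norm

mu0, sigma, n = 0.0, 1.0, 25
xbar = 0.4                      # observed sample mean
se = sigma / n**0.5

for mu1 in (0.0, 0.1, 0.2, 0.3, 0.4):
    sev = norm.cdf((xbar - mu1) / se)   # P(Xbar <= xbar; mu = mu1)
    print(f"SEV(mu > {mu1:.1f}) = {sev:.3f}")
```

High severity for "mu > 0.0" here coexists with low severity for "mu > 0.4": the same outcome is good evidence for a small discrepancy but poor evidence for a large one, which is the post-data role of error probabilities argued for above.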
87. Investigating Molecular Evolution of Rhodopsin Using Likelihood/Bayesian Phylogenetic Methods. Jingjing Du, 22 July 2010.
Rhodopsin, a visual pigment protein found in retinal photoreceptors, mediates vision at low light levels. Recent studies, focusing primarily on human and mouse, have challenged the assumption of neutral evolution of synonymous substitutions in mammals. Using recently developed likelihood-based codon models accounting for mutational bias and selection, we find significant evidence for selective constraint on synonymous substitutions in mammalian rhodopsins, and a preference for cytosine at third codon positions. A second project investigated adaptive evolution in rhodopsin in view of theories of nocturnality in early mammals. We detected a significant acceleration of non-synonymous substitution rates at the origin of therian mammals, and a tendency of synonymous substitutions towards C-ending codons prior to that. These findings suggest an evolutionary scenario in which synonymous substitutions that increase mRNA stability and/or translation efficiency may have preceded adaptive non-synonymous evolution in early mammalian rhodopsins. These findings have important implications for theories of early mammalian nocturnality.
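As a simple illustration of the codon-position statistic behind the reported cytosine preference, the snippet below tallies base usage at third codon positions; the sequence is a made-up fragment, not rhodopsin data.

```python
from collections import Counter

seq = "ATGGCCTACGTCCTGAACCTGGCCATCGCCGACCTC"  # hypothetical coding fragment

# Split into codons and count the base at each third position.
codons = [seq[i:i + 3] for i in range(0, len(seq) - len(seq) % 3, 3)]
third = Counter(c[2] for c in codons)
total = sum(third.values())

for base in "ACGT":
    print(f"{base} at 3rd codon position: {third[base] / total:.2f}")
```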
88. Approximation of the posterior distribution of a hierarchical mixed-effects Gamma-Poisson model. Annick Joëlle Nembot Simo, 01 1900.
We propose a method for analysing count (Poisson) data based on the procedure called Poisson Regression Interactive Multilevel Modeling (PRIMM) introduced by Christiansen and Morris (1997). The Poisson regression in the PRIMM method has fixed effects only, whereas our model also incorporates random effects. As in Christiansen and Morris (1997), the model aims at doing inference based on adequate analytical approximations of the posterior distributions of the parameters, which avoids the use of computationally expensive methods such as Markov chain Monte Carlo (MCMC). The approximations are based on Laplace's method and the asymptotic theory of the normal approximation to posterior distributions. Estimates of the Poisson mixed-effects regression parameters are obtained by maximizing their joint posterior density via the Newton-Raphson algorithm. The study also provides the first two posterior moments of the Poisson parameters involved, whose posterior distributions are approximately gamma. Applications to two datasets show that our model can to some extent be considered a generalization of the PRIMM method, since it applies both to unstratified and to stratified Poisson data; in the latter case it includes not only fixed effects but also random effects associated with the strata. Finally, the model is applied to data on several types of adverse events recorded among the participants of a clinical trial of a quadrivalent vaccine against measles, mumps, rubella and varicella. The Poisson regression includes the fixed effect corresponding to the treatment/control covariate as well as random effects associated with the biological systems of the human body to which the adverse events are attributed.
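A hedged sketch of the computational core described above, omitting the hierarchical structure: Newton-Raphson maximization of a Poisson-regression log-posterior under a Gaussian prior, followed by a Laplace (normal) approximation of the posterior from the Hessian at the mode. The data and prior variance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic count data: y ~ Poisson(exp(X beta)).
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))

tau2 = 10.0                      # assumed Gaussian prior variance on beta

beta = np.zeros(2)
for _ in range(25):              # Newton-Raphson on the log-posterior
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu) - beta / tau2            # score of log-posterior
    hess = -(X * mu[:, None]).T @ X - np.eye(2) / tau2
    step = np.linalg.solve(hess, grad)
    beta = beta - step
    if np.max(np.abs(step)) < 1e-10:
        break

cov = np.linalg.inv(-hess)       # Laplace: posterior approx N(beta_mode, cov)
print("posterior mode:", beta.round(3))
print("posterior sd  :", np.sqrt(np.diag(cov)).round(3))
```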
89. Essays on asset pricing and the macroeconomy. Martin Kliem, 02 September 2009.
This thesis consists of three self-contained essays that investigate the interaction of asset prices and financial markets with the macroeconomy. All three papers extend the existing literature in order to enhance the understanding of the strong cross-linking between financial markets and the rest of the economy. In particular, the thesis focuses on habitually formed preferences and Bayesian techniques to yield theoretical and empirical insights that help reduce the existing gap between the asset pricing and macroeconomic literatures. The first essay examines the ability of habitually formed preferences to explain the cross section of asset returns in comparison with successful factor models. Such consumption-based asset pricing models are built on micro-founded preferences, implying a link between individual and aggregate behavior. For this reason, the essay uses a Bayesian approach with a priori information derived from the empirical business cycle literature. In the second essay, which is joint work with Harald Uhlig, we use Bayesian techniques to estimate a DSGE model. In particular, we explore a way to include conditional second moments of asset returns in the estimation, and we constrain the estimation by a priori probabilities on the Sharpe ratio and the Frisch elasticity. The estimated model can then jointly explain key business cycle facts, the different volatilities of several asset returns, and the empirically observed equity premium.
The third essay presents a DSGE model that covers the observed co-movements of the stock market boom and bust episodes of the 1980s and 1990s and the economy. By including non-separable preferences and nominal rigidities, the model explains the simultaneous rise of consumption, output, investment, hours worked, and wages during a boom and the subsequent bust. Finally, the role of monetary policy during stock market booms is discussed, and optimal monetary policy rules are evaluated.
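A hedged sketch of the estimation idea in the second essay: augmenting a log-posterior with a prior on a model-implied statistic (here, a Sharpe ratio) so that parameter draws implying implausible values are penalized. The stand-in mapping from the parameter to the Sharpe ratio, and all numbers, are assumptions; a real DSGE would solve the model at each draw.

```python
import numpy as np

rng = np.random.default_rng(5)

def implied_sharpe(gamma):
    """Stand-in mapping from a risk-aversion parameter to a model-implied
    Sharpe ratio (a real DSGE would solve the model here)."""
    return 0.05 * gamma

def log_posterior(gamma, data_loglik=0.0):
    if gamma <= 0:
        return -np.inf
    lp = data_loglik - 0.5 * ((gamma - 2.0) / 1.0) ** 2  # prior on gamma
    sr = implied_sharpe(gamma)
    lp += -0.5 * ((sr - 0.25) / 0.05) ** 2  # prior on the implied Sharpe ratio
    return lp

# Random-walk Metropolis over gamma.
gamma, lp = 2.0, log_posterior(2.0)
draws = []
for _ in range(20000):
    prop = gamma + 0.2 * rng.normal()
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        gamma, lp = prop, lp_prop
    draws.append(gamma)

post_mean = np.mean(draws[5000:])
print(f"posterior mean gamma: {post_mean:.2f}")
print(f"implied Sharpe ratio: {implied_sharpe(post_mean):.3f}")
```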
90. A continuous space-time model for the price of real estate launches in the city of São Paulo. Vitor Dias Rocio, 15 June 2018.
In this work, a continuous spatio-temporal model for real estate prices in the city of São Paulo is estimated using Bayesian methods. We decompose the series into trend and cycle and incorporate a set of explanatory variables and spatial random effects projected onto the continuum. The model introduces a new method for analyzing the price formation of real estate launches. Our hedonic model considers, besides the intrinsic characteristics of each property, the characteristics of the neighborhood and the economic environment. With this model, we are able to observe equilibrium prices for the respective locations and obtain a clearer interpretation of the dynamics of real estate prices in São Paulo between January 2000 and December 2013.
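A hedged sketch of a hedonic specification in the spirit of the one described, not the thesis' continuous spatio-temporal model: log price is regressed on property characteristics plus low-order spatial terms standing in for the continuous spatial effect, using conjugate Bayesian linear regression on synthetic data.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 500
area = rng.uniform(40, 200, n)               # m^2
rooms = rng.integers(1, 5, n)
lon, lat = rng.uniform(0, 1, n), rng.uniform(0, 1, n)

# Synthetic log prices: characteristics plus a smooth spatial premium.
logp = 10 + 0.006 * area + 0.08 * rooms \
       + 0.5 * np.exp(-((lon - 0.5)**2 + (lat - 0.5)**2) / 0.1) \
       + rng.normal(0, 0.1, n)

# Design matrix: characteristics plus spatial polynomial terms standing in
# for the continuous spatial random effect.
X = np.column_stack([np.ones(n), area, rooms,
                     lon, lat, lon * lat, lon**2, lat**2])

# Conjugate Bayesian linear regression (Gaussian prior, known noise scale).
tau2, sigma2 = 100.0, 0.1**2
prec = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
mean = np.linalg.solve(prec, X.T @ logp / sigma2)

print("posterior mean effect of +1 m^2 on log price:", round(mean[1], 4))
print("posterior mean effect of +1 room on log price:", round(mean[2], 3))
```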