1.
Confronting Theory with Data: the Case of DSGE Modeling. Poudyal, Niraj, 07 December 2012.
The primary objective of this thesis is to confront the DSGE model (Ireland, 2011) with data in an attempt to evaluate its empirical adequacy. The perspective used for this evaluation is based on unveiling the statistical model (structural VAR) behind the DSGE model, with a view to testing its probabilistic assumptions vis-a-vis the data. It is shown that the implicit statistical model is seriously misspecified, and the information from mis-specification (M-S) testing is then used to respecify the original structural VAR in an attempt to achieve statistical adequacy. The latter provides a precondition for the reliability of any inference based on the statistical model. Once the statistical adequacy of the respecified model is secured through thorough M-S testing, inferential procedures such as the likelihood-ratio test for the overidentifying restrictions, forecasting, and impulse response analysis are applied to the original DSGE model to evaluate its empirical adequacy. Finally, the same inferential procedure is applied to the CAPM model. / Ph. D.
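The M-S testing step described in this abstract can be sketched in a few lines. The example below is only an illustration of the general idea, not the procedure used in the thesis: it fits a VAR to simulated placeholder series (the variable names and the data are assumptions) and checks three of the implicit probabilistic assumptions, namely residual normality, absence of residual autocorrelation, and absence of ARCH effects.

```python
# A minimal sketch of mis-specification (M-S) testing for a VAR's probabilistic
# assumptions. Simulated data stands in for the actual macro series; the column
# names are placeholders, not the series used in the thesis.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(0)
T = 200
data = pd.DataFrame(
    rng.standard_normal((T, 3)),
    columns=["output", "inflation", "interest_rate"],
)

res = VAR(data).fit(maxlags=2)

# Joint normality of the residuals (Jarque-Bera type test).
print(res.test_normality().summary())

# Residual autocorrelation up to 10 lags (Portmanteau / whiteness test).
print(res.test_whiteness(nlags=10).summary())

# Second-order dependence (ARCH effects) in each residual series.
resid = np.asarray(res.resid)
for i, name in enumerate(data.columns):
    lm_stat, lm_pval, _, _ = het_arch(resid[:, i], nlags=4)
    print(f"ARCH LM test for {name}: stat={lm_stat:.2f}, p-value={lm_pval:.3f}")
```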
2.
Essays on DSGE Models and Bayesian Estimation. Kim, Jae-yoon, 11 June 2018.
For an empirical analysis, the statistical model implied by the theoretical model is crucial. The statistical model is simply the set of probabilistic assumptions imposed on the data, and invalid probabilistic assumptions undermine the reliability of statistical inference, rendering the empirical analysis untrustworthy. Hence, to secure trustworthy evidence one should always validate the implicit statistical model before drawing any empirical result from a theoretical model. This perspective is used to shed light on a widely used category of macroeconometric models known as Dynamic Stochastic General Equilibrium (DSGE) models. Using U.S. time-series data, the paper demonstrates that a widely used econometric model for the U.S. economy is severely statistically misspecified: almost all of its probabilistic assumptions are invalid for the data. The paper proceeds to respecify the implicit statistical model behind the theoretical model with a view to securing its statistical adequacy (the validity of its probabilistic assumptions). Using the respecified statistical model, the paper calls into question the literature evaluating the theoretical adequacy of current DSGE models, which ignores the fact that such evaluations are untrustworthy because they are based on statistically unreliable procedures. / Ph. D.
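The likelihood-ratio test for overidentifying restrictions mentioned in the first abstract compares the fit of the DSGE-restricted model against the unrestricted, statistically adequate VAR. The sketch below illustrates only the mechanics of such a test; the log-likelihoods and parameter counts are invented placeholders, not figures from either thesis.

```python
# A minimal sketch of a likelihood-ratio test for overidentifying restrictions:
# the DSGE model restricts the parameters of the (statistically adequate) VAR,
# so the two log-likelihoods can be compared with a chi-square test.
# All numbers below are illustrative placeholders.
from scipy.stats import chi2

loglik_unrestricted = -512.4   # log-likelihood of the respecified (unrestricted) VAR
loglik_restricted = -534.9     # log-likelihood under the DSGE-implied restrictions
k_unrestricted = 39            # free parameters in the unrestricted model
k_restricted = 12              # free (deep) parameters under the DSGE restrictions

lr_stat = 2.0 * (loglik_unrestricted - loglik_restricted)
df = k_unrestricted - k_restricted
p_value = chi2.sf(lr_stat, df)

print(f"LR statistic = {lr_stat:.2f}, df = {df}, p-value = {p_value:.4f}")
# A small p-value indicates the overidentifying restrictions are rejected,
# i.e. the theoretical model is inconsistent with the statistically adequate VAR.
```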
3.
The Use of T-VAR Models in the Evaluation of the Credibility Impact on Monetary Policy. Lucena, Bernardo Araujo de, 17 April 2019.
This work conducts an empirical assessment of how monetary policy credibility influences the reactions of output and inflation to an increase in the interest rate. As part of the methodology, a credibility index was built from the deviation of expected inflation from its target for the period. This index was employed in a T-VAR model, together with series for the output gap, core inflation, the exchange rate, and the interest rate, to generate generalized impulse response functions through which the dynamics of the with- and without-credibility regimes could be compared. The main conclusions were that the presence of credibility lowers the cost of raising interest rates and strengthens the reduction of inflation in response to an interest rate increase.
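The regime-splitting idea behind the T-VAR analysis can be illustrated with a simplified sketch. The code below builds a credibility index from the deviation of expected inflation from its target, splits simulated data at a threshold, and compares impulse responses of inflation to an interest-rate shock across regimes. It uses ordinary rather than generalized impulse responses and fits each regime separately, which is a simplification of a proper T-VAR; all series, names, and the threshold choice are assumptions for illustration only.

```python
# A simplified sketch of a credibility-index / threshold-regime comparison.
# Simulated data and ordinary (not generalized) IRFs are used purely for
# illustration; a real T-VAR estimates the threshold and regime dynamics
# jointly rather than fitting two separate VARs on non-contiguous subsamples.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
T = 400
target = 4.0
expected_inflation = target + rng.normal(0, 1.5, T)   # placeholder expectations series

# Credibility index: 1 when expectations are on target, decaying toward 0
# as the deviation grows (one of many possible normalizations).
deviation = np.abs(expected_inflation - target)
credibility = 1.0 / (1.0 + deviation)

data = pd.DataFrame({
    "output_gap": rng.normal(0, 1, T),
    "core_inflation": rng.normal(4, 1, T),
    "exchange_rate": rng.normal(0, 1, T),
    "interest_rate": rng.normal(10, 1, T),
})

threshold = np.median(credibility)            # threshold chosen here for simplicity
high_cred = data[credibility >= threshold]
low_cred = data[credibility < threshold]

for label, sample in [("with credibility", high_cred), ("without credibility", low_cred)]:
    res = VAR(sample.reset_index(drop=True)).fit(maxlags=2)
    irf = res.irf(12)
    # Response of core inflation to an interest-rate shock, 12 periods ahead.
    resp = irf.irfs[:, sample.columns.get_loc("core_inflation"),
                    sample.columns.get_loc("interest_rate")]
    print(label, np.round(resp, 3))
```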
4.
Non-parametric inference of risk measures. Ahn, Jae Youn, 01 May 2012.
Responding to the changes in the insurance environment of the past decade, insurance regulators globally have been revamping valuation and capital regulations. This thesis is concerned with the design and analysis of statistical inference procedures that are used to implement these new and upcoming insurance regulations, and with their analysis in a more general setting, with a view to lending further insight into their performance in practical situations. The quantitative measure of risk used in these new and upcoming regulations is the risk measure known as the Tail Value-at-Risk (T-VaR). In implementing these regulations, insurance companies often have to estimate the T-VaR of product portfolios from the output of a simulation of their cash flows. The distributions for the underlying economic variables are either estimated or prescribed by regulations. In this situation, the computational complexity of estimating the T-VaR arises from the complexity of determining the portfolio cash flows for a given realization of the economic variables. A technique that has proved promising in such settings is importance sampling. While the asymptotic behavior of the natural non-parametric estimator of the T-VaR under importance sampling has been conjectured, the literature has lacked an honest result. The main goal of the first part of the thesis is to give a precise weak convergence result describing the asymptotic behavior of this estimator under importance sampling. Our method also establishes such a result for the natural non-parametric estimator of the Value-at-Risk, another popular risk measure, under weaker assumptions than those used in the literature. We also report on a simulation study conducted to examine the quality of these asymptotic approximations in small samples.
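A minimal sketch of the natural non-parametric T-VaR estimator under importance sampling is given below. The standard normal loss model and the mean-shifted proposal are illustrative assumptions chosen so that the estimate can be checked against a closed-form answer; they are not the simulation setting studied in the thesis.

```python
# A minimal sketch of the natural non-parametric estimator of the Tail
# Value-at-Risk (T-VaR) under importance sampling: losses are drawn from a
# tail-shifted proposal, re-weighted by the likelihood ratio, and the T-VaR is
# read off the weighted empirical distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha = 0.99          # confidence level
n = 100_000

# Target loss distribution f: Normal(0, 1).  Proposal g: Normal(2.5, 1),
# which oversamples the right tail where the T-VaR is determined.
mu_shift = 2.5
x = rng.normal(mu_shift, 1.0, n)
weights = stats.norm.pdf(x, 0.0, 1.0) / stats.norm.pdf(x, mu_shift, 1.0)

# Weighted empirical quantile (VaR estimate).
order = np.argsort(x)
x_sorted, w_sorted = x[order], weights[order]
cum_w = np.cumsum(w_sorted) / np.sum(w_sorted)
var_hat = x_sorted[np.searchsorted(cum_w, alpha)]

# T-VaR estimate: VaR plus the weighted average excess beyond it.
tail = x_sorted > var_hat
tvar_hat = var_hat + np.sum(w_sorted[tail] * (x_sorted[tail] - var_hat)) / ((1 - alpha) * np.sum(weights))

true_var = stats.norm.ppf(alpha)
true_tvar = stats.norm.pdf(true_var) / (1 - alpha)   # closed form for the standard normal
print(f"VaR:   estimate {var_hat:.3f}  vs true {true_var:.3f}")
print(f"T-VaR: estimate {tvar_hat:.3f}  vs true {true_tvar:.3f}")
```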
The Haezendonck-Goovaerts class of risk measures corresponds to a premium principle that is a multiplicative analog of the zero utility principle, and is thus of significant academic interest. From a practical point of view, our interest in this class of risk measures arose primarily from the fact that the T-VaR is, in a sense, a minimal member of the class. Hence, a study of the natural non-parametric estimator for these risk measures will lend further insight into statistical inference for the T-VaR. Analysis of the asymptotic behavior of the generalized estimator has proved elusive, largely due to the fact that, unlike the T-VaR, these risk measures lack a closed-form expression. Our main goal in the second part of this thesis is to study the asymptotic behavior of this estimator. In order to conduct a simulation study, we needed an efficient algorithm to compute the Haezendonck-Goovaerts risk measure with precise error bounds. The lack of such an algorithm has clearly been noticed in the literature and has impeded the quality of simulation results. In this part we also design and analyze an algorithm for computing these risk measures. In the process of doing so, we also derive some fundamental bounds on the solutions to the optimization problem underlying these risk measures. We have also implemented our algorithm in the R software environment and included its source code in the Appendix.
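The computation described above can be sketched as a nested optimization: for a Young function phi and level q, solve the defining equation E[phi((X - x)_+ / h)] = 1 - q for h at each candidate x, then minimize x + h(x). The power Young functions, the simulated Pareto losses, and the bracketing constants in the sketch below are illustrative assumptions and carry none of the error-bound analysis developed in the thesis.

```python
# A minimal sketch of computing an empirical Haezendonck-Goovaerts risk measure
# by nested root finding and minimization.  Illustrative only; not the thesis
# algorithm, which comes with precise error bounds.
import numpy as np
from scipy.optimize import brentq, minimize_scalar

def hg_risk_measure(sample, q, phi):
    sample = np.asarray(sample, dtype=float)

    def h_of_x(x):
        excess = np.maximum(sample - x, 0.0)
        f = lambda h: np.mean(phi(excess / h)) - (1.0 - q)
        if f(1e-12) <= 0.0:
            return 0.0                  # (almost) no exceedances: the constraint forces h -> 0
        return brentq(f, 1e-12, 1e8)    # f is strictly decreasing in h, so a single root

    objective = lambda x: x + h_of_x(x)
    res = minimize_scalar(objective, bounds=(sample.min(), sample.max()), method="bounded")
    return res.fun

rng = np.random.default_rng(3)
losses = rng.pareto(3.0, 50_000) + 1.0   # heavy-tailed illustrative losses

# phi(t) = t recovers the T-VaR; phi(t) = t**2 gives a more tail-sensitive measure.
print("T-VaR (phi(t)=t):        ", hg_risk_measure(losses, 0.95, lambda t: t))
print("H-G measure (phi(t)=t^2):", hg_risk_measure(losses, 0.95, lambda t: t**2))
```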