1

Confronting Theory with Data: the Case of DSGE Modeling

Poudyal, Niraj 07 December 2012 (has links)
The primary objective of this dissertation is to confront the DSGE model (Ireland, 2011) with data in an attempt to evaluate its empirical adequacy. The perspective used for this evaluation is based on unveiling the statistical model (structural VAR) behind the DSGE model, with a view to testing its probabilistic assumptions vis-à-vis the data. It is shown that the implicit statistical model is seriously misspecified, and the information from mis-specification (M-S) testing is then used to respecify the original structural VAR in an attempt to achieve statistical adequacy. The latter provides a precondition for the reliability of any inference based on the statistical model. Once the statistical adequacy of the respecified model is secured through thorough M-S testing, inferences such as the likelihood-ratio test of the overidentifying restrictions, forecasting, and impulse response analysis are applied to the original DSGE model to evaluate its empirical adequacy. Finally, the same inferential procedure is applied to the CAPM model. / Ph. D.
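Although the abstract does not spell out the test battery, a minimal sketch of the kind of mis-specification (M-S) testing it describes might look as follows: fit the unrestricted (statistical) VAR implied by the DSGE model and probe its residuals for departures from the assumed probabilistic structure. The variables, lag order, and specific tests below are illustrative assumptions, not the dissertation's actual setup.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.tsa.api import VAR
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)
# Placeholder series standing in for the observables of a small DSGE model.
data = pd.DataFrame(rng.standard_normal((200, 3)),
                    columns=["output_gap", "inflation", "interest_rate"])

var_fit = VAR(data).fit(maxlags=2)                      # unrestricted statistical VAR
resid = pd.DataFrame(var_fit.resid, columns=data.columns)

for col in resid.columns:
    lb_p = acorr_ljungbox(resid[col], lags=[4])["lb_pvalue"].iloc[0]  # residual autocorrelation
    jb_p = stats.jarque_bera(resid[col])[1]                           # residual non-Normality
    print(f"{col}: Ljung-Box p = {lb_p:.3f}, Jarque-Bera p = {jb_p:.3f}")
```

Rejections in checks like these are what would trigger the respecification step the abstract refers to.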
2

Macroeconomic Forecasting: Statistically Adequate, Temporal Principal Components

Dorazio, Brian Arthur 05 June 2023 (has links)
The main goal of this dissertation is to expand upon the use of Principal Component Analysis (PCA) in macroeconomic forecasting, particularly in cases where traditional principal components fail to account for all of the systematic information making up common macroeconomic and financial indicators. At the outset, PCA is viewed as a statistical model derived from the reparameterization of the Multivariate Normal model in Spanos (1986). To motivate a PCA forecasting framework prioritizing sound model assumptions, it is demonstrated through simulation experiments that model mis-specification erodes the reliability of inferences. The Vector Autoregressive (VAR) model at the center of these simulations allows for the Markov (temporal) dependence inherent in macroeconomic data and serves as the basis for extending conventional PCA. Stemming from the relationship between PCA and the VAR model, an operational out-of-sample forecasting methodology is prescribed incorporating statistically adequate, temporal principal components, i.e., principal components which capture not only Markov dependence but all of the other relevant information in the original series. The macroeconomic forecasts produced from applying this framework to several common macroeconomic indicators are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons. / Doctor of Philosophy / The landscape of macroeconomic forecasting and nowcasting has shifted drastically with the advent of big data. Armed with significant growth in computational power and data collection resources, economists have augmented their arsenal of statistical tools to include those which can produce reliable results in big data environments. At the forefront of such tools is Principal Component Analysis (PCA), a method which reduces a large number of predictors to a few factors containing the majority of the variation making up the original data series. This dissertation expands upon the use of PCA in the forecasting of key macroeconomic indicators, particularly in instances where traditional principal components fail to account for all of the systematic information comprising the data. Ultimately, a forecasting methodology which incorporates temporal principal components, ones capable of capturing both time dependence and the other relevant information in the original series, is established. In the final analysis, the methodology is applied to several common macroeconomic and financial indicators. The forecasts produced using this framework are shown to outperform standard benchmarks in terms of predictive accuracy over longer forecasting horizons.
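As a rough companion to the abstract, here is a diffusion-index style sketch of PCA-based forecasting: extract principal components from a standardized panel and use them to predict a target series several steps ahead. The dissertation's temporal principal components are constructed differently (they build in the VAR's Markov dependence); the panel, horizon, and regression below are illustrative assumptions only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
panel = rng.standard_normal((200, 30))                  # placeholder T x N macro panel
target = panel[:, 0] + 0.5 * rng.standard_normal(200)   # series to be forecast

standardized = (panel - panel.mean(axis=0)) / panel.std(axis=0)
factors = PCA(n_components=3).fit_transform(standardized)   # static principal components

h = 4                                                    # forecast horizon (e.g., quarters)
model = LinearRegression().fit(factors[:-h], target[h:])    # diffusion-index regression
forecast = model.predict(factors[-1:])                       # h-step-ahead point forecast
print(f"{h}-step-ahead forecast: {forecast[0]:.3f}")
```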
3

A Retrospective View of the Phillips Curve and Its Empirical Validity since the 1950s

Do, Hoang-Phuong 07 May 2021 (has links)
Since the 1960s, the Phillips curve has survived various significant changes (Kuhnian paradigm shifts) in macroeconomic theory and generated endless controversies. This dissertation revisits several important, representative papers throughout the curve's four historical, formative periods: Phillips' foundational paper in 1958, the wage determination literature in the 1960s, the expectations-augmented Phillips curve in the 1970s, and the latest New Keynesian iteration. The purpose is to provide a retrospective evaluation of the curve's empirical evidence. In each period, the preeminent role of theoretical considerations over statistical learning from the data is first explored. To further appraise the trustworthiness of the empirical evidence, a few key empirical models are then selected and evaluated for their statistical adequacy, which refers to the validity of the probabilistic assumptions comprising the statistical models. The evaluation results, using the historical (vintage) data in the first three periods and the modern data in the final one, show that nearly all of the models in the appraisal are misspecified: at least one probabilistic assumption is not valid. The statistically adequate models produced from the respecification with the same data suggest new understandings of the main variables' behaviors. The dissertation's findings cast doubt on the traditional narrative of the Phillips curve, which the representative papers played a crucial role in establishing. / Doctor of Philosophy / The empirical regularity of the Phillips curve, which captures the inverse relationship between the inflation and unemployment rates, has been widely debated in academic economic research and among policymakers over the last 60 years. To shed light on the debate, this dissertation examines a selected list of influential, representative studies from the Phillips curve's empirical history through its four formative periods. The examination of these papers is conducted as a blend of a discussion of the methodology of econometrics (the primary quantitative method in economics), the role of theory versus statistical learning from the observed data, and evaluations of the validity of the probabilistic assumptions behind the empirical models. The main contention is that any departure from the probabilistic assumptions produces unreliable statistical inference, rendering the empirical analysis untrustworthy. The evaluation results show that nearly all of the models in the appraisal are untrustworthy: at least one assumption is not valid. An attempt is then made to produce improved empirical models and, with them, new understandings. Overall, the dissertation's findings cast doubt on the traditional narrative of the Phillips curve, which the representative papers played a crucial role in establishing.
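To make the appraisal step concrete, a hedged sketch follows: estimate a simple expectations-augmented Phillips-curve regression and test its residuals for the probabilistic assumptions (no autocorrelation, homoskedasticity, Normality) whose failure would make the inference unreliable. The data, lag choices, and tests are placeholders, not the dissertation's actual specifications.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.diagnostic import acorr_breusch_godfrey, het_breuschpagan

rng = np.random.default_rng(2)
unemployment = 5.0 + rng.standard_normal(160)
inflation = 2.0 - 0.3 * unemployment + rng.standard_normal(160)

# Regress inflation on unemployment and lagged inflation (a crude "expectations" proxy).
X = sm.add_constant(np.column_stack([unemployment[1:], inflation[:-1]]))
y = inflation[1:]
ols = sm.OLS(y, X).fit()

print("Breusch-Godfrey p (residual autocorrelation):", acorr_breusch_godfrey(ols, nlags=4)[1])
print("Breusch-Pagan p (heteroskedasticity):", het_breuschpagan(ols.resid, ols.model.exog)[1])
print("Jarque-Bera p (non-Normality):", stats.jarque_bera(ols.resid)[1])
```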
4

Essays on DSGE Models and Bayesian Estimation

Kim, Jae-yoon 11 June 2018 (has links)
This thesis explores the theory and practice of sovereignty. I begin with a conceptual analysis of sovereignty, examining its theological roots in contrast with its later influence in contestations over political authority. Theological debates surrounding God’s sovereignty dealt not with the question of legitimacy, which would become important for political sovereignty, but instead with the limits of his ability. Read as an ontological capacity, sovereignty is coterminous with an existent’s activity in the world. As lived, this capacity is regularly limited by the ways in which space is produced via its representations, its symbols, and its practices. All collective appropriations of space have a nomos that characterizes their practice. Foucault’s account of “biopolitics” provides an account of how contemporary materiality is distributed, an account that can be supplemented by sociological typologies of how city space is typically produced. The collective biopolitical distribution of space expands the range of practices that representationally legibilize activity in the world, thereby expanding the conceptual limits of existents and what it means for them to act up to the borders of their capacity, i.e., to practice sovereignty. The desire for total authorial capacity expresses itself in relations of domination and subordination that never erase the fundamental precarity of subjects, even as these expressions seek to disguise it. I conclude with a close reading of narratives recounting the lives of residents in Chicago’s Englewood, reading their activity as practices of sovereignty which manifest variously as they master and produce space. / Ph. D.
5

On modeling the volatility in speculative prices

Hou, Zhijie 12 June 2014 (has links)
Following the Probabilistic Reduction (PR) approach, this dissertation proposes the Student's t Autoregressive (St-AR) model, the Student's t Vector Autoregressive (St-VAR) model, and their heterogeneous versions, as an alternative to the various ARCH-type models, to capture univariate and multivariate volatility. The St-AR and St-VAR models differ from the latter volatility models because they give rise to internally consistent statistical models that do not rely on ad hoc specification and parameter restrictions, but model the conditional mean and conditional variance jointly. The univariate modeling is illustrated using the Real Effective Exchange Rate (REER) indices of three mainstream currencies in Asia (the RMB, the Hong Kong dollar, and the Taiwan dollar), while the multivariate volatility modeling is applied to investigate the relationship between the REER indices and stock price indices in mainland China, as well as the relationship between the stock prices in mainland China and Hong Kong. Following the PR methodology, the information gained in Mis-Specification (M-S) testing leads to respecification strategies from the original Normal (V)AR models to the St-(V)AR models. The results from formal M-S tests and forecasting performance indicate that the St-(V)AR models provide a more appropriate way to model volatility for certain types of speculative price data. / Ph. D.
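As a simplified illustration of the heavy-tailed modeling idea, the sketch below fits an AR(1) with Student's t errors by maximum likelihood on placeholder return data. The actual St-AR model parameterizes the conditional mean and conditional variance jointly from a multivariate Student's t distribution, so this shows only the basic likelihood mechanics, not the dissertation's estimator.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
returns = 0.01 * stats.t.rvs(df=5, size=500, random_state=rng)   # placeholder return series

def neg_loglik(params, y):
    c, phi, log_sigma, log_nu = params
    resid = y[1:] - c - phi * y[:-1]                       # AR(1) conditional mean
    sigma, nu = np.exp(log_sigma), 2.0 + np.exp(log_nu)    # keep sigma > 0 and nu > 2
    return -np.sum(stats.t.logpdf(resid, df=nu, scale=sigma))

start = [0.0, 0.1, np.log(returns.std()), np.log(3.0)]
fit = optimize.minimize(neg_loglik, x0=start, args=(returns,), method="Nelder-Mead")
c_hat, phi_hat, log_sigma_hat, log_nu_hat = fit.x
print(f"phi = {phi_hat:.3f}, sigma = {np.exp(log_sigma_hat):.4f}, nu = {2 + np.exp(log_nu_hat):.2f}")
```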
6

Generalized Principal Component Analysis

Solat, Karo 05 June 2018 (has links)
The primary objective of this dissertation is to extend classical Principal Component Analysis (PCA), which aims to reduce the dimensionality of a large number of Normal, interrelated variables, in two directions. The first is to go beyond the static (contemporaneous or synchronous) covariance matrix among these interrelated variables to include certain forms of temporal (over time) dependence. The second takes the form of extending the PCA model beyond the Normal multivariate distribution to the Elliptically Symmetric family of distributions, which includes the Normal, the Student's t, the Laplace, and the Pearson type II distributions as special cases. The result of these extensions is called Generalized Principal Component Analysis (GPCA). GPCA is illustrated using both Monte Carlo simulations and an empirical study, in an attempt to demonstrate the enhanced reliability of these more general factor models in the context of out-of-sample forecasting. The empirical study examines the predictive capacity of the GPCA method in the context of exchange rate forecasting, showing how the GPCA method dominates forecasts based on existing standard methods, including random walk models, with or without macroeconomic fundamentals. / Ph. D.
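A small illustration of the second direction, under assumed settings: classical PCA rests on the eigen-decomposition of the sample covariance, while a Student's t member of the elliptical family leads to a reweighted (heavy-tail robust) covariance estimate whose eigen-decomposition can differ noticeably. This is not the dissertation's GPCA estimator, just a sketch of why the distributional assumption matters.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_t(df=4, size=(300, 5))           # heavy-tailed placeholder data
Xc = X - X.mean(axis=0)
n, p = Xc.shape

cov_normal = Xc.T @ Xc / n                        # classical (Normal) PCA input
cov_t = cov_normal.copy()
for _ in range(50):                               # EM-style reweighting for a t scatter matrix, nu = 4
    d2 = np.einsum("ij,jk,ik->i", Xc, np.linalg.inv(cov_t), Xc)   # Mahalanobis distances
    w = (4 + p) / (4 + d2)
    cov_t = (Xc * w[:, None]).T @ Xc / n

for name, cov in [("Normal PCA", cov_normal), ("t-based PCA", cov_t)]:
    eigvals = np.linalg.eigvalsh(cov)[::-1]       # eigenvalues, largest first
    print(f"{name}: leading eigenvalue share = {eigvals[0] / eigvals.sum():.3f}")
```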
7

Revisiting the CAPM and the Fama-French Multi-Factor Models: Modeling Volatility Dynamics in Financial Markets

Michaelides, Michael 25 April 2017 (has links)
The primary objective of this dissertation is to revisit the CAPM and the Fama-French multi-factor models with a view to evaluating the validity of the probabilistic assumptions imposed (directly or indirectly) on the particular data used. By thoroughly testing the assumptions underlying these models, several departures are found, and the original linear regression models are respecified. The respecification results in a family of heterogeneous Student's t models which are shown to account for all the statistical regularities in the data. This family of models provides an appropriate basis for revisiting the empirical adequacy of the CAPM and the Fama-French multi-factor models, as well as other models, such as alternative asset pricing models and risk evaluation models. Along the lines of providing a sound basis for reliable inference, the respecified models can serve as a coherent basis for selecting the relevant factors from the set of possible ones. The latter contributes to enhancing the substantive adequacy of the CAPM and the multi-factor models. / Ph. D.
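For orientation, a minimal sketch of the first step described above, using made-up data: the CAPM regression of an asset's excess return on the market's excess return, followed by the Normality check whose rejection motivates a Student's t respecification. The dissertation's heterogeneous dynamic Student's t models go well beyond this.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(5)
mkt_excess = 0.005 + 0.04 * rng.standard_normal(600)            # market excess returns
noise = 0.02 * stats.t.rvs(df=4, size=600, random_state=rng)    # heavy-tailed disturbances
asset_excess = 0.001 + 1.1 * mkt_excess + noise                 # asset excess returns

capm = sm.OLS(asset_excess, sm.add_constant(mkt_excess)).fit()
alpha_hat, beta_hat = capm.params
jb_stat, jb_p = stats.jarque_bera(capm.resid)

print(f"alpha = {alpha_hat:.4f}, beta = {beta_hat:.3f}")
print(f"Jarque-Bera p-value for residual Normality: {jb_p:.4f}")
```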
