1

Two Essays on Resource Economics: A Study of the Statistical Evidence for Global Warming and An Analysis of Overcompliance with Effluent Standards Among Wastewater Treatment Plants

Akobundu, Eberechukwu 02 December 2004
These papers analyze two issues in resource economics that are currently debated in academic and policy arenas: global warming and overcompliant behavior among regulated sources of water pollution. The first paper examines the evidence for global warming, in particular the published estimates of the rate of global warming. The paper reproduces published results using the same data, provides evidence that the statistical model used to obtain these estimates is misspecified for the data, and re-specifies the model in order to obtain a statistically adequate model. The re-specified model indicates that trends in the surface temperature anomalies are highly nonlinear rather than linear, and that currently published estimates of the degree of global warming are based on a misspecified model. The paper argues for caution in interpreting linear trend estimates and illustrates the importance of model misspecification testing and re-specification when modeling climate change with statistical models. The second paper examines recent evidence of overcompliant behavior among wastewater treatment plants whose pollutant discharges are regulated under the Clean Water Act. The historical evidence suggests that many regulated facilities do not comply with permit regulations. This behavior has been attributed to inadequate monitoring and enforcement by the regulatory agencies, as well as to an institutional structure that penalizes noncompliance but does not reward overcompliance. Against this backdrop, the evidence for significant and widespread overcompliance appears puzzling. The paper examines overcompliance with a widely regulated pollutant, biochemical oxygen demand (BOD). The testable hypotheses are whether jointness in pollution control between nitrogen and BOD can explain overcompliance, and whether variation in BOD output can explain BOD overcompliance. These hypotheses are examined by developing a conceptual model of BOD overcompliance and estimating a model of BOD control. The results indicate that jointness in pollution control plays a significant role in explaining BOD overcompliance, whereas variation in BOD output is not a significant factor. The paper explores plausible reasons for this result and proposes significant modifications to the traditional marginal analysis of BOD overcompliance/compliance decisions. / Ph. D.
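To make the kind of misspecification testing described in the first paper concrete, the sketch below fits a linear trend to a temperature-anomaly series, checks the residual assumptions, and then compares the linear trend against a nonlinear (quadratic) respecification. The data, the choice of tests, and the library (Python's statsmodels) are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera
from statsmodels.stats.diagnostic import acorr_ljungbox, het_breuschpagan

# Illustrative stand-in for an annual surface-temperature anomaly series
# (not the data used in the thesis).
rng = np.random.default_rng(0)
t = np.arange(150.0)
anomaly = 0.004 * t + 0.00005 * (t - 75.0) ** 2 + rng.normal(0.0, 0.1, t.size)

# Linear trend model: anomaly_t = b0 + b1*t + u_t, with u_t assumed NIID.
X_lin = sm.add_constant(t)
lin = sm.OLS(anomaly, X_lin).fit()

# Misspecification tests on the residuals of the linear-trend model.
jb_stat, jb_pval, _, _ = jarque_bera(lin.resid)                      # normality
lb_pval = acorr_ljungbox(lin.resid, lags=[10])["lb_pvalue"].iloc[0]  # independence (recent statsmodels returns a DataFrame)
bp_pval = het_breuschpagan(lin.resid, X_lin)[1]                      # homoskedasticity

# Respecification check: does adding curvature (t^2) improve on the linear trend?
X_quad = sm.add_constant(np.column_stack([t, t ** 2]))
quad = sm.OLS(anomaly, X_quad).fit()
f_stat, f_pval, _ = quad.compare_f_test(lin)

print(f"JB p={jb_pval:.3f}  LB p={lb_pval:.3f}  BP p={bp_pval:.3f}  linear-vs-quadratic F p={f_pval:.3f}")
```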
2

Confronting Theory with Data: the Case of DSGE Modeling

Poudyal, Niraj 07 December 2012
The primary objective of this study is to confront the DSGE model of Ireland (2011) with data in an attempt to evaluate its empirical adequacy. The perspective used for this evaluation is based on unveiling the statistical model (a structural VAR) behind the DSGE model, with a view to testing its probabilistic assumptions vis-a-vis the data. It is shown that the implicit statistical model is seriously misspecified, and the information from misspecification (M-S) testing is then used to respecify the original structural VAR in an attempt to achieve statistical adequacy. The latter provides a precondition for the reliability of any inference based on the statistical model. Once the statistical adequacy of the respecified model is secured through thorough M-S testing, inferences such as the likelihood-ratio test of the overidentifying restrictions, forecasting, and impulse response analysis are applied to the original DSGE model to evaluate its empirical adequacy. Finally, the same inferential procedure is applied to the CAPM. / Ph. D.
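The "statistical model behind the DSGE model" referred to above is a (structural) VAR; a minimal sketch of checking a VAR's probabilistic assumptions might look as follows. The data, lag length, and tests are illustrative assumptions, not the thesis's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Illustrative bivariate macro series (stand-ins, not the Ireland (2011) data set).
rng = np.random.default_rng(1)
T = 300
e = rng.normal(size=(T, 2))
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + e[t]           # simple VAR(1) data-generating process
data = pd.DataFrame(y, columns=["output_gap", "inflation"])

# Fit the reduced-form VAR that stands behind the structural model.
res = VAR(data).fit(maxlags=4, ic="aic")

# Misspecification (M-S) checks on the VAR residuals.
whiteness = res.test_whiteness(nlags=10)   # no remaining residual autocorrelation
normality = res.test_normality()           # multivariate normality of the errors
print(res.k_ar, whiteness.pvalue, normality.pvalue)
```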
3

A Statistical Approach to Empirical Macroeconomic Modeling with Practical Applications

Edwards, Jeffrey A. 24 April 2003
Most empirical modeling involves the use of Ordinary Least Squares regression, where the residuals are assumed to be normal, independent, and identically distributed. In finite samples these assumptions become critical for accurate estimation; in macroeconomics in particular, however, they are rarely tested. This study addresses the application of statistical testing methods and model respecification within the context of applied macroeconomics. The first application is a statistical comparison of Gregory Mankiw, David Romer, and David Weil's "A Contribution to the Empirics of Economic Growth" and Nazrul Islam's "Growth Empirics: A Panel Data Approach." This analysis shows that the models in both papers are statistically misspecified. When respecified, the functional forms of Mankiw, Romer, and Weil's models change considerably, whereas Islam's retain their theoretical structure. The second application is a study of the impact of inflation on investment and growth. After instrumenting for inflation with a set of political variables, I find that between approximately 1% and 9% inflation there is a positive correlation between inflation and investment, so the Mundell-Tobin effect may be a valid explanation. I extend this analysis to show that treating investment as an exogenous variable may be problematic in empirical growth models. / Ph. D.
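As a rough illustration of the instrumenting step mentioned above, the following sketch writes out two-stage least squares explicitly, with made-up variables standing in for inflation, growth, and the political instruments; it is not the thesis's data or model.

```python
import numpy as np
import statsmodels.api as sm

# Made-up data: growth depends on (endogenous) inflation; two "political"
# variables serve as instruments for inflation.
rng = np.random.default_rng(2)
n = 200
political = rng.normal(size=(n, 2))                   # instruments (illustrative)
v = rng.normal(size=n)                                # unobserved shock causing endogeneity
inflation = political @ np.array([0.6, -0.4]) + v
growth = 0.02 - 0.3 * inflation + 0.5 * v + rng.normal(0.0, 0.1, n)

# Stage 1: project the endogenous regressor on the instruments.
Z = sm.add_constant(political)
stage1 = sm.OLS(inflation, Z).fit()

# Stage 2: use the stage-1 fitted values in the growth equation.
X2 = sm.add_constant(stage1.fittedvalues)
stage2 = sm.OLS(growth, X2).fit()
print(stage2.params)  # note: proper 2SLS standard errors need the usual correction
```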
4

Essays on DSGE Models and Bayesian Estimation

Kim, Jae-yoon 11 June 2018
For an empirical analysis, the statistical model implicit in the theoretical model is crucial. The statistical model is simply the set of probabilistic assumptions imposed on the data, and invalid probabilistic assumptions undermine the reliability of statistical inference, rendering the empirical analysis untrustworthy. Hence, to secure trustworthy evidence one should always validate the implicit statistical model before drawing any empirical result from a theoretical model. This perspective is used to shed light on a widely used category of macroeconometric models known as Dynamic Stochastic General Equilibrium (DSGE) models. Using U.S. time-series data, the paper demonstrates that a widely used econometric model for the U.S. economy is severely statistically misspecified; almost all of its probabilistic assumptions are invalid for the data. The paper proceeds to respecify the implicit statistical model behind the theoretical model with a view to securing its statistical adequacy (the validity of its probabilistic assumptions). Using the respecified statistical model, the paper calls into question the literature evaluating the theoretical adequacy of current DSGE models, since such evaluations are untrustworthy when they are based on statistically unreliable procedures. / Ph. D.
5

Five contributions to econometric theory and the econometrics of ultra-high-frequency data

Meitz, Mika January 2006
No description available.
6

Four Essays on Building Conditional Correlation GARCH Models.

Nakatani, Tomoaki January 2010
This thesis consists of four research papers. The main focus is on building multivariate Conditional Correlation (CC-) GARCH models. In particular, emphasis lies on an extension of CC-GARCH models that allows for interactions, or causality, in the conditional variances. In the first three chapters, misspecification testing and parameter restrictions in these models are discussed. In the final chapter, a computer package for building major variants of the CC-GARCH models is presented. The first chapter contains a brief introduction to the CC-GARCH models as well as a summary of each research paper. The second chapter proposes a misspecification test for the modelling of the conditional variance part of the Extended Constant CC-GARCH model. The test is designed for testing the hypothesis of no interactions in the conditional variances. If the null hypothesis is true, then the conditional variances may be described by the standard CCC-GARCH model. The test is constructed on the Lagrange multiplier (LM) principle and only requires the estimation of the null model. Although the test is derived under the assumption of constant conditional correlations, the simulation experiments suggest that it is also applicable when building CC-GARCH models with changing conditional correlations. There is no asymptotic theory available for these models, which is why simulating the test statistic in this situation has been necessary. The third chapter provides yet another misspecification test for the modelling of the conditional variance component of the CC-GARCH models, whose parameters are often estimated in two steps. The estimator obtained through these two steps is a two-stage quasi-maximum likelihood estimator (2SQMLE). Taking advantage of the asymptotic results for the 2SQMLE, the test considered in this chapter is formulated on the LM principle and requires only the estimation of univariate GARCH models. It is also shown that the test statistic may be computed by an auxiliary regression, and a robust version of the new test is available through another auxiliary regression. All of this amounts to a substantial simplification in computations compared with the test proposed in the second chapter. The simulation experiments show that, under both Gaussian and leptokurtic innovations, as well as under changing conditional correlations, the new test has reasonable size and power properties. When modelling the conditional variance, it is necessary to keep the sequence of conditional covariance matrices positive definite almost surely for any time horizon. The fourth chapter demonstrates that under certain conditions some of the parameters of the model can take negative values while the conditional covariance matrix remains positive definite almost surely. It is also shown that even in the simplest first-order vector GARCH representation, the relevant parameter space can contain negative values for some parameters, which is not possible in the univariate model. This finding makes it possible to incorporate negative volatility spillovers into the CC-GARCH framework. Many new GARCH models and misspecification testing procedures have recently been proposed in the literature. When it comes to applying these models or tests, however, there do not seem to be many options for users other than creating their own computer programmes. This is especially the case when one wants to apply a multivariate GARCH model.
The last chapter of the thesis offers a remedy to this situation by providing a workable environment for building CC-GARCH models. The package is open source, freely available on the Internet, and designed for use in the open-source statistical environment R. With this package the user can estimate major variants of CC-GARCH models as well as simulate data from CC-GARCH data-generating processes with multivariate normal or Student's t innovations. In addition, the package is equipped with the necessary functions for conducting diagnostic tests such as those discussed in the third chapter of this thesis. / Diss. Stockholm: Handelshögskolan, 2010. Summary together with 4 papers.
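The thesis's package targets R; purely as an illustration of the two-step (2SQML) logic behind CCC-GARCH estimation (univariate GARCH fits first, then a constant correlation estimated from the standardized residuals), here is a hedged sketch in Python using the arch package on simulated data.

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Simulated pair of return series (stand-ins for real data).
rng = np.random.default_rng(3)
rets = pd.DataFrame(rng.standard_t(df=8, size=(1000, 2)), columns=["asset1", "asset2"])

# Step 1: estimate a univariate GARCH(1,1) for each series by quasi-maximum likelihood.
std_resid = {}
for col in rets:
    res = arch_model(rets[col], vol="GARCH", p=1, q=1).fit(disp="off")
    std_resid[col] = res.resid / res.conditional_volatility

# Step 2: under constant conditional correlation, the correlation matrix is
# estimated from the standardized residuals of step 1.
R = pd.DataFrame(std_resid).corr()
print(R)
```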
7

Four essays on the econometric modelling of volatility and durations

Amado, Cristina January 2009
The thesis "Four Essays on the Econometric Modelling of Volatility and Durations" consists of four research papers in the area of financial econometrics, on the modelling of financial market volatility and the econometrics of ultra-high-frequency data. The aim of the thesis is to develop new econometric methods for modelling and hypothesis testing in these areas. The second chapter introduces a new model, the time-varying GARCH (TV-GARCH) model, in which volatility has a smooth time-varying structure of either additive or multiplicative type. To characterize smooth changes in the (un)conditional variance, we assume that the parameters vary smoothly over time according to the logistic transition function. A data-based modelling technique is used for specifying the parametric structure of the TV-GARCH models. This is done by testing a sequence of hypotheses with Lagrange multiplier tests presented in the chapter. Misspecification tests are also provided for evaluating the adequacy of the estimated model. The third chapter addresses the issue of modelling deterministic changes in the unconditional variance over a long return series. The modelling strategy is illustrated with an application to the daily returns of the Dow Jones Industrial Average (DJIA) index from 1920 until 2003. The empirical results sustain the hypothesis that the assumption of a constant unconditional variance is not adequate over long return series and indicate that deterministic changes in the unconditional variance may be associated with macroeconomic factors. In the fourth chapter we propose an extension of the univariate multiplicative TV-GARCH model to the multivariate Conditional Correlation GARCH (CC-GARCH) framework. The variance equations are parameterized such that they combine the long-run and the short-run dynamic behaviour of the volatilities. In this framework, the long-run behaviour is described by the individual unconditional variances, which are allowed to vary smoothly over time according to the logistic transition function. The effects of modelling the nonstationary variance component are examined empirically in several CC-GARCH models using pairs of seven daily stock return series from the S&P 500 index. The results show that the magnitude of such an effect varies across stock series and depends on the structure of the conditional correlation matrix. An important feature of financial durations is the evidence of a strong diurnal variation over the trading day. In the fifth chapter we propose a new parameterization for describing the diurnal pattern of trading activity. The parametric structure of the diurnal component allows the duration process to change smoothly over the time of day according to the logistic transition function. The empirical results suggest that the diurnal variation may not always have the inverted-U-shaped pattern for trade durations documented in earlier studies.
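The logistic transition function mentioned above can be written out directly. The sketch below shows one illustrative form, g(t/T) = 1 / (1 + exp(-gamma*(t/T - c))), driving a multiplicative long-run variance component on top of a GARCH(1,1) recursion; the parameter values and the exact decomposition are assumptions made for illustration, not estimates or equations from the thesis.

```python
import numpy as np

def logistic_transition(t_frac, gamma, c):
    """Smooth transition g in (0, 1) as a function of rescaled time t/T."""
    return 1.0 / (1.0 + np.exp(-gamma * (t_frac - c)))

# Illustrative multiplicative decomposition: sigma2_t = h_t * g_t, where g_t shifts
# the unconditional variance smoothly over the sample and h_t is a GARCH(1,1)
# recursion on the g-standardized returns (parameter values are made up).
T = 1000
t_frac = np.arange(1, T + 1) / T
g = 1.0 + 2.0 * logistic_transition(t_frac, gamma=25.0, c=0.5)   # long-run component

omega, alpha, beta = 0.05, 0.05, 0.90
rng = np.random.default_rng(4)
h = np.empty(T)
r = np.empty(T)
h[0] = omega / (1.0 - alpha - beta)
r[0] = np.sqrt(h[0] * g[0]) * rng.standard_normal()
for t in range(1, T):
    h[t] = omega + alpha * (r[t - 1] ** 2 / g[t - 1]) + beta * h[t - 1]
    r[t] = np.sqrt(h[t] * g[t]) * rng.standard_normal()
```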
8

Revisiting the CAPM and the Fama-French Multi-Factor Models: Modeling Volatility Dynamics in Financial Markets

Michaelides, Michael 25 April 2017
The primary objective of this dissertation is to revisit the CAPM and the Fama-French multi-factor models with a view to evaluating the validity of the probabilistic assumptions imposed (directly or indirectly) on the particular data used. By thoroughly testing the assumptions underlying these models, several departures are found and the original linear regression models are respecified. The respecification results in a family of heterogeneous Student's t models which are shown to account for all the statistical regularities in the data. This family of models provides an appropriate basis for revisiting the empirical adequacy of the CAPM and the Fama-French multi-factor models, as well as other models, such as alternative asset pricing models and risk evaluation models. Along the lines of providing a sound basis for reliable inference, the respecified models can serve as a coherent basis for selecting the relevant factors from the set of possible ones. The latter contributes to the enhancement of the substantive adequacy of the CAPM and the multi-factor models. / Ph. D. / The primary objective of this dissertation is to revisit the CAPM and the Fama-French multi-factor models with a view to evaluating the validity of the probabilistic assumptions imposed (directly or indirectly) on the particular data used. By probing for potential departures from the Normality, Linearity, Homoskedasticity, Independence, and t-invariance assumptions, it is shown that the assumptions implicitly imposed on these empirical asset pricing models are inappropriate. In light of these results, the probabilistic assumptions underlying the CAPM and the Fama-French multi-factor models are replaced with the Student's t, Linearity, Heteroskedasticity, Markov Dependence, and t-heterogeneity assumptions. The new probabilistic structure results in a family of heterogeneous Student's t models which are shown to account for all the statistical regularities in the data. This family of models provides an appropriate basis for revisiting the empirical adequacy of the CAPM and the Fama-French multi-factor models, as well as other models, such as alternative asset pricing models and risk evaluation models. Along the lines of providing a sound basis for reliable statistical inference, the proposed models can serve as a coherent basis for selecting the potential sources of risk from a set of possible ones. The latter contributes to the enhancement of the substantive adequacy of the CAPM and the multi-factor models.
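As a point of reference for the respecification described above, the sketch below estimates the CAPM as a simple linear regression of excess returns and then asks whether the residuals look Normal or closer to a Student's t. The simulated data and the simple diagnostics are illustrative assumptions, not the dissertation's respecified model.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import jarque_bera
from scipy import stats

# Simulated excess returns with fat-tailed errors (stand-ins for real data).
rng = np.random.default_rng(5)
n = 1000
mkt_excess = rng.standard_t(df=5, size=n) * 0.01
asset_excess = 0.0005 + 1.2 * mkt_excess + rng.standard_t(df=5, size=n) * 0.01

# CAPM regression: r_i - r_f = alpha + beta * (r_m - r_f) + u
X = sm.add_constant(mkt_excess)
capm = sm.OLS(asset_excess, X).fit()

# Do the residuals look Normal, or closer to a Student's t?
jb_stat, jb_pval, _, _ = jarque_bera(capm.resid)
df_hat, loc_hat, scale_hat = stats.t.fit(capm.resid)
print(f"beta={capm.params[1]:.2f}  Jarque-Bera p={jb_pval:.4f}  fitted t degrees of freedom={df_hat:.1f}")
```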
9

Naturalism & Objectivity: Methods and Meta-methods

Miller, Jean Anne 19 August 2011
The error statistical account provides a basic account of evidence and inference. Formally, the approach is a re-interpretation of standard frequentist (Fisherian, Neyman-Pearson) statistics. Informally, it gives an account of inductive inference based on arguing from error, an analog of frequentist statistics, which keeps the concept of error probabilities central to the evaluation of inferences and evidence. Error statistical work at present tends to remain distinct from other approaches of naturalism and social epistemology in philosophy of science and, more generally, Science and Technology Studies (STS). My goal is to employ the error statistical program in order to address a number of problems with approaches in philosophy of science that fall under two broad headings: (1) naturalistic philosophy of science and (2) social epistemology. The naturalistic approaches that I am interested in examining seek to provide us with an account of scientific and meta-scientific methodologies that will avoid extreme skepticism, relativism, and subjectivity, and claim to teach us something about scientific inferences and the evidence produced by experiments (broadly construed). I argue that these accounts fail to identify a satisfactory program for achieving those goals; moreover, to the extent that they succeed, it is by latching on to the more general principles and arguments of error statistics. In sum, I apply the basic ideas from error statistics and use them to examine (and improve upon) an area to which they have not yet been applied, namely assessing and pushing forward these interdisciplinary pursuits involving naturalistic philosophies of science that appeal to cognitive science, psychology, the scientific record, and a variety of social epistemologies. / Ph. D.
