61 |
Time series analysis of some economic and ecological data. January 1984
by Man Ka Sing. / Bibliography: leaves 69-70 / Thesis (M.Phil.)--Chinese University of Hong Kong, 1984
|
62 |
A quarterly macroeconometric model of the Hong Kong economy. January 1982
by Yik Yiu Man. / Bibliography: leaves 133-137 / Thesis (M.Phil.)--Chinese University of Hong Kong, 1982
|
63 |
Essays on forecasting and volatility modelling. Dias, Gustavo Fruet. January 2013
This thesis contributes to four distinct fields in the econometrics literature: forecasting macroeconomic variables using large datasets, volatility modelling, risk premium estimation and iterative estimators. As a research output, this thesis presents a balance of applied econometrics and econometric theory, with the latter covering the asymptotic theory of iterative estimators under different models and mapping specifications. In Chapter 1 we introduce and motivate the estimation tools for large datasets, volatility modelling and the use of iterative estimators.

In Chapter 2, we address the issue of forecasting macroeconomic variables using medium and large datasets by adopting vector autoregressive moving average (VARMA) models. We overcome the estimation issue that arises with this class of models by implementing the iterative ordinary least squares (IOLS) estimator. We establish consistency and the asymptotic distribution for the ARMA(1,1) case and argue that these results can be extended to the multivariate case. Monte Carlo results show that IOLS is consistent and feasible for large systems, and outperforms the maximum likelihood (MLE) estimator when the sample size is small. Our empirical application shows that VARMA models outperform the AR(1) (first-order autoregressive) and vector autoregressive (VAR) models across different model dimensions.

Chapter 3 proposes a new robust estimator for GARCH-type models: the nonlinear iterative least squares (NL-ILS) estimator. This estimator is especially useful in specifications where the errors have some degree of dependence over time or when the conditional variance is misspecified. We illustrate the NL-ILS estimator by providing algorithms for the GARCH(1,1), weak-GARCH(1,1), GARCH(1,1)-in-mean and RealGARCH(1,1)-in-mean models. I establish the consistency and asymptotic distribution of the NL-ILS estimator for the GARCH(1,1) model under assumptions that are compatible with the quasi-maximum likelihood (QMLE) estimator. The consistency result is extended to the weak-GARCH(1,1) model, and a further extension of the asymptotic results to the GARCH(1,1)-in-mean case is also discussed. A Monte Carlo study provides evidence that the NL-ILS estimator is consistent and outperforms the MLE benchmark in a variety of specifications. Moreover, when the conditional variance is misspecified, the MLE estimator delivers biased estimates of the parameters in the mean equation, whereas the NL-ILS estimator does not. The empirical application investigates the risk premium on the CRSP, S&P500 and S&P100 indices. I document the risk premium parameter to be significant only for the CRSP index when using the robust NL-ILS estimator. We argue that this comes from the wider composition of the CRSP index, which resembles the market more closely than the S&P500 and S&P100 indices. This finding holds at daily, weekly and monthly frequencies and is corroborated by a series of robustness checks.

Chapter 4 assesses the evolution of the risk premium parameter over time. To this purpose, we introduce a new class of volatility-in-mean model, the time-varying GARCH-in-mean (TVGARCH-in-mean) model, which allows the risk premium parameter to evolve stochastically as a random walk process. We show that the kernel-based NL-ILS estimator successfully estimates the time-varying risk premium parameter, presenting good finite sample performance. Regarding the empirical study, we find evidence that the risk premium parameter is time-varying, oscillating between negative and positive values. Chapter 5 concludes by pointing to the relevance of iterative estimators relative to the standard MLE framework, as well as the contributions to the applied econometrics, financial econometrics and econometric theory literatures.
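The iterative least squares idea described in this abstract can be illustrated with a minimal sketch for a univariate ARMA(1,1): the unobservable lagged innovation is replaced by the previous iteration's residuals and OLS is re-run until the estimates settle. This is an illustrative reading of the IOLS logic, not the thesis's exact algorithm; the simulation and all names below are hypothetical.

```python
import numpy as np

def iols_arma11(y, n_iter=50, tol=1e-8):
    """Iterative OLS for y_t = c + phi*y_{t-1} + theta*e_{t-1} + e_t.
    The unobserved lagged innovation is replaced by the previous
    iteration's residuals and OLS is re-run until convergence."""
    T = len(y)
    e = np.zeros(T)                        # initial guess for the innovations
    params_old = np.zeros(3)
    for _ in range(n_iter):
        X = np.column_stack([np.ones(T - 1), y[:-1], e[:-1]])
        params, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
        resid = y[1:] - X @ params
        e = np.concatenate([[0.0], resid])   # updated innovation estimates
        if np.max(np.abs(params - params_old)) < tol:
            break
        params_old = params
    return params                           # (c, phi, theta)

# simulate an ARMA(1,1) and estimate it
rng = np.random.default_rng(0)
T, phi, theta = 2000, 0.7, 0.3
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]
print(iols_arma11(y))
```

The same iterate-on-residuals logic is what makes a least-squares treatment of moving-average terms (and, via the squared-return ARMA representation, of GARCH-type models) feasible without simulation-based likelihoods.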
|
64 |
Assessing marketing resource allocation in retail. Valenti, Albert. 30 June 2018
This dissertation examines two problems retailers face when assessing their marketing resource allocation. In the first chapter, I develop a conceptual framework and modeling approach to help retailers assess how online and offline marketing effectiveness varies by channel, customer value segment, and country. In the main application, using a retail dataset from six countries, I estimate Hierarchical Linear and Cross-Random Effects models and find that own- and cross-channel sales responsiveness to online and offline marketing varies by value segment and country. Specifically, direct mail drives offline sales for prospects across all countries, and email drives both online and offline sales across customer segments in half of the countries. Customer value is the key driver of offline sales, while country is a key driver of online sales. I validate the findings with a second retail dataset and a field experiment. The different elasticities and customer segment sizes across countries imply a different marketing resource allocation from the status quo. The findings generate actionable implications for theory and managers.
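A simplified sketch of how elasticities that vary by country might be estimated is a mixed model with country-specific intercepts and slopes. This is only a stand-in for the richer Hierarchical Linear and Cross-Random Effects specifications referenced above; the data and column names are simulated and hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the (proprietary) retail panel; all column names are hypothetical.
rng = np.random.default_rng(1)
n = 1200
df = pd.DataFrame({
    "country": rng.choice(list("ABCDEF"), size=n),
    "segment": rng.choice(["prospect", "low_value", "high_value"], size=n),
    "log_email": rng.normal(size=n),
    "log_direct_mail": rng.normal(size=n),
})
df["log_sales"] = 0.2 * df["log_email"] + 0.1 * df["log_direct_mail"] + rng.normal(size=n)

# Random intercept and email elasticity by country; a simplified stand-in for the
# hierarchical / cross-random-effects specifications used in the chapter.
model = smf.mixedlm("log_sales ~ log_email + log_direct_mail + C(segment)",
                    data=df, groups="country", re_formula="~log_email")
result = model.fit()
print(result.summary())
```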
In the second chapter, I quantify the simultaneous effects of customer satisfaction (CS) and employee satisfaction (ES) on cross-buying. I jointly model the effects of CS and ES on cross-buying probability, controlling for customer heterogeneity and time effects. The model accounts for nonlinear and asymmetric effects of satisfaction. Moreover, I examine whether the effects of CS and ES on cross-buying are non-monotonic. I employ panel datasets at the individual customer and employee level on transactions and satisfaction from a leading car rental company. The results of the empirical analysis lead to four main findings. First, CS and ES have simultaneous effects on cross-buying probability. Second, the effect size of ES is about 2.7 times larger than that of CS. Third, the relationship between satisfaction and cross-buying is concave and non-monotonic: for low satisfaction levels, an increase in satisfaction leads to higher cross-buying, but for high levels of satisfaction, an increase in satisfaction leads to lower cross-buying. Fourth, CS and ES do not have an interaction effect on cross-buying probability.
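The concave, non-monotonic pattern described here can be illustrated with a logit that includes linear and squared satisfaction terms. This is only a toy illustration of the functional form, not the chapter's model; the data and variable names are simulated and hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the car-rental panel; variable names are hypothetical.
rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({"cs": rng.uniform(1, 10, n), "es": rng.uniform(1, 10, n)})
# build in a concave effect: cross-buying rises with satisfaction, then falls
xb = -3 + 1.0 * df["cs"] - 0.07 * df["cs"] ** 2 + 1.2 * df["es"] - 0.08 * df["es"] ** 2
df["cross_buy"] = rng.binomial(1, 1 / (1 + np.exp(-xb)))

# A logit with linear and squared satisfaction terms captures the concave,
# non-monotonic pattern; negative squared-term coefficients indicate concavity.
logit = smf.logit("cross_buy ~ cs + I(cs**2) + es + I(es**2)", data=df).fit()
print(logit.params)
```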
|
65 |
Three Essays on Identification in Microeconometrics. Kim, Ju Hyun. January 2014
My dissertation consists of three chapters that concern identification in microeconometrics. The first two chapters discuss partial identification of distributional treatment effects in causal inference models. The third chapter, which is joint work with Pierre-Andre Chiappori, studies identification of structural parameters in collective consumption models in labor economics.
In the first chapter, I consider partial identification of the distribution of treatment effects (DTE) when the marginal distributions of potential outcomes are fixed and restrictions are imposed on the support of potential outcomes. Examples of such support restrictions include monotone treatment response, concave or convex treatment response, and the Roy model of self-selection. Establishing informative bounds on the DTE is difficult because it involves constrained optimization over the space of joint distributions. I formulate the problem as an optimal transportation linear program and develop a new dual representation to characterize the general identification region with respect to the marginal distributions. I use this result to derive informative bounds for economic examples. I also propose an estimation procedure and illustrate the usefulness of my approach in the context of an empirical analysis of the effects of smoking on infant birth weight. The empirical results show that monotone treatment response has substantial identifying power for the DTE when the marginal distributions are given.
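A discrete toy version of the optimal-transportation formulation makes the idea concrete: with the marginals of the potential outcomes fixed and a monotone-treatment-response support restriction imposed, DTE bounds are the extreme values of a linear program over joint distributions. The chapter works with general distributions and a dual representation; the sketch below is only the primal linear program on a tiny support with hypothetical marginals.

```python
import numpy as np
from scipy.optimize import linprog

# Discrete potential outcomes on {0, 1, 2} with fixed (hypothetical) marginals.
support = np.array([0, 1, 2])
p0 = np.array([0.5, 0.3, 0.2])    # marginal distribution of Y0
p1 = np.array([0.2, 0.3, 0.5])    # marginal distribution of Y1
k = len(support)

# decision variable: joint probabilities pi[i, j] = P(Y0 = i, Y1 = j), flattened
c = np.array([1.0 if support[j] - support[i] >= 1 else 0.0
              for i in range(k) for j in range(k)])    # indicator of a gain of at least 1

# equality constraints: row sums equal p0, column sums equal p1
A_eq = np.zeros((2 * k, k * k))
for i in range(k):
    A_eq[i, i * k:(i + 1) * k] = 1.0     # sum over j of pi[i, j] = p0[i]
    A_eq[k + i, i::k] = 1.0              # sum over rows of column i = p1[i]
b_eq = np.concatenate([p0, p1])

# support restriction (monotone treatment response): pi[i, j] = 0 whenever Y1 < Y0
bounds = [(0.0, 0.0) if support[j] < support[i] else (0.0, None)
          for i in range(k) for j in range(k)]

lower = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
upper = -linprog(-c, A_eq=A_eq, b_eq=b_eq, bounds=bounds).fun
print(f"P(Y1 - Y0 >= 1) lies in [{lower:.3f}, {upper:.3f}]")
```

Dropping the zero bounds recovers the unrestricted (Makarov-type) bounds, which shows in miniature how the support restriction tightens the identification region.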
In the second chapter, I study partial identification of distributional parameters in nonparametric triangular systems. The model consists of an outcome equation and a selection equation. It allows for general unobserved heterogeneity and selection on unobservables. The distributional parameters that I consider are the marginal distributions of potential outcomes, their joint distribution, and the distribution of treatment effects. I explore different types of plausible restrictions to tighten existing bounds on these parameters. My identification applies to the whole population without a full support condition on instrumental variables and does not rely on parametric specifications or rank similarity. I also provide numerical examples to illustrate the identifying power of each restriction.
The third chapter is joint work with Pierre-Andre Chiappori. In it, we identify the heterogeneous sharing rule in collective models. In such models, agents have their own preferences and make Pareto-efficient decisions. The econometrician can observe the household's (aggregate) demand, but not individual consumption. We consider identification of 'cross-sectional' collective models, in which prices are constant over the sample. We allow for unobserved heterogeneity in the sharing rule and measurement errors in the household demand for each good. We show that nonparametric identification obtains except in particular cases (typically, when some of the individual Engel curves are linear). The existence of two exclusive goods is sufficient to identify the sharing rule, irrespective of the total number of commodities.
|
66 |
Three essays in applied econometrics. Lin, Yanjun. January 2016
This thesis presents three essays in the field of applied econometrics. In the first essay, we use the establishment-level Annual Respondents Database (ARD) data and the sector-level Confederation of British Industry (CBI) Industrial Trends Survey data to identify the key determinants of U.K. manufacturing investment. We first examine the trends in the ARD microdata aggregates, the relative price of investment goods data, and the CBI survey data. Subsequently, we estimate a baseline dynamic error-correction investment model which separates out short-run and long-run investment dynamics. When we introduce additional variables derived from the CBI survey data into the baseline model, the estimation results show that survey variables pertaining to financing constraints and demand uncertainty have negative effects on investment, while the survey variable related to the volume of total new orders has a positive effect on investment. In the second essay, we develop forecasting models for aggregate U.K. manufacturing investment. After assessing the CBI's forecasting record over the recent financial crisis, we conclude that CBI forecasters were slow to realize the severe negative effect of the credit crisis on manufacturing investment. Subsequently, we develop our own baseline error-correction forecasting model, which conditions only on lagged explanatory variables, and apply the general-to-specific modeling approach to simplify the model. However, the selected baseline specification has poor out-of-sample forecast properties over the crisis period. When we include additional CBI survey variables in the baseline model, there is an improvement in out-of-sample forecast performance in most cases. Survey measures of business optimism and expected future demand are found to be particularly useful in this context. Finally, in the third essay, we employ a Threshold Vector Autoregression (TVAR) model to examine the potentially nonlinear impact of fiscal stimulus on output under tight and loose credit supply conditions in the U.S. In our main specification, we choose the excess bond premium as the threshold variable to identify periods of tight and loose credit. The empirical results suggest that government spending increases are more effective at stimulating output than tax cuts, especially when credit conditions are loose.
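The baseline error-correction structure described here can be sketched as a regression of investment growth on short-run dynamics plus a lagged deviation from the long-run relationship, with survey balances entering as extra regressors. This is a generic sketch under those assumptions, not the thesis's estimated specification; the series and names below are simulated and hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical quarterly series: log investment (inv), log output (out) and a
# CBI-style survey balance on demand uncertainty (unc); all simulated.
rng = np.random.default_rng(3)
T = 200
out = np.cumsum(rng.normal(0.005, 0.02, T))
inv = out - 1.5 + rng.normal(0, 0.05, T)            # long-run relation: inv ~ out - 1.5
unc = rng.normal(0, 1, T)

df = pd.DataFrame({
    "d_inv": np.r_[np.nan, np.diff(inv)],
    "d_out": np.r_[np.nan, np.diff(out)],
    "ecm": np.r_[np.nan, (inv - out)[:-1]],          # lagged error-correction term
    "lag_unc": np.r_[np.nan, unc[:-1]],
}).dropna()

# Short-run dynamics plus the error-correction term; survey variables such as
# financing constraints or new orders would enter the same way as lag_unc.
ecm_model = smf.ols("d_inv ~ d_out + ecm + lag_unc", data=df).fit()
print(ecm_model.params)
```

A negative coefficient on the error-correction term is what ties short-run investment growth back to the long-run relationship.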
|
67 |
Determinants of household borrowing in the United States and a panel of OECD countries prior to the financial crisis of 2007/2008. Wildauer, Rafael. January 2017
This thesis investigates which factors drove the spectacular accumulation of household liabilities prior to the financial crisis of 2007/2008 in the United States and 12 other OECD countries. Two mechanisms are of particular interest. The first is the polarization in the distribution of income which can be observed since the 1980s in the United States and European OECD countries (Atkinson et al. 2011). The second is the increase in asset prices, in particular residential real estate prices, which some authors consider a major explanation of household debt accumulation (Mian & Sufi 2011). Based on two different data sources, the Survey of Consumer Finances (SCF) and a macro panel of 13 OECD countries, an extremely robust result emerges: the residential housing market is the key driver of household debt accumulation. This finding strongly supports an asset-focused view of household sector debt and discourages explanations which rely on a direct link from the distribution of income to household indebtedness. There is some evidence that income polarization contributed to higher debt levels in the US, but this positive relationship is conditional on homeownership. The interpretation of these findings is that real estate purchases represent the single most important reason for households to take on debt, and even if households take on debt for other reasons, real estate collateral is often a binding requirement to be granted a loan. The thesis contributes to the existing literature in three ways. First, it is the first attempt to investigate the inequality and asset mechanisms in a unified framework, whereas previous studies analysed them in isolation. Second, by constructing a borrowing measure from SCF data, the unmatched coverage of the top tail of the wealth distribution in this dataset can be exploited. Third, the thesis provides some methodological insights on using survey data to analyse household borrowing behaviour: increased model fit from the separation of borrowing and non-borrowing households; the necessity to separately control for asset purchases; and the disadvantages of growth rates and logarithmic differences.
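One way to read the "separation of borrowing and non-borrowing households" point is as a two-part model: a binary model for whether a household borrows at all, then a regression on the amount borrowed for borrowers only. The sketch below is that generic two-part setup on simulated SCF-style data; the variable names are hypothetical and this is not the thesis's exact borrowing measure or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated SCF-style cross-section; all variable names are hypothetical.
rng = np.random.default_rng(4)
n = 5000
df = pd.DataFrame({
    "log_income": rng.normal(11, 0.8, n),
    "log_house_value": rng.normal(12, 1.0, n) * rng.binomial(1, 0.65, n),  # zero for renters
    "top_income_share": rng.normal(0.33, 0.02, n),       # local income concentration proxy
})
xb = -4 + 0.2 * df["log_income"] + 0.25 * df["log_house_value"]
df["borrows"] = rng.binomial(1, 1 / (1 + np.exp(-xb)))
df["log_new_debt"] = np.where(
    df["borrows"] == 1,
    2 + 0.3 * df["log_income"] + 0.5 * df["log_house_value"] + rng.normal(0, 0.5, n),
    np.nan)

# Step 1: who borrows at all (extensive margin, full sample).
extensive = smf.probit("borrows ~ log_income + log_house_value + top_income_share",
                       data=df).fit()
# Step 2: how much, conditional on borrowing (intensive margin, borrowers only).
intensive = smf.ols("log_new_debt ~ log_income + log_house_value + top_income_share",
                    data=df[df["borrows"] == 1]).fit()
print(extensive.params, intensive.params, sep="\n")
```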
|
68 |
Testing and estimating structural change in misspecified linear models. January 1997
Leung Wai-Kit. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1997. / Includes bibliographical references (leaves 84-89).
Table of contents:
Chapter 1 --- Acknowledgment --- p.6
Chapter I --- Introduction and a Structural Change Model --- p.7
Chapter 2 --- Introduction --- p.7
Chapter 3 --- A Structural Change Model and the Estimated Specification --- p.10
Chapter II --- Behavior of the Model under Stationarity --- p.13
Chapter 4 --- Assumptions for Stationary Regressors and Error --- p.13
Chapter 5 --- Consistency of the Break Point Estimator when Regressors and Error are Stationary and Correlated --- p.14
Chapter 6 --- Limiting Distribution of the Break Point Estimator when Regressors and Error are Stationary and Correlated --- p.19
Chapter 7 --- Sup-Wald Test when Regressors and Error are Stationary and Correlated --- p.21
Chapter III --- Behavior of the Model under Nonstationarity --- p.23
Chapter 8 --- Assumptions for Nonstationary Regressors and I(d) Error --- p.23
Chapter 9 --- Consistency of the Break Point Estimator under Nonstationary Regressors and I(d) Error --- p.26
Chapter 10 --- F Test under Nonstationary Regressors and I(d) Error --- p.31
Chapter IV --- Finite Sample Properties and Conclusion --- p.33
Chapter 11 --- Finite Sample Properties of the Break Point Estimator --- p.33
Chapter 12 --- Conclusion --- p.38
Chapter V --- Appendix and Reference --- p.40
Chapter 13 --- Appendix --- p.40
Chapter 14 --- References --- p.84
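The contents above centre on break point estimation and sup-Wald testing. As a hedged, generic illustration (not the thesis's misspecified-model setting), a least-squares break date can be found by minimizing the split-sample sum of squared residuals over trimmed candidate dates, with the sup-Wald statistic taken as the maximum Wald statistic across those dates; everything below is simulated and illustrative.

```python
import numpy as np

def break_point_and_sup_wald(y, X, trim=0.15):
    """Least-squares break date and sup-Wald statistic for a one-time break
    in the coefficients of y = X @ beta + u (homoskedastic Wald form)."""
    T, k = X.shape
    lo, hi = int(trim * T), int((1 - trim) * T)
    ssrs, walds = [], []
    for tb in range(lo, hi):
        X1, y1, X2, y2 = X[:tb], y[:tb], X[tb:], y[tb:]
        b1, *_ = np.linalg.lstsq(X1, y1, rcond=None)
        b2, *_ = np.linalg.lstsq(X2, y2, rcond=None)
        ssr = np.sum((y1 - X1 @ b1) ** 2) + np.sum((y2 - X2 @ b2) ** 2)
        s2 = ssr / (T - 2 * k)
        V = s2 * (np.linalg.inv(X1.T @ X1) + np.linalg.inv(X2.T @ X2))
        walds.append((b1 - b2) @ np.linalg.solve(V, b1 - b2))
        ssrs.append(ssr)
    tb_hat = lo + int(np.argmin(ssrs))     # least-squares break point estimate
    return tb_hat, max(walds)              # estimated break date and sup-Wald statistic

# simulate a shift in the intercept at t = 120
rng = np.random.default_rng(5)
T = 300
X = np.column_stack([np.ones(T), rng.normal(size=T)])
beta = np.where(np.arange(T)[:, None] < 120, [1.0, 0.5], [2.0, 0.5])
y = np.sum(X * beta, axis=1) + rng.normal(size=T)
print(break_point_and_sup_wald(y, X))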
|
69 |
Three Essays in Econometrics. Tuzcuoglu, Kerem. January 2017
This dissertation contains both theoretical and applied econometric work. The applications are in finance and macroeconomics. Each chapter uses time series techniques to analyze the dynamic characteristics of data. The first chapter is on composite likelihood (CL) estimation, which has gained a lot of attention in the statistics field but is a relatively new technique in the economics literature. I study its asymptotic properties in a complex dynamic nonlinear model and use it to analyze corporate bond ratings. The second chapter explores the importance of global food price fluctuations. In particular, I measure the effects of global food shocks on domestic macroeconomic variables for a large number of countries. The third chapter proposes a method to interpret latent factors in a data-rich environment. In the application, I find five meaningful factors driving the US economy.
In Chapter 1, persistent discrete data are modeled by an autoregressive probit model and estimated by CL estimation. Autocorrelation in the latent variable results in an intractable likelihood function containing high-dimensional integrals. The CL approach offers fast and reliable estimation compared to computationally demanding simulation methods. I provide consistency and asymptotic normality results for the CL estimator and use it to study credit ratings. The ratings are modeled as imperfect measures of the latent and autocorrelated creditworthiness of firms, explained by balance sheet ratios and business cycle variables. The empirical results show evidence of rating assignment according to the through-the-cycle methodology; that is, the ratings do not respond to short-term fluctuations in the financial situation of the firms. Moreover, I show that the ratings become more volatile over time, in particular after the crisis, as a reaction to the regulations and criticism directed at credit rating agencies.
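A toy version of the composite likelihood idea replaces the full (intractable) likelihood with a sum of bivariate pair likelihoods for adjacent observations. The sketch below does this for a binary autoregressive probit with a single regressor; the chapter's model and CL construction may differ, and the data and names are simulated and illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, multivariate_normal

def neg_pairwise_cl(params, y, x):
    """Negative pairwise composite log-likelihood for an autoregressive probit:
    y_t = 1{x_t * beta + e_t > 0}, e_t AR(1) with corr(e_t, e_{t-1}) = rho
    and Var(e_t) normalised to one."""
    beta, rho = params[0], np.tanh(params[1])             # keep |rho| < 1
    m = x * beta                                          # latent index
    cov = np.array([[1.0, rho], [rho, 1.0]])
    ll = 0.0
    for t in range(1, len(y)):
        p11 = multivariate_normal.cdf([m[t - 1], m[t]], mean=[0.0, 0.0], cov=cov)
        p1_prev, p1_curr = norm.cdf(m[t - 1]), norm.cdf(m[t])
        probs = {(1, 1): p11,
                 (1, 0): p1_prev - p11,
                 (0, 1): p1_curr - p11,
                 (0, 0): 1.0 - p1_prev - p1_curr + p11}
        ll += np.log(max(probs[(int(y[t - 1]), int(y[t]))], 1e-12))
    return -ll

# simulate and estimate (kept short: each evaluation calls a bivariate normal cdf per pair)
rng = np.random.default_rng(6)
T, beta0, rho0 = 300, 1.0, 0.6
x = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho0 * e[t - 1] + np.sqrt(1 - rho0 ** 2) * rng.normal()
y = (x * beta0 + e > 0).astype(int)

fit = minimize(neg_pairwise_cl, x0=[0.5, 0.0], args=(y, x), method="Nelder-Mead")
print(fit.x[0], np.tanh(fit.x[1]))        # estimates of beta and rho
```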
Chapter 2, which is joint work with Bilge Erten, explores the sources and effects of global shocks that drive global food prices. We examine this question using a sign-restricted SVAR model and rich data on domestic output and its components for 82 countries from 1980 to 2011. After identifying the relevant demand and supply shocks that explain fluctuations in real food prices, we quantify their dynamic effects on net food-importing and food-exporting economies. We find that global food shocks have contractionary effects on the domestic output of net food importers, and they are transmitted through deteriorating trade balances and declining household consumption. We document expansionary and shorter-lived effects for net food exporters. By contrast, positive global demand shocks that also increase real food prices stimulate the domestic output of both groups of countries. Our results indicate that identifying the source of a shock that affects global food prices is crucial for evaluating its domestic effects. The adverse effects of global food shocks on household consumption are larger for net food importers with relatively high shares of food expenditure in household budgets and those with relatively high food trade deficits as a share of total food trade. Finally, we find that global food and energy shocks jointly explain 8 to 14 percent of the variation in domestic output.
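The identification step of a sign-restricted SVAR can be sketched as drawing random orthogonal rotations of the Cholesky factor of the reduced-form residual covariance and keeping the rotations whose impact responses satisfy the imposed signs. The bivariate system, sign pattern and data below are placeholders for illustration only, not the chapter's actual specification.

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Placeholder two-variable system standing in for (real food price growth, domestic
# output growth) of a net food importer; the data here are purely synthetic.
rng = np.random.default_rng(7)
data = np.diff(rng.normal(size=(401, 2)).cumsum(axis=0), axis=0)

var_res = VAR(data).fit(2)
P = np.linalg.cholesky(np.asarray(var_res.sigma_u))

accepted = []
for _ in range(2000):
    # random orthogonal rotation of the Cholesky factor
    Q, R = np.linalg.qr(rng.normal(size=(2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))
    B = P @ Q                              # candidate impact matrix
    # sign restrictions on impact: a "global food shock" (column 0) raises the
    # real food price and lowers the importer's output
    if B[0, 0] > 0 and B[1, 0] < 0:
        accepted.append(B[:, 0])

print(len(accepted), "accepted rotations; median impact response:",
      np.median(accepted, axis=0))
```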
Chapter 3, which is joint work with Sinem Hacioglu, exploits a data-rich environment to propose a method for interpreting factors to which it is otherwise difficult to assign economic meaning, using a threshold factor-augmented vector autoregression (FAVAR) model. We observe how frequently the factor loadings are induced to zero when they fall below the estimated threshold in order to infer the economic relevance that the factors carry. The results indicate that we can link the factors to particular economic activities, such as real activity and unemployment, without any prior specification of the dataset. By exploiting the flexibility of FAVAR models in structural analysis, we examine impulse response functions of the factors and individual variables to a monetary policy shock. The results suggest that the proposed method provides a useful framework for the interpretation of factors and the associated shock transmission.
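A loose analogue of the loading-thresholding idea can be shown with plain principal components: extract factors, zero out small loadings, and see which variable groups survive for each factor. The chapter estimates the threshold jointly inside the FAVAR, which is not done here; the panel and group labels below are synthetic.

```python
import numpy as np

# Hypothetical data-rich panel: T periods, N indicators tagged by economic category.
rng = np.random.default_rng(8)
T, N = 300, 60
groups = np.repeat(["real_activity", "labour", "prices", "credit"], N // 4)
X = rng.normal(size=(T, N))
X = (X - X.mean(axis=0)) / X.std(axis=0)           # standardise the panel

# principal-component factors and loadings
U, S, Vt = np.linalg.svd(X, full_matrices=False)
n_factors = 5
loadings = Vt[:n_factors].T                        # N x n_factors

# crude analogue of the thresholding idea: zero out small loadings and see which
# variable groups survive for each factor (here the threshold is a fixed quantile).
thresh = np.quantile(np.abs(loadings), 0.75)
for f in range(n_factors):
    active = groups[np.abs(loadings[:, f]) > thresh]
    labels, counts = np.unique(active, return_counts=True)
    print(f"factor {f + 1}:", dict(zip(labels, counts)))
```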
|
70 |
Essays in financial econometrics and forecasting. Smetanina, Ekaterina. January 2018
This dissertation deals with issues of forecasting in financial markets. The first part is motivated by the observation that most parametric volatility models follow Engle's (1982) original idea of modelling the volatility of asset returns as a function of past information only. However, current returns are potentially quite informative for forecasting, yet are excluded from these models. The first and second chapters address this question from both a theoretical and an empirical perspective. The second part deals with the important issue of forecast evaluation and selection in unstable environments, where the existing methodology is known to generate spurious and potentially misleading results. In the third chapter, I develop a new methodology for forecast evaluation and selection in such an environment.

In the first chapter, Real-time GARCH, I propose a new parametric volatility model which retains the simple structure of GARCH models but models the volatility process as a mixture of past and current information, in the spirit of stochastic volatility (SV) models. It therefore provides a link between GARCH and SV models. I show that with this new model I am able to obtain better volatility forecasts than standard GARCH-type models, improve the empirical fit of the data, especially in the tails of the distribution, and make the model faster in its adjustment to a new unconditional level of volatility. Further, the new model offers a much-needed framework for specification testing, as it nests the standard GARCH models. This chapter has been published in the Journal of Financial Econometrics (Smetanina E., 2017, Real-time GARCH, Journal of Financial Econometrics, 15(4), 561-601).

In chapter 2, Asymptotic Inference for Real-time GARCH(1,1) model, I investigate the asymptotic properties of the Gaussian quasi-maximum likelihood estimator (QMLE) for the Real-time GARCH(1,1) model developed in the first chapter. I establish the ergodicity and β-mixing properties of the joint process for squared returns and the volatility process. I also prove strong consistency and asymptotic normality for the parameter vector at the usual √T rate. Finally, I demonstrate how the developed theory can be viewed as a generalisation of the QMLE theory for the standard GARCH(1,1) model.

In chapter 3, Forecast Evaluation Tests in Unstable Environments, I develop a new methodology for forecast evaluation and selection in situations where the relative performance between models changes over time in an unknown fashion. Out-of-sample tests are widely used for evaluating models' forecasts in economics and finance. Underlying these tests is often the assumption of constant relative performance between competing models; however, this is invalid for many practical applications. In a world of changing relative performance, previous methodologies give rise to spurious and potentially misleading results, an example of which is the well-known "splitting point problem". I propose a new two-step methodology designed specifically for forecast evaluation in a world of changing relative performance. In the first step, I estimate the time-varying mean and variance of the series of forecast loss differences, and in the second step I use these estimates to construct new rankings for models in a changing world. I show that the new tests have high power against a variety of fixed and local alternatives.
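A rough sketch of the two-step idea: smooth the forecast loss differentials to obtain a time-varying mean and variance, then compare the models locally over time. The rolling window below stands in for the kernel estimators described in the chapter, and the loss series are simulated with hypothetical names.

```python
import numpy as np
import pandas as pd

# Hypothetical out-of-sample squared-error losses from two competing forecasts,
# where the relative performance drifts over time.
rng = np.random.default_rng(9)
T = 500
drift = np.linspace(-0.5, 0.5, T)                  # model A slowly worsens relative to B
loss_a = (rng.normal(size=T) + drift) ** 2
loss_b = rng.normal(size=T) ** 2
d = loss_a - loss_b                                # forecast loss differential

# Step 1: estimate the time-varying mean and variance of the differential
# with a rolling smoother (a stand-in for the kernel estimators in the chapter).
window = 60
mu_t = pd.Series(d).rolling(window, center=True).mean()
sd_t = pd.Series(d).rolling(window, center=True).std()

# Step 2: rank the models locally; the sign of mu_t gives the time-varying ranking
# and mu_t / (sd_t / sqrt(window)) a rough local t-ratio.
local_t = mu_t / (sd_t / np.sqrt(window))
print(local_t.dropna().describe())
```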
|