  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Business cycles in the Czech Republic: an empirical investigation / Kvantitativní analýza hospodářského cyklu v České republice

Bocák, Petr January 2012 (has links)
The aim of this thesis is to estimate the monthly probability that the Czech economy is in a recession. For this purpose, I construct indexes of coincident and leading variables from multiple time series by maximum likelihood. On average, turns in the leading index precede turns in the coincident index by almost one year at peaks and by about one month at troughs. To assess the probability of recession, I estimate several mixture models for the growth rates of the coincident index, focusing on Markov-switching specifications for the latent business-cycle process. Based on information criteria, the two-state Markov-switching AR(1) is superior to the other models. Lagged values of the leading index further improve the model fit, but the resulting model provides less clear recession signals than models based solely on the coincident index.
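The regime-inference step described above can be illustrated with a minimal Hamilton filter. The sketch below switches only the mean rather than full AR(1) dynamics, and all parameter values and data are hypothetical, not the thesis's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

def npdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def hamilton_filter(y, p00, p11, mu, sigma):
    """Filtered regime probabilities for a two-state Markov-switching
    Gaussian model: y_t | S_t = j ~ N(mu[j], sigma[j])."""
    P = np.array([[p00, 1 - p00], [1 - p11, p11]])         # transition matrix
    pred = np.array([1 - p11, 1 - p00]) / (2 - p00 - p11)  # ergodic start
    probs = np.empty((len(y), 2))
    for t, yt in enumerate(y):
        lik = npdf(yt, mu, sigma) * pred   # prior x likelihood, per regime
        probs[t] = lik / lik.sum()         # filtered P(S_t = j | y_1..t)
        pred = probs[t] @ P                # one-step-ahead prediction
    return probs

# simulate a two-regime series: regime 1 plays the role of "recession"
states, s = [], 0
for _ in range(300):
    stay = 0.95 if s == 0 else 0.90
    s = s if rng.random() < stay else 1 - s
    states.append(s)
states = np.array(states)
y = np.where(states == 0, 0.3, -0.5) + 0.3 * rng.standard_normal(300)

probs = hamilton_filter(y, p00=0.95, p11=0.90,
                        mu=np.array([0.3, -0.5]), sigma=np.array([0.3, 0.3]))
```

In a full application the parameters would themselves be estimated by maximum likelihood rather than fixed, and `probs[:, 1]` would be read as the monthly recession probability.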
22

ESSAYS IN HIGH-DIMENSIONAL ECONOMETRICS

Haiqing Zhao (9174302) 27 July 2020 (has links)
My thesis consists of three chapters. The first chapter uses the factor-augmented error correction model in model averaging for predictive regressions, which delivers significant improvements with large datasets in settings where the individual methods do not. I allow the candidate models to vary in the number of dependent-variable lags, the number of factors, and the cointegration rank. I show that the leave-h-out cross-validation criterion is an asymptotically unbiased estimator of the optimal mean squared forecast error, using either the estimated cointegration vectors or the nonstationary regressors. Empirical results demonstrate that including cointegration relationships significantly improves long-run forecasts of a standard set of macroeconomic variables. I also estimate simulation-based prediction intervals for six real and nominal macroeconomic variables. The results are consistent with the point estimates, further supporting the usefulness of cointegration in long-run forecasting.

The second chapter is a Monte Carlo study comparing the finite-sample performance of six recently proposed estimation methods for large-dimensional regressions with endogeneity. The methods combine shrinkage estimation with two-stage least squares (2SLS) or the generalized method of moments (GMM), where both the number of regressors and the number of instruments can be large. The methods are evaluated in terms of the bias and mean squared error of the estimators. I consider a variety of designs with practically relevant features such as weak instruments and heteroskedasticity, as well as cases where the number of observations is smaller or larger than the number of regressors or instruments. The results show that the methods combining GMM with shrinkage yield smaller estimation errors than those combining 2SLS with shrinkage. Moreover, the results support the use of cross-validation to select tuning parameters when theoretically derived values are unavailable. Lastly, the results indicate that every instrument should correlate with at least one endogenous regressor to ensure consistent estimation.

The third chapter is coauthored with Mohitosh Kejriwal. We present new evidence on the nexus between democracy and growth using the dynamic common correlated effects (DCCE) approach of Chudik and Pesaran (2015), which is robust to both parameter heterogeneity and cross-section dependence. The DCCE results indicate a positive and statistically significant effect of democracy on economic growth, with a point estimate between approximately 1.5% and 2% depending on the specification. We complement our estimates with a battery of diagnostic tests for heterogeneity and cross-section dependence that corroborate the use of the DCCE approach.
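The shrinkage-plus-2SLS idea examined in the second chapter can be sketched in its simplest form: a ridge-penalized first stage followed by an ordinary second stage. This is a generic illustration, not one of the six methods evaluated in the chapter, and the data-generating process and penalty value are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def ridge_2sls(y, X, Z, lam):
    """2SLS with a ridge-penalized first stage, a simple way to handle
    many instruments. lam would normally be chosen by cross-validation."""
    ZtZ = Z.T @ Z + lam * np.eye(Z.shape[1])
    X_hat = Z @ np.linalg.solve(ZtZ, Z.T @ X)             # shrunk first stage
    return np.linalg.solve(X_hat.T @ X_hat, X_hat.T @ y)  # second stage

n, k = 500, 30                           # many instruments, only one strong
Z = rng.standard_normal((n, k))
u = rng.standard_normal(n)               # structural error
v = 0.8 * u + rng.standard_normal(n)     # first-stage error, correlated with u
x = Z[:, 0] + v                          # endogenous regressor
y = 2.0 * x + u                          # true coefficient is 2

beta_ols = (x @ y) / (x @ x)             # biased upward by endogeneity
beta_iv = ridge_2sls(y, x[:, None], Z, lam=10.0)[0]
print(beta_ols, beta_iv)
```

OLS drifts well above the true value of 2 because the regressor correlates with the structural error, while the instrumented estimate stays close to it.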
23

Duration-Weighted Carbon Footprint Metrics and Carbon Risk Factor for Credit Portfolios / Durationsviktade mått av koldioxidavtryck och riskfaktorer för obligationsportföljer

Hendey Bröte, Erik January 2020 (has links)
Current standard carbon footprint metrics attribute responsibility for a firm's greenhouse-gas (GHG) emitting activities equally between an entity's equity and debt. This study introduces a set of novel duration-weighted metrics that take into consideration the length of the financing provided. These measures show promise for reporting the footprints of debt portfolios, but their methodological robustness should be studied further before they are adopted widely. The measures are also attractive from a risk perspective: they depend linearly on duration and are therefore sensitive to yields. A factor portfolio is constructed using the new carbon intensity measure, and corporate yields are studied in a linear factor model. The other factors derive from Nelson-Siegel parameterizations of US Treasury rates and the USD swap-spread curve. Following the Fama-MacBeth procedure, the carbon factor is not found to be significant over the 10-year sample period.
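The abstract does not spell out the duration-weighting formula, so the following is only a hypothetical illustration of the idea: scale a standard enterprise-value attribution of issuer emissions by each bond's duration relative to a reference tenor. All portfolio figures are invented:

```python
import numpy as np

# Hypothetical portfolio of three corporate bonds. The attribution rule
# sketched here is an illustration of the thesis idea, not its exact formula.
holding_value = np.array([1.0e6, 2.0e6, 0.5e6])     # market value held (USD)
evic = np.array([50e6, 120e6, 30e6])                # issuer enterprise value
emissions = np.array([8_000.0, 25_000.0, 3_000.0])  # issuer tCO2e per year
duration = np.array([2.5, 7.0, 4.0])                # modified duration (years)

# standard footprint: ownership share of issuer emissions
standard = (holding_value / evic) * emissions
# duration-weighted: scale by duration relative to a reference tenor
ref_tenor = 5.0
duration_weighted = standard * (duration / ref_tenor)

portfolio_musd = holding_value.sum() / 1e6
print("standard intensity (tCO2e/MUSD):", standard.sum() / portfolio_musd)
print("duration-weighted (tCO2e/MUSD): ", duration_weighted.sum() / portfolio_musd)
```

In this invented example the long-duration holding dominates, so the duration-weighted footprint exceeds the standard one; a portfolio of short-dated paper would show the opposite.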
24

Statistical arbitrage: Factor investing approach

Akyildirim, Erdinc, Goncu, A., Hekimoglu, A., Nguyen, D.K., Sensoy, A. 26 September 2023 (has links)
We introduce a continuous-time model for stock prices in a general factor representation with the noise driven by a geometric Brownian motion process. We derive the theoretical hitting-probability distribution for long-until-barrier strategies and the conditions for statistical arbitrage. We optimize our statistical arbitrage strategies with respect to expected discounted returns and the Sharpe ratio. Bootstrapping results show that the theoretical hitting-probability distribution is a realistic representation of the empirical hitting probabilities. We test the empirical performance of the long-until-barrier strategies using US equities and demonstrate that our trading rules can generate statistical arbitrage profits.
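The hitting probabilities central to long-until-barrier strategies can be illustrated in the simplest case. The snippet below computes the textbook reflection-principle probability that a plain geometric Brownian motion reaches an upper barrier before time T, then checks it by Monte Carlo; this is not the paper's factor-model distribution, and all parameters are hypothetical:

```python
import math
import numpy as np

def gbm_upper_hit_prob(s0, barrier, mu, sigma, T):
    """P(max_{t<=T} S_t >= barrier) for a geometric Brownian motion,
    from the reflection principle (a standard result)."""
    b = math.log(barrier / s0)          # log-distance to the barrier (> 0)
    nu = mu - 0.5 * sigma ** 2          # drift of log S
    srt = sigma * math.sqrt(T)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return (Phi((nu * T - b) / srt)
            + math.exp(2 * nu * b / sigma ** 2) * Phi((-b - nu * T) / srt))

# Monte Carlo check with hypothetical parameters
rng = np.random.default_rng(2)
s0, barrier, mu, sigma, T = 100.0, 110.0, 0.05, 0.2, 1.0
n_paths, n_steps = 20_000, 2_000
dt = T / n_steps
b = math.log(barrier / s0)
log_s = np.zeros(n_paths)
crossed = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    log_s += (mu - 0.5 * sigma ** 2) * dt \
             + sigma * math.sqrt(dt) * rng.standard_normal(n_paths)
    crossed |= log_s >= b
hit = crossed.mean()
print(gbm_upper_hit_prob(s0, barrier, mu, sigma, T), hit)
```

The discretely monitored simulation slightly understates the continuous-time probability, but the two agree closely at this step size.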
25

Construction et estimation de copules en grande dimension / Construction and estimation of high-dimensional copulas

Mazo, Gildas 17 November 2014 (has links)
In recent decades, copulas have become increasingly popular in statistical modeling. Their popularity owes much to the fact that they separate the analysis of the margins from the analysis of the dependence structure induced by the underlying distribution. This eases the modeling of non-Gaussian distributions and, in particular, makes it possible to account for nonlinear dependencies between random variables. Finance and hydrology are two examples of scientific fields where the use of copulas is nowadays standard. However, while many bivariate families exist in the literature, multivariate and high-dimensional copulas are much more difficult to construct, and their construction remains an open problem. This thesis presents three contributions to copula modeling and inference, with an emphasis on high-dimensional problems. The first model is written as a product of bivariate copulas and is underlain by a tree structure in which each edge represents a bivariate copula; hence, different pairs can be modeled with different dependence properties. The second is a factor model built on a nonparametric class of bivariate copulas, which strikes a good balance between tractability and flexibility. The thesis also addresses the parametric inference of copula models in general: the asymptotic properties of a weighted least-squares estimator based on dependence coefficients are established. The models and methods are applied to hydrological data (flow rates and rainfall).
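The separation of margins from dependence can be sketched with the simplest parametric example, a bivariate Gaussian copula paired with exponential margins. This generic illustration (with invented parameters) is not the tree or factor construction proposed in the thesis:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def sample_gaussian_copula(n, rho):
    """Draw n pairs (u, v) on [0,1]^2 with Gaussian-copula dependence rho."""
    z1 = rng.standard_normal(n)
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    Phi = np.vectorize(lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return Phi(z1), Phi(z2)

u, v = sample_gaussian_copula(50_000, rho=0.7)
# attach non-Gaussian margins via inverse CDFs, e.g. exponential intensities
x = -np.log1p(-u)          # Exponential(rate 1)
y = -np.log1p(-v) * 2.0    # Exponential(rate 1/2)
# the dependence structure survives the marginal transforms:
print(np.corrcoef(u, v)[0, 1])   # close to (6/pi)*arcsin(rho/2)
```

The margins can be swapped for any distributions without touching the dependence, which is exactly the modeling freedom the thesis exploits.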
26

Comparação de métodos de estimação de modelos de apreçamento de ativos / Comparison of methods for estimation of asset pricing models

Silva Neto, Aníbal Emiliano da 14 August 2012 (has links)
The aim of this project is to compare methods of estimating asset pricing models. In addition to traditional methods, which estimate the model's parameters using the entire sample at once, the rolling method is used, which re-estimates the parameters over moving windows of fixed size. Using backtesting, we investigate whether the rolling approach improves the goodness of fit of asset pricing models.
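The full-sample versus rolling-window contrast can be sketched as follows. The CAPM-style regression and all numbers are hypothetical, chosen so that the true beta shifts mid-sample and only the rolling estimates track the shift:

```python
import numpy as np

rng = np.random.default_rng(4)

def rolling_beta(asset, market, window):
    """Re-estimate the slope on each fixed-size moving window."""
    betas = []
    for start in range(len(asset) - window + 1):
        a = asset[start:start + window]
        m = market[start:start + window]
        mc = m - m.mean()
        betas.append((mc @ (a - a.mean())) / (mc @ mc))
    return np.array(betas)

T = 600
market = 0.01 * rng.standard_normal(T)
true_beta = np.where(np.arange(T) < 300, 0.8, 1.4)   # beta shifts mid-sample
asset = true_beta * market + 0.005 * rng.standard_normal(T)

full_sample = np.polyfit(market, asset, 1)[0]   # one beta for the whole sample
betas = rolling_beta(asset, market, window=120)
print(full_sample, betas[0], betas[-1])
```

The full-sample estimate lands between the two regimes, while the early and late windows recover 0.8 and 1.4 respectively, which is the kind of fit improvement the rolling method is meant to capture.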
27

Essays on Forecasting Methods and Monetary Policy Evaluation

López Buenache, Germán 27 July 2015 (has links)
No description available.
28

[en] A HIERARCHICAL FACTOR MODEL FOR THE JOINT PREDICTION OF CORPORATE BOND YIELDS / [pt] MODELO HIERÁRQUICO DE FATORES PARA A PREVISÃO CONJUNTA DAS ESTRUTURAS A TERMO DAS TAXAS DE JUROS DE CORPORATE BONDS

URSULLA MONTEIRO DA SILVA BELLOTE MACHADO 17 May 2012 (has links)
[en] This dissertation constructs an integrated model for forecasting the interest rate term structure of American corporate bonds across different risk levels. The methodology is based on Nelson and Siegel (1987), with the extensions proposed by Diebold and Li (2006) and Diebold, Li and Yue (2008). We model the term structure for 14 risk levels and jointly estimate the level and slope latent factors that drive interest rate dynamics. These factors are then used to estimate two super factors, which in turn drive the path of each factor; this is our main innovation. The yield curve forecast is obtained from forecasts of the super factors, modeled as autoregressive processes as suggested by Diebold and Li (2006). From the forecast super factors we reconstruct the latent factors and, from them, the yields themselves. We evaluate the model's out-of-sample forecasts against the random walk, the benchmark in this literature. The proposed model shows no significant gains over the benchmark, especially at the one-month-ahead horizon. Results improve as the forecast horizon grows, but the model is still unable to beat the random walk.
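The Nelson-Siegel building block used above can be sketched directly. Following Diebold and Li (2006), the decay parameter is fixed and the level, slope, and curvature factors are estimated by least squares; the yields below are synthetic, not the corporate bond data of the dissertation:

```python
import numpy as np

def ns_loadings(tau, lam=0.0609):
    """Nelson-Siegel factor loadings; lam = 0.0609 is Diebold and Li's
    choice when maturities tau are in months."""
    x = lam * tau
    slope = (1 - np.exp(-x)) / x
    return np.column_stack([np.ones_like(tau),        # level
                            slope,                    # slope
                            slope - np.exp(-x)])      # curvature

def fit_ns(tau, yields, lam=0.0609):
    """Estimate level/slope/curvature by OLS for a fixed lam."""
    L = ns_loadings(tau, lam)
    beta, *_ = np.linalg.lstsq(L, yields, rcond=None)
    return beta

# hypothetical curve: maturities in months, yields in percent
tau = np.array([3, 6, 12, 24, 36, 60, 84, 120], dtype=float)
true_beta = np.array([5.0, -2.0, 1.0])     # level, slope, curvature
yields = ns_loadings(tau) @ true_beta
print(fit_ns(tau, yields))                 # recovers the three factors
```

In the dissertation's setup this fit is repeated per date and risk level, and the resulting factor time series feed the super-factor model.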
29

Essays in Financial Economics and Econometrics

Bates, Brandon January 2011 (has links)
In the first essay, I study the power of predictive regressions in a world of forecastable returns and find it to be quite poor. Using a simple model, I investigate the properties of short- and long-horizon regressions. The mechanisms biasing coefficients in short-horizon regressions differ from those affecting longer horizons. Further, I demonstrate that R² values are biased and give an estimable bias correction. A calibration exercise shows that sample lengths will be insufficient to determine what predicts asset returns until beyond the year 2100. The problem is not isolated to highly persistent predictors; even modestly persistent predictors cause difficulties. Further, long-horizon regressions have inferior power relative to their single-period counterparts. These results present a predicament: if return predictability exists, our ability to identify its source using predictive regressions alone is exceedingly poor. The second essay, written with James Stock and Mark Watson, considers the estimation of approximate dynamic factor models when there is temporal instability in the factors, factor loadings, and errors. We demonstrate that estimators of the factors, and of the number of factors, remain consistent for their population values even under these instabilities. Further, we characterize the inferential theory in our framework for the estimated factors and for the diffusion-index forecasts and factor-augmented vector autoregressions that use them. These results illustrate the broad robustness of factor models to temporal instability. In the third essay, Peter Tufano and I consider the complex accounting rules, explicit fund-sponsor supports, and government actions that grant US money market mutual fund (MMMF) investors an implicit put option allowing them to redeem their shares at a fixed price of $1.00, regardless of the portfolio's market value. We describe the institutional features that generate these options, identify their writers, and estimate their premia. Using a hypothetical MMMF, we find that non-redeeming shareholders, fund sponsors, and the government currently bear annual premia of 22 to 44 basis points to give MMMF shareholders the right to redeem their shares at $1.00 rather than at the market value of the fund portfolio. These premia rose dramatically during the financial crisis, with the put value potentially exceeding 50 basis points.
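The small-sample bias in short-horizon predictive regressions that the first essay builds on can be reproduced with a standard Stambaugh-type simulation: a persistent predictor whose innovations correlate with returns yields a positive average slope even when the true coefficient is zero. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def predictive_bias(T=100, rho=0.95, corr=-0.9, n_sims=2000):
    """Average OLS slope from r_{t+1} = beta * x_t + u when the true beta
    is zero but the AR(1) predictor's shocks correlate with returns."""
    betas = np.empty(n_sims)
    for i in range(n_sims):
        v = rng.standard_normal(T + 1)                      # predictor shocks
        u = corr * v + np.sqrt(1 - corr ** 2) * rng.standard_normal(T + 1)
        x = np.empty(T + 1)
        x[0] = 0.0
        for t in range(T):
            x[t + 1] = rho * x[t] + v[t + 1]                # persistent predictor
        r = u[1:]                                           # true beta is zero
        xc = x[:-1] - x[:-1].mean()
        betas[i] = (xc @ (r - r.mean())) / (xc @ xc)
    return betas.mean()

bias = predictive_bias()
print(bias)   # noticeably above zero despite beta = 0
```

The upward drift (roughly `corr * (1 + 3*rho) / T` by the usual approximation) shrinks only slowly with sample size, which is the mechanism behind the essay's pessimistic calibration.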
30

Is the Accruals Anomaly More Persistent in Firms With Weak Internal Controls?

Kapur, Kanishk 01 January 2018 (has links)
In 1996, Sloan identified the accruals anomaly, in which the negative relationship between the accruals component of current earnings and subsequent stock returns can be exploited to generate excess returns. One would expect the anomaly to dissipate and ultimately disappear as investors exploit the now-public information. However, more than two decades later, it persists as one of the most prominent and contentious anomalies, and the magnitude of its excess returns remains controversial. A main reason for its persistence is that extreme-accrual firms possess characteristics that are unappealing to most investors, including insufficient analyst coverage, high idiosyncratic volatility, and institutional constraints; these characteristics are generally more pronounced in firms with weak internal controls. This paper finds that the accruals anomaly persists at a higher magnitude in firms with weak internal controls. The higher excess returns survive the Fama-French five-factor (2015), Stambaugh-Yuan four-factor (2017), and Hou, Xue, and Zhang (2015) q-factor models.
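Anomaly studies of this kind rest on decile portfolio sorts: rank firms on accruals, then go long the lowest decile and short the highest. The sketch below runs the sort on simulated data in which a negative accruals-return relation is built in by construction; the magnitudes are invented, not estimates from the paper:

```python
import numpy as np

rng = np.random.default_rng(6)

def decile_long_short(signal, returns):
    """Average return of a portfolio long the lowest-signal decile and
    short the highest (the standard anomaly test design)."""
    cuts = np.quantile(signal, np.linspace(0.1, 0.9, 9))
    deciles = np.searchsorted(cuts, signal)      # 0 = lowest, 9 = highest
    return returns[deciles == 0].mean() - returns[deciles == 9].mean()

n = 5000
accruals = rng.standard_normal(n)
# build in a negative accruals-return relation of 1% per standard deviation
returns = -0.01 * accruals + 0.05 * rng.standard_normal(n)

spread = decile_long_short(accruals, returns)
print(spread)   # positive long-short spread by construction
```

In the paper's design the same sort would be run separately within weak- and strong-internal-control subsamples, and the spreads regressed on the factor models listed above.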
