41

The effects of economic variables in the UK stock market

Leone, Vitor January 2006 (has links)
This thesis examines the links between economic time-series innovations and statistical risk factors in the UK stock market using principal components analysis (PCA) and the general-to-specific (Gets) approach to econometric modelling. A multi-factor risk structure for the UK stock market is assumed, and it is found that the use of economic 'news' (innovations), PCA, the Gets approach, and different stock grouping criteria helps to explain the relationships between stock returns and economic variables. The Kalman Filter appears to be more appropriate than first-differencing or ARIMA modelling as a technique for estimating innovations when applying the Gets approach. Different combinations of economic variables appear to underpin the risk structure of stock returns for different sub-samples. Indications of a possible influence of firm size are found in principal components when different stock sorting criteria are used, but more definite conclusions require simultaneous sorting by market value and beta. Overall it appears that the major factor affecting the identification of specific explanatory economic variables across different sub-samples is the general economic context of investment. The influence of firm size on stock returns seems in particular to be highly sensitive to the wider economic context. There is an apparent instability in the economic underpinnings of the risk structure of stock returns (as measured by principal components) that might also be a result of changing economic conditions.
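The factor-extraction step described above can be sketched in a few lines. This is an illustrative simulation, not the author's data or code: a panel of stock returns loads on a single simulated economic "news" series, PCA recovers a statistical risk factor, and a regression links it back to the innovation (a Gets search would start from many candidate innovations and prune the insignificant ones).

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 25                      # months x stocks (simulated)
news = rng.normal(size=T)           # simulated economic innovation series
loadings = rng.normal(1.0, 0.3, N)  # common exposure to the news factor
returns = np.outer(news, loadings) + rng.normal(scale=2.0, size=(T, N))

# PCA via SVD of the demeaned return panel
X = returns - returns.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = U[:, 0] * S[0]                # first principal component (scores)
explained = S[0]**2 / (S**2).sum()  # variance share of PC1

# Regress PC1 on the news series; sign of a principal component is
# arbitrary, so we look at the absolute correlation
beta = np.linalg.lstsq(np.column_stack([np.ones(T), news]), pc1, rcond=None)[0]
corr = abs(np.corrcoef(pc1, news)[0, 1])
print(round(explained, 2), round(corr, 2))
```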
42

[en] FORECASTING LARGE REALIZED COVARIANCE MATRICES: THE BENEFITS OF FACTOR MODELS AND SHRINKAGE / [pt] PREVISÃO DE MATRIZES DE COVARIÂNCIA REALIZADA DE ALTA DIMENSÃO: OS BENEFÍCIOS DE MODELOS DE FATORES E SHRINKAGE

DIEGO SIEBRA DE BRITO 19 September 2018 (has links)
[en] We propose a model to forecast very large realized covariance matrices of returns, applying it to the constituents of the S&P 500 on a daily basis. To deal with the curse of dimensionality, we decompose the return covariance matrix using standard firm-level factors (e.g. size, value, profitability) and use sectoral restrictions in the residual covariance matrix. This restricted model is then estimated using Vector Heterogeneous Autoregressive (VHAR) models estimated with the Least Absolute Shrinkage and Selection Operator (LASSO). Our methodology improves forecasting precision relative to standard benchmarks and leads to better estimates of minimum-variance portfolios.
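The HAR structure underlying the VHAR forecasts can be illustrated on a single simulated series. This is a hedged sketch, not the thesis's implementation: the thesis shrinks a high-dimensional vector HAR over factor-decomposed covariance matrices with LASSO, while here a scalar HAR is fit by plain OLS to a simulated log realized variance series.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 600
# Simulated log realized variance with HAR-type persistence
rv = np.zeros(T)
for t in range(1, T):
    rv[t] = 0.2 + 0.5 * rv[t - 1] + rng.normal(scale=0.3)

# HAR design: constant, daily lag, weekly (5-day) and monthly (22-day) averages
rows = []
for t in range(22, T):
    rows.append([1.0, rv[t - 1], rv[t - 5:t].mean(), rv[t - 22:t].mean()])
X = np.array(rows)
y = rv[22:]

# Plain OLS here; the thesis applies LASSO to the vectorized system instead
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
r2 = 1 - (resid**2).sum() / ((y - y.mean())**2).sum()
```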
43

Detecting non-linearities in the returns of Brazilian hedge funds (fundos multimercados)

Szklo, Renato Salem 24 May 2007 (has links)
Recent literature shows that several strategies used by hedge funds generate non-linear returns. Following the methodology proposed by Agarwal and Naik (2004), this thesis shows that a number of Brazilian hedge funds present returns similar to those of put and call option strategies on the Bovespa index. Using a factor model, we introduce an index based on option returns and show that this specific variable explains the non-linearity of hedge fund returns better than the traditional risk factors.
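The option-based factor idea can be sketched with simulated data. Everything below is illustrative (index returns, strike, and fund exposures are invented): a fund that implicitly writes index puts is regressed with and without a put-payoff factor, and the option factor captures the non-linearity that a linear market beta misses.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000
mkt = rng.normal(0, 0.05, T)                 # simulated index returns
strike = -0.05
put_factor = np.maximum(strike - mkt, 0.0)   # put payoff: pays off in crashes
# A fund that resembles a put writer: earns a premium, loses in crashes
fund = 0.01 + 0.3 * mkt - 2.0 * put_factor + rng.normal(0, 0.005, T)

def r2(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    return 1 - (e**2).sum() / ((y - y.mean())**2).sum()

ones = np.ones(T)
r2_linear = r2(np.column_stack([ones, mkt]), fund)               # market only
r2_option = r2(np.column_stack([ones, mkt, put_factor]), fund)   # + option factor
```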
44

Nowcasting Brazilian GDP

Mattos, Pedro Montero 16 August 2017 (has links)
Based on recent surveys of nowcasting methods, we apply one-step estimation of dynamic factor models to the Brazilian case. This methodology copes well with mixed-frequency series, ragged edges, timeliness, and the high dimensionality of the data set. We use the daily GDP expectations published by the Brazilian Central Bank as a benchmark for our model, and we do not find enough evidence to reject the hypothesis that both models have equal predictive accuracy under non-distressed circumstances.
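A stripped-down, two-step version of factor-model nowcasting (PCA on a monthly panel plus a quarterly bridge regression) is sketched below. The thesis itself uses one-step state-space estimation, which additionally handles ragged edges and missing observations; all series here are simulated.

```python
import numpy as np

rng = np.random.default_rng(3)
months = 240
f = np.zeros(months)
for t in range(1, months):                  # persistent common factor
    f[t] = 0.8 * f[t - 1] + rng.normal()
# Monthly indicator panel loading on the common factor, plus noise
panel = np.outer(f, rng.normal(1, 0.2, 12)) + rng.normal(size=(months, 12))
# Quarterly GDP growth driven by the quarter-average of the factor
fq = f.reshape(-1, 3).mean(axis=1)
gdp = 0.5 + 1.0 * fq + rng.normal(scale=0.3, size=len(fq))

# Step 1: estimate the factor by PCA on the standardized monthly panel
Z = (panel - panel.mean(0)) / panel.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
fhat = U[:, 0] * S[0]
fhat_q = fhat.reshape(-1, 3).mean(axis=1)   # aggregate to quarters

# Step 2: bridge regression of GDP on the quarterly-aggregated factor
Xq = np.column_stack([np.ones(len(fq)), fhat_q])
b = np.linalg.lstsq(Xq, gdp, rcond=None)[0]
nowcast = Xq @ b
r2 = 1 - ((gdp - nowcast)**2).sum() / ((gdp - gdp.mean())**2).sum()
```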
45

[en] FORECASTING IN HIGH-DIMENSION: INFLATION AND OTHER ECONOMIC VARIABLES / [pt] PREVISÃO EM ALTA DIMENSÃO: INFLAÇÃO E OUTRAS VARIÁVEIS ECONÔMICAS

GABRIEL FILIPE RODRIGUES VASCONCELOS 26 September 2018 (has links)
[en] This thesis consists of four articles and an R package, all focused on forecasting economic variables in high dimension. The first article shows that LASSO models are very accurate for forecasting Brazilian inflation at short horizons. The second article uses several Machine Learning models to forecast a set of US macroeconomic variables; the results show that a small adaptation of the LASSO improves the forecasts, but at a high computational cost. The third article also concerns forecasting Brazilian inflation, but in real time; the main results show that a combination of Machine Learning models is more accurate than the specialist (FOCUS) forecasts. Finally, the last article forecasts US inflation using a very large set of models. The winning model is the Random Forest, which raises the question of nonlinearity in US inflation. The results show that both nonlinearity and variable selection are important for the Random Forest's performance.
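The role of LASSO variable selection in this kind of forecasting exercise can be sketched with simulated data: inflation depends on only a few of many candidate predictors, and an L1 penalty (implemented here from scratch via proximal gradient with soft-thresholding) recovers a sparse model. Data, names, and tuning values are illustrative, not the thesis's.

```python
import numpy as np

rng = np.random.default_rng(4)
T, K = 300, 50                          # periods x candidate predictors
X = rng.normal(size=(T, K))
# Sparse truth: only the first three predictors matter
beta_true = np.zeros(K)
beta_true[:3] = [0.8, -0.5, 0.3]
y = X @ beta_true + rng.normal(scale=0.5, size=T)

# LASSO via ISTA: gradient step followed by soft-thresholding
lam = 40.0
eta = 1.0 / np.linalg.eigvalsh(X.T @ X).max()   # step size = 1/Lipschitz
b = np.zeros(K)
for _ in range(3000):
    z = b - eta * (X.T @ (X @ b - y))
    b = np.sign(z) * np.maximum(np.abs(z) - eta * lam, 0.0)

selected = np.flatnonzero(b)            # predictors kept by the L1 penalty
```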
46

[en] CONTRIBUTIONS TO THE ECONOMETRICS OF COUNTERFACTUAL ANALYSIS / [pt] CONTRIBUIÇÕES PARA A ECONOMETRIA DE ANÁLISE CONTRAFACTUAL

RICARDO PEREIRA MASINI 10 July 2017 (has links)
[en] This thesis is composed of three chapters concerning the econometrics of counterfactual analysis. In the first, we propose a new, flexible, and easy-to-implement methodology to estimate the causal effects of an intervention on a single treated unit when no control group is readily available, which we call the Artificial Counterfactual (ArCo). We propose a two-stage approach: in the first stage, a counterfactual is estimated from a large-dimensional set of variables from a pool of untreated units using shrinkage methods such as the Least Absolute Shrinkage and Selection Operator (LASSO). In the second stage, we estimate the average intervention effect on a vector of variables with an estimator that is consistent and asymptotically normal. Moreover, our results are valid uniformly over a wide class of probability laws. As an empirical illustration of the proposed methodology, we evaluate the effects on inflation of an anti-tax-evasion program. In the second chapter, we investigate the consequences of applying counterfactual analysis when the data are formed by integrated processes of order one. We find that without a cointegrating relation (the spurious case) the intervention estimator diverges, resulting in rejection of the hypothesis of no intervention effect regardless of whether one exists. When at least one cointegrating relation exists, we obtain a root-T-consistent estimator for the intervention effect, albeit with a non-standard limiting distribution. As a final recommendation, we suggest working in first differences to avoid spurious results whenever integrated processes are a possibility. Finally, in the last chapter we extend the ArCo methodology to the estimation of conditional quantile counterfactuals. We derive an asymptotically normal test statistic for the quantile intervention effect, including a distributional test. The procedure is then applied in an empirical exercise investigating the effects on stock returns of a change in corporate governance regime.
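A minimal numerical sketch of the two-stage ArCo idea, on invented data: the treated unit is projected on untreated peers over the pre-intervention window (the thesis uses LASSO in this stage; plain least squares suffices for this toy example), and the average post-intervention gap between the actual and projected series estimates the effect.

```python
import numpy as np

rng = np.random.default_rng(5)
T, T0, n = 120, 80, 8                # periods, intervention date, untreated units
common = rng.normal(size=T)          # common driver shared by all units
peers = np.outer(common, rng.normal(1, 0.3, n)) + rng.normal(scale=0.5, size=(T, n))
treated = 0.5 * common + rng.normal(scale=0.3, size=T)
treated[T0:] += 1.0                  # true intervention effect = 1.0

# Stage 1: fit the treated unit on the peers over the pre-intervention window
Xpre = np.column_stack([np.ones(T0), peers[:T0]])
b = np.linalg.lstsq(Xpre, treated[:T0], rcond=None)[0]

# Stage 2: project the counterfactual forward and average the post-period gap
Xpost = np.column_stack([np.ones(T - T0), peers[T0:]])
effect = (treated[T0:] - Xpost @ b).mean()
```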
47

Essays on Monetary Policy, Low Inflation and the Business Cycle

Conti, Antoniomaria 16 November 2017 (has links)
The last ten years have been extremely challenging for both researchers in monetary economics and policymakers. The Global Financial Crisis of 2007-2009, in spite of its size and severity, was initially widely perceived in the Euro Area (EA) as an imported and transitory crisis: it was frequently predicted that the EA economy would recover once the US and the world economy rebounded. Instead, after a brief recovery, the Euro Area was hit by the Sovereign Debt Crisis of 2011-12, a domestic crisis which widened the existing divide between core and peripheral countries to the point of threatening a break-up of the euro. Thanks to the bold monetary policy response of the ECB this fear gradually vanished, but the sudden fall in oil prices and the uncertain economic outlook led to the low-inflation period, particularly severe in the EA, in which inflation, in terms of both headline and core measures, is well below the ECB target of 2%. This prompted the ECB to launch its Quantitative Easing program at the beginning of 2015, much later than the FED's response to the 2007-09 crisis.

This dissertation consists of two different but interlinked parts, which contribute to the empirical literature on monetary policy, low inflation and the business cycle. The first part, composed of Chapters I and II, is devoted to analysing the EA economy, both before the Global Financial Crisis and during the most recent low-inflation period. The second, composed of Chapters III and IV, focuses on the US economy to evaluate the possible negative consequences of the extraordinary monetary stimulus undertaken by the FED. In particular, we study the risks for both price and financial stability of the so-called lift-off, i.e. the gradual normalization of the monetary stance.

In the first Chapter, we provide novel evidence on the different effects of the ECB's common monetary policy on euro-area core and peripheral countries even before the eruption of the crisis. We estimate a structural dynamic factor model on a large panel of Euro Area quarterly variables to take into account both the comovement and the heterogeneity in the EA business cycle, and we then simulate the model to investigate the possible existence of asymmetric effects of ECB monetary policy on member states' economies. The data stop before the eruption of the Global Financial Crisis so as to assess only conventional monetary shocks, which are identified by means of sign restrictions. Although the introduction of the euro has pushed the monetary transmission mechanism in the individual countries towards a more homogeneous response, we find that differences remain between North and South Europe in terms of prices and unemployment. These results are the consequence of country-specific structures, rather than of European Central Bank policies.

In the second Chapter we use a Bayesian VAR model to analyse the transmission of global and domestic shocks in the euro area, with a particular focus on the drivers of inflation, especially in the recent period labelled as low inflation. We identify several shocks by means of sign restrictions, and we account for the role of ECB unconventional monetary policies by using a shadow interest rate. We document that the recent low-inflation phase was not entirely attributable to falling oil prices, but also to slack in economic activity and to insufficiently expansionary monetary policy, owing to the Zero Lower Bound on interest rates. Interestingly, we show that the launch of the ECB's Quantitative Easing made the monetary stance more accommodative, preventing deflationary outcomes.

In the third Chapter we provide an empirical evaluation of the existence of a "dark side" of monetary policy, i.e. the possibility that credit spreads abruptly rise following a monetary tightening, after being compressed by an extraordinary period of monetary easing. This would create a problematic trade-off for the central bank, as temporary monetary expansions might at once stimulate the economy and sow the seeds of abrupt and costly financial market corrections in the future, in terms of risks for financial stability (Stein, 2014). We investigate this possibility using US data, exploiting non-linear methods to examine the propagation of monetary shocks through US corporate bond markets. Across different methodologies, we find that the transmission of monetary shocks is mostly symmetric. What is asymmetric is instead the impact of macroeconomic data releases: spreads respond more to bad news. Crucially, these responses anticipate economic slowdowns rather than causing them directly. However, the empirical evidence points to the possibility of larger effects of expansionary monetary shocks depending on (i) the type of non-linear estimation technique, (ii) the identification of the shock, and (iii) the inclusion of unconventional measures in the analysis.

Finally, in the fourth Chapter, we ask whether the FED has riskily delayed the exit from its large monetary easing, increasing the probability of a future inflationary burst. We do so by means of medium- and large-scale Bayesian VARs, which we use both for structural analysis, i.e. the evaluation of monetary policy shocks, and for forecasting, i.e. the running of counterfactual and scenario analyses. We show that expansionary monetary policy did not trigger a large deviation of inflation from its steady state. Furthermore, the FED's monetary stance is fully in line with concurrent macroeconomic dynamics. Lastly, our model predicts that US core inflation will lie well below its 2% target in 2017, a finding only recently acknowledged by the FOMC projections.
Doctorat en Sciences économiques et de gestion
48

Structural models for macroeconomics and forecasting

De Antonio Liedo, David 03 May 2010 (has links)
This thesis is composed of three independent papers that investigate central debates in empirical macroeconomic modeling.

Chapter 1, entitled "A Model for Real-Time Data Assessment with an Application to GDP Growth Rates", provides a model for the data revisions of macroeconomic variables that distinguishes between rational expectation updates and noise corrections. The model thus encompasses the two polar views regarding the publication process of statistical agencies: noise versus news. Most of the previous studies that analyze data revisions are based on the classical noise-and-news regression approach introduced by Mankiw, Runkle and Shapiro (1984). The problem is that the statistical tests available do not formulate both extreme hypotheses as collectively exhaustive, as recognized by Aruoba (2008); that is, it would be possible to reject or accept both of them simultaneously. In turn, the model for the DPP presented here allows for the simultaneous presence of both noise and news. While the "regression approach" followed by Faust et al. (2005), along the lines of Mankiw et al. (1984), identifies noise in the preliminary figures, it cannot quantify it, as our model does.

The second and third chapters acknowledge the possibility that macroeconomic data are measured with errors, but the approach followed to model the mismeasurement is highly stylized and does not capture the complexity of the revision process described in the first chapter.

Chapter 2, entitled "Revisiting the Success of the RBC model", proposes the use of dynamic factor models as an alternative to VAR-based tools for the empirical validation of dynamic stochastic general equilibrium (DSGE) theories. Along the lines of Giannone et al. (2006), we use the state-space parameterisation of the factor models proposed by Forni et al. (2007) as a competitive benchmark that is able to capture the weak statistical restrictions that DSGE models impose on the data. Our empirical illustration compares the out-of-sample forecasting performance of a simple RBC model augmented with a serially correlated noise component against several specifications belonging to classes of dynamic factor and VAR models. Although the performance of the RBC model is comparable to that of the reduced-form models, a formal test of predictive accuracy reveals that the weak restrictions are more useful for forecasting than the strong behavioral assumptions imposed by the microfoundations of the model economy.

The last chapter, "What are Shocks Capturing in DSGE modeling", contributes to current debates on the use and interpretation of larger DSGE models. A recent tendency in academic work and at central banks is to develop and estimate large DSGE models for policy analysis and forecasting. These models typically have many shocks (e.g. Smets and Wouters, 2003 and Adolfson, Laseen, Linde and Villani, 2005). On the other hand, empirical studies point out that a few large shocks are sufficient to capture the covariance structure of macro data (Giannone, Reichlin and Sala, 2005; Uhlig, 2004). In this chapter, we propose to reconcile both views by considering an alternative DSGE estimation approach which models the statistical agency explicitly, along the lines of Sargent (1989). This enables us to distinguish whether the exogenous shocks in DSGE modeling are structural or instead serve the purpose of fitting the data in the presence of misspecification and measurement problems. When applied to the original Smets and Wouters (2007) model, we find that the explanatory power of the structural shocks decreases at high frequencies. This allows us to back out a smoother measure of the natural output gap than that resulting from the original specification.
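The noise-versus-news distinction for data revisions can be made concrete with a simulation. Under the pure-noise view the revision is predictable from the preliminary release itself (a negative regression slope); under the pure-news view the preliminary figure is a rational forecast, so the revision is orthogonal to it. The series below are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)
T = 500

def slope(x, y):
    # OLS slope of y on x (regression of the revision on the preliminary)
    return np.cov(x, y)[0, 1] / np.var(x)

# "Noise" view: preliminary = truth + measurement error,
# so the revision (truth - preliminary) is negatively related to the release
truth_a = rng.normal(size=T)
prelim_a = truth_a + rng.normal(scale=0.5, size=T)
slope_noise = slope(prelim_a, truth_a - prelim_a)

# "News" view: the preliminary is a rational forecast of the truth;
# the later update is independent of it, so the revision is unpredictable
signal = rng.normal(size=T)
update = rng.normal(scale=0.5, size=T)
prelim_b = signal
truth_b = signal + update
slope_news = slope(prelim_b, truth_b - prelim_b)
```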
Doctorat en Sciences économiques et de gestion
49

Essays on Empirical Macroeconomics

Caruso, Alberto 25 June 2020 (has links) (PDF)
The thesis contains four essays, covering topics in the fields of real-time macroeconometrics, forecasting and applied macroeconomics. In the first two chapters, I use techniques recently developed in the "nowcasting" literature to analyse and interpret the macroeconomic news flow. I use them either to assess current macroeconomic conditions, showing the importance of foreign indicators when dealing with small open economies, or to link macroeconomic news to asset prices, through a model that helps us interpret macroeconomic data and explain the linkages between macro variables and financial indicators. In the third chapter, I analyse the link between macroeconomic data in real time and the yield curve of interest rates, constructing a forecasting model which takes into account the peculiar characteristics of the macroeconomic data flow. In the last chapter, I present a Bayesian Vector Autoregression model built to analyse the last two crises in the Eurozone (2008-09 and 2011-12), identifying their unique characteristics with respect to historical regularities, an issue of great importance from a policy perspective. / Doctorat en Sciences économiques et de gestion
50

A New Value Premium: Value Creation in the Swedish stock market

Jalili, Lemar, Höög, Samuel, Blank, Simon January 2022 (has links)
Value creation in any stock market is a much-discussed topic, with an abundance of generalized models aiming to predict future returns. Although no such tool exists yet, there are acknowledged models from peer-reviewed journals that have received considerable attention over the years for examining company performance. This thesis therefore builds on the well-known Fama-French three-factor model. The original model is extended by adding a new size premium and a new value premium, both based on the spread between the return on invested capital (ROIC) and the weighted average cost of capital (WACC). The purpose is to make portfolio returns account for cash flow and debt in addition to a company's risk, size, and value premia. This thesis finds that the ROIC-WACC spread adds explanatory power to the existing Fama-French three-factor model on the Swedish stock market. The research method is quantitative and deductive; the period considered is the six years from 2014 to 2020.
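The extension described above can be sketched with simulated factor returns. The ROIC-WACC spread factor here is hypothetical (randomly generated, not constructed from accounting data), and the exercise only shows the mechanics: if returns load on the spread factor, adding it to the three-factor regression raises adjusted R-squared.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 360                                  # months (simulated)
mkt, smb, hml = rng.normal(0, 0.04, (3, T))
spread = rng.normal(0, 0.02, T)          # hypothetical ROIC-WACC spread factor
ret = 0.003 + 1.0 * mkt + 0.4 * smb + 0.3 * hml + 0.6 * spread \
    + rng.normal(0, 0.02, T)

def adj_r2(X, y):
    # adjusted R-squared penalizes the extra regressor, so any gain is real
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b
    n, k = X.shape
    return 1 - ((e**2).sum() / (n - k)) / (((y - y.mean())**2).sum() / (n - 1))

ones = np.ones(T)
ff3 = np.column_stack([ones, mkt, smb, hml])            # three-factor model
ff3_spread = np.column_stack([ff3, spread])             # + spread factor
gain = adj_r2(ff3_spread, ret) - adj_r2(ff3, ret)       # explanatory gain
```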
