151

Essays in Dynamic Macroeconometrics

Bańbura, Marta 26 June 2009 (has links)
The thesis contains four essays covering topics in the field of macroeconomic forecasting. The first two chapters consider factor models in the context of real-time forecasting with many indicators. Using a large number of predictors offers an opportunity to exploit a rich information set and is also considered to be a more robust approach in the presence of instabilities. On the other hand, it poses the challenge of how to extract the relevant information in a parsimonious way. Recent research shows that factor models provide an answer to this problem. The fundamental assumption underlying these models is that most of the co-movement of the variables in a given dataset can be summarized by only a few latent variables, the factors. This assumption seems to be warranted in the case of macroeconomic and financial data. Important theoretical foundations for large factor models were laid by Forni, Hallin, Lippi and Reichlin (2000) and Stock and Watson (2002). Since then, different versions of factor models have been applied to forecasting, structural analysis and the construction of economic activity indicators. Recently, Giannone, Reichlin and Small (2008) have used a factor model to produce projections of U.S. GDP in the presence of a real-time data flow. They propose a framework that can cope with large datasets characterised by staggered and nonsynchronous data releases (sometimes referred to as a “ragged edge”). This is relevant as, in practice, important indicators like GDP are released with a substantial delay and, in the meantime, more timely variables can be used to assess the current state of the economy.

The first chapter, entitled “A look into the factor model black box: publication lags and the role of hard and soft data in forecasting GDP”, is based on joint work with Gerhard Rünstler and applies the framework of Giannone, Reichlin and Small (2008) to the case of the euro area. In particular, we are interested in the role of “soft” and “hard” data in the GDP forecast and how it is related to their timeliness. The soft data include surveys and financial indicators, reflect market expectations and are usually promptly available. In contrast, the hard indicators of real activity directly measure certain components of GDP (e.g. industrial production) and are published with a significant delay. We propose several measures to assess the role of individual series, or groups of series, in the forecast while taking into account their respective publication lags. We find that surveys and financial data contain important information for the GDP forecasts beyond the monthly real activity measures, once their timeliness is properly accounted for.

The second chapter, entitled “Maximum likelihood estimation of large factor model on datasets with arbitrary pattern of missing data”, is based on joint work with Michele Modugno. It proposes a methodology for the estimation of factor models on large cross-sections with a general pattern of missing data. In contrast to Giannone, Reichlin and Small (2008), we can handle datasets that are not only characterised by a “ragged edge”, but can also include, e.g., mixed-frequency or short-history indicators. The latter is particularly relevant for the euro area or other young economies, for which many series have been compiled only recently. We adopt the maximum likelihood approach which, apart from its flexibility with regard to the pattern of missing data, is also more efficient and allows imposing restrictions on the parameters. Applied to small factor models by e.g. Geweke (1977), Sargent and Sims (1977) and Watson and Engle (1983), this approach has been shown by Doz, Giannone and Reichlin (2006) to be consistent, robust and computationally feasible also in the case of large cross-sections. To circumvent the computational complexity of a direct likelihood maximisation for large cross-sections, Doz, Giannone and Reichlin (2006) propose to use the iterative Expectation-Maximisation (EM) algorithm (used for the small model by Watson and Engle, 1983). Our contribution is to adapt the EM steps to the case of missing data and to show how to augment the model in order to account for the serial correlation of the idiosyncratic component. In addition, we derive the link between the unexpected part of a data release and the forecast revision and illustrate how this can be used to understand the sources of the latter in the case of simultaneous releases. We use this methodology for short-term forecasting and backdating of euro area GDP on the basis of a large panel of monthly and quarterly data. In particular, we are able to examine the effect on the forecast of quarterly variables and of short-history monthly series like the Purchasing Managers' surveys.

The third chapter is entitled “Large Bayesian VARs” and is based on joint work with Domenico Giannone and Lucrezia Reichlin. It proposes an alternative approach to factor models for dealing with the curse of dimensionality, namely Bayesian shrinkage. We study Vector Autoregressions (VARs), which have the advantage over factor models that they allow structural analysis in a natural way. We consider systems including more than 100 variables; this is the first application in the literature to estimate a VAR of this size. Apart from the forecast considerations argued above, the size of the information set can also be relevant for structural analysis, see e.g. Bernanke, Boivin and Eliasz (2005), Giannone and Reichlin (2006) or Christiano, Eichenbaum and Evans (1999) for a discussion. In addition, many problems may require the study of the dynamics of many variables: many countries, sectors or regions. While we use standard priors as proposed by Litterman (1986), an important novelty of the work is that we set the overall tightness of the prior in relation to the model size. In this we follow the recommendation of De Mol, Giannone and Reichlin (2008), who study the case of Bayesian regressions. They show that with increasing model size one should shrink more to avoid overfitting, but that when data are collinear one is still able to extract the relevant sample information. We apply this principle to the case of VARs. We compare the large model with smaller systems in terms of forecasting performance and of the structural analysis of the effect of a monetary policy shock. The results show that a standard Bayesian VAR model is an appropriate tool for large panels of data once the degree of shrinkage is set in relation to the model size.

The fourth chapter, entitled “Forecasting euro area inflation with wavelets: extracting information from real activity and money at different scales”, proposes a framework for exploiting relationships between variables at different frequency bands in the context of forecasting. This work is motivated by the ongoing debate on whether money provides a reliable signal for future price developments. The empirical evidence on the leading role of money for inflation in an out-of-sample forecast framework is not very strong, see e.g. Lenza (2006) or Fisher, Lenza, Pill and Reichlin (2008). At the same time, e.g. Gerlach (2003) and Assenmacher-Wesche and Gerlach (2007, 2008) argue that money and output could affect prices at different frequencies; their analysis, however, is performed in-sample. This chapter investigates empirically which frequency bands, and of which variables, are most relevant for the out-of-sample forecast of inflation when information from prices, money and real activity is considered. To extract the different frequency components of a series, a wavelet transform is applied. It provides a simple and intuitive framework for band-pass filtering and allows a decomposition of series into different frequency bands. Its application to multivariate out-of-sample forecasting is novel in the literature. The results indicate that, indeed, different scales of money, prices and GDP can be relevant for the inflation forecast.
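As a rough illustration of the band-pass idea in the final chapter, the sketch below decomposes a series into frequency bands with a discrete wavelet transform. It is not the thesis's own code; the PyWavelets library, the db4 wavelet and the four-level decomposition are assumptions made for the example.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_bands(x, wavelet="db4", level=4):
    """Split a series into additive components, one per frequency band.

    Returns a dict mapping band name -> reconstructed component, so that the
    components approximately sum back to the original series.
    """
    coeffs = pywt.wavedec(x, wavelet, level=level)
    names = [f"A{level}"] + [f"D{j}" for j in range(level, 0, -1)]
    bands = {}
    for i, name in enumerate(names):
        # keep only one coefficient array, zero the rest, then reconstruct
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        bands[name] = pywt.waverec(kept, wavelet)[: len(x)]
    return bands

# Example: decompose a noisy trending series into a smooth trend and 4 detail bands
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200)) + 0.5 * np.sin(np.arange(200) / 6.0)
bands = wavelet_bands(x)
print({k: v.shape for k, v in bands.items()})
```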
152

Volatility Transmission: A Study of the Stock Markets in Sweden, Germany, England, Japan and the USA

Borgström, Anders January 2005 (has links)
This thesis investigates how volatility is transmitted between six stock markets around the world and determines which stock markets have the greatest influence on the volatility of the Swedish stock exchange. The thesis also explores whether the degree of volatility spillover has increased since the dot-com crash. The study uses a two-step econometric model combining GARCH and VAR. The results indicate that volatility spills over between the stock markets and that foreign innovations have a long-lasting effect on domestic volatility. The analysis shows that the Swedish exchange is the stock market most affected by foreign shocks and that it has no appreciable influence on the other stock markets. Furthermore, the results show that, since the dot-com crash, foreign innovations have become more important for Sweden and that the volatility of the Swedish exchange has become more dependent on that of the Nasdaq.
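A minimal sketch of a two-step GARCH-then-VAR pipeline of this kind is given below; the `arch` and `statsmodels` packages, the GARCH(1,1) specification, the lag selection and the simulated data are assumptions for illustration, not taken from the thesis.

```python
import numpy as np
import pandas as pd
from arch import arch_model
from statsmodels.tsa.api import VAR

# Placeholder daily index returns, one column per market
rng = np.random.default_rng(1)
returns = pd.DataFrame(rng.normal(scale=1.0, size=(1000, 5)),
                       columns=["SWE", "GER", "UK", "JPN", "US"])

# Step 1: fit a univariate GARCH(1,1) to each market and keep the
# estimated conditional volatility series.
vols = {}
for col in returns:
    res = arch_model(returns[col], vol="Garch", p=1, q=1).fit(disp="off")
    vols[col] = res.conditional_volatility
vols = pd.DataFrame(vols).dropna()

# Step 2: estimate a VAR on the conditional volatilities and trace how a
# shock to one market's volatility propagates to the others.
var_res = VAR(vols).fit(maxlags=5, ic="aic")
irf = var_res.irf(20)          # impulse responses over 20 days
print(irf.irfs.shape)          # (horizons + 1, n_markets, n_markets)
```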
153

Oil Prices and the Macroeconomy: A VAR Analysis of the Impact of Oil Prices on the Stock Market

Fredriksson, Robert January 2008 (has links)
The impact of oil prices on the Swedish economy is a highly topical issue that generates headlines in the mass media daily, and the oil price is watched closely, not least on the stock market. This thesis examines the dynamic relationship between the oil price and the Swedish stock market by means of a VAR analysis based on monthly data for 1990-2006. Impulse responses are then used to see how the different variables react to a shock to the oil price. The data used in the study are an oil price index, a stock price index, an industrial production index and an interest rate. The results are in line with previous studies and show that no strongly significant relationship can be found between oil price developments and the Swedish stock market. This may suggest that the oil price is generally overrated by both investors and the mass media.
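The sketch below illustrates the kind of VAR-with-impulse-responses exercise described above, with the oil price ordered first in a Cholesky orthogonalization; the data are simulated placeholders, and the variable names, differencing and lag choice are assumptions rather than the thesis's actual specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Monthly 1990-2006 style data: oil price index, stock price index, industrial
# production index and an interest rate.  Placeholder random walks here; in
# practice the series would be log-differenced to stationarity first.
rng = np.random.default_rng(2)
levels = pd.DataFrame(np.cumsum(rng.normal(size=(204, 4)), axis=0),
                      columns=["oil", "stocks", "indprod", "rate"])
data = levels.diff().dropna()

model = VAR(data).fit(maxlags=12, ic="aic")

# Orthogonalized impulse responses with the oil price ordered first, so an oil
# shock can move the other variables within the month but not vice versa.
irf = model.irf(24)
response = irf.orth_irfs[:, data.columns.get_loc("stocks"),
                         data.columns.get_loc("oil")]
print(response[:6])   # response of stock returns to a one-s.d. oil price shock
```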
154

Economic Policy Effects in a Quarterly-dependence VAR model: Empirical Analysis of Taiwanese cases

Liu, Chun-I 30 June 2010 (has links)
This paper uses the quarter-dependent VAR model proposed by Olivei and Tenreyro (2007) to assess whether the effect of an exogenous policy shock differs according to the quarter in which the shock occurs. We treat Taiwan as a small open economy with flourishing international trade, so the exchange rate is viewed as an important channel in the monetary transmission mechanism. The first part examines how a domestic monetary policy shock influences macroeconomic variables. The second part asks whether, given the worldwide influence of the United States, the Fed's policies affect Taiwan's macroeconomy. Finally, we discuss the impact of an exogenous exchange rate shock on Taiwan's macroeconomy.
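A minimal sketch of a quarter-dependent VAR of this flavour is shown below: lag coefficients are interacted with quarter dummies so that the estimated response can depend on the quarter in which a shock hits. The variable names, lag length and simulated data are assumptions for illustration only, not the paper's specification.

```python
import numpy as np
import pandas as pd

# Simulated quarterly data; 2 lags, 3 placeholder variables.
rng = np.random.default_rng(3)
T, p = 160, 2                                # 40 years of quarters, 2 lags
y = pd.DataFrame(rng.normal(size=(T, 3)), columns=["gdp", "cpi", "rate"])
quarter = np.arange(T) % 4                   # calendar quarter 0..3

# Build the regressor matrix: for each quarter and each lag, lagged values
# interacted with a dummy for that quarter.
rows = []
for t in range(p, T):
    x = [1.0]
    for q in range(4):
        d = 1.0 if quarter[t] == q else 0.0
        for lag in range(1, p + 1):
            x.extend(d * y.iloc[t - lag].to_numpy())
    rows.append(x)
X = np.asarray(rows)
Y = y.iloc[p:].to_numpy()

# Equation-by-equation least squares; the coefficients carry a quarter index,
# so the implied dynamic response can differ with the quarter of the shock.
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B.shape)   # (1 + 4 quarters * 2 lags * 3 variables, 3 equations)
```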
155

The Impact of the Monetary Policy in Taiwan - A FAVAR Model Approach

Chu, I-Ching 19 July 2011 (has links)
This paper applies the Factor-Augmented VAR (FAVAR) model proposed by Bernanke, Boivin and Eliasz (2005) to measure the impact of monetary policy in Taiwan. Our empirical results show, first, that the more factors are added to the benchmark VAR, the better the price puzzle can be explained. Second, the effect of a monetary tightening (an increase in the interbank overnight lending rate) is inconsistent with what the credit channel would predict.
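The sketch below shows the basic FAVAR recipe of extracting principal-component factors from a large panel and putting them in a VAR with the policy rate; the simulated panel, the three-factor choice and the lag selection are assumptions, not the paper's actual data or settings.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from statsmodels.tsa.api import VAR

# Placeholder panel: 240 months of 80 macro series, plus a policy rate.
rng = np.random.default_rng(4)
panel = pd.DataFrame(rng.normal(size=(240, 80)))
rate = pd.Series(np.cumsum(rng.normal(scale=0.1, size=240)), name="rate")

# Standardize the panel and extract a few principal-component factors.
z = (panel - panel.mean()) / panel.std()
k = 3
factors = pd.DataFrame(PCA(n_components=k).fit_transform(z),
                       columns=[f"F{i+1}" for i in range(k)])

# Factor-augmented VAR: factors and the policy rate, rate ordered last.
favar = VAR(pd.concat([factors, rate], axis=1)).fit(maxlags=6, ic="aic")
irf = favar.irf(48)

# Response of the factors to an orthogonalized policy-rate shock; mapping back
# to the individual panel series would use the PCA loadings.
print(irf.orth_irfs[:, :k, k].shape)
```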
156

The application of multifactor and VaR models in predicting market meltdowns

Ni, Hao-Yu 21 June 2012 (has links)
As international financial markets have become ever more closely linked, the probability of extreme events has risen, so an indicator that could help predict a crash, and hence inform the decision to sell stocks, would be very useful. The study uses the Fama-French five-factor model together with a VaR model and cluster analysis: the constituent stocks of the Taiwan 50 are clustered according to their five-factor characteristics, stocks of a similar nature are placed in the same group, and a portfolio is formed for each group. Daily portfolio returns are used to calculate VaR, and the behaviour of the VaR spread before a crash is observed to see how it moves and whether it shows particular characteristics. The clusters are then compared in terms of their ability to predict crash events, and the relationship between risk factors and predictive ability is examined. The results show that the VaR spread often changes markedly before a crash occurs, moving either from volatile to stable or from stable to volatile. Clusters with good predictive ability tend to contain stocks that are closely related to the cause of the crash: financial stocks are sensitive to the financial tsunami, while electronics stocks are affected by the exchange rate. Overall, the group with the best predictive ability is more sensitive to the momentum effect and to investor sentiment indicators, but not sensitive to the book-to-market factor. Using the VaR spread as a predictive reference and selecting stocks that meet these conditions for the portfolio is a useful approach.
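A rough sketch of the grouping-and-VaR step is given below: stocks are clustered on factor characteristics, equal-weighted cluster portfolios are formed, and a rolling historical VaR is tracked. The simulated characteristics and returns, the 5% level, the 250-day window and the use of k-means are assumptions for illustration, not the thesis's exact procedure.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_stocks, n_days = 50, 1500
# Placeholder factor characteristics (one row per stock) and daily returns.
chars = pd.DataFrame(rng.normal(size=(n_stocks, 5)),
                     columns=["mkt", "size", "value", "prof", "inv"])
returns = pd.DataFrame(rng.normal(scale=0.02, size=(n_days, n_stocks)))

# Group stocks with similar factor characteristics into the same cluster.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(chars)

var_5pct = {}
for g in range(5):
    port = returns.loc[:, labels == g].mean(axis=1)   # equal-weighted cluster portfolio
    # Rolling 250-day historical VaR at the 5% level, reported as a loss.
    var_5pct[g] = -port.rolling(250).quantile(0.05)
var_5pct = pd.DataFrame(var_5pct).dropna()

# A simple "VaR spread" between the riskiest and safest cluster on each day;
# the thesis watches how such a spread behaves ahead of crashes.
spread = var_5pct.max(axis=1) - var_5pct.min(axis=1)
print(spread.describe())
```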
157

Predicting Stock Market Crises by VAR Model

Yang, Han-Chih 23 June 2012 (has links)
There are several methods for predicting financial crises, and financial institutions use several types of indicators. These indicators, which are estimated in different ways, often show different developments, and it is not possible to assess directly which is the most suitable. Here we instead try to identify what characteristics an industry group has and to forecast financial crises. Our data consist of monthly observations on the S&P 100 from January 1977 to December 2008. We use the Fama-French factors and cluster analysis to process the data so that stocks within a group share the same characteristics. We then use GARCH-type models and apply them to VaR for predicting stock market turmoil. In conclusion, we find that high kurtosis, rather than volatility, is the key factor in a group's ability to predict stock crises, and that the industries with this predictive ability are large in scale. The model can also be used to double-check the anticipated reaction, so that investors can take action to control risk and reduce losses.
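As a hedged illustration of the GARCH-type VaR step, the sketch below fits a GARCH(1,1) to a return series and converts the one-step-ahead variance forecast into a 5% VaR; the simulated returns, the normal quantile and the 5% level are assumptions rather than the thesis's exact specification.

```python
import numpy as np
import pandas as pd
from scipy.stats import norm
from arch import arch_model

# Placeholder monthly percentage returns with fat tails, 1977-2008 length.
rng = np.random.default_rng(6)
ret = pd.Series(rng.standard_t(df=5, size=384) * 2.0)

# Fit a GARCH(1,1) with constant mean and forecast next month's variance.
res = arch_model(ret, vol="Garch", p=1, q=1, dist="normal").fit(disp="off")
fcast = res.forecast(horizon=1)
sigma = np.sqrt(fcast.variance.iloc[-1, 0])      # one-step-ahead volatility
mu = res.params["mu"]

# Parametric 5% VaR from the forecast, reported as a loss.
var_5pct = -(mu + norm.ppf(0.05) * sigma)
print(f"one-step 5% VaR: {var_5pct:.2f}%")

# High sample kurtosis is what the thesis flags as informative about crises.
print("excess kurtosis:", float(ret.kurtosis()))
```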
158

Sources of Real Exchange Rate Fluctuations - Regional Analysis

Hsieh, Meng-chi 26 July 2005 (has links)
With economic globalization and the rapid growth of international trade, the problems arising from international currency exchange have become more serious. The exchange rate is the index for measuring how a currency's value changes internationally, and the move from a fixed to a floating exchange rate regime increases the volatility of exchange rate fluctuations. For Taiwan, a small open economy with an export-intensive policy, this impact is difficult to avoid, so studying exchange rate fluctuations is meaningful. The study compares the sources of long-run real exchange rate fluctuations between Taiwan and North America, Europe and Asia over the period 1981:1 to 2003:4. The theoretical model of Clarida and Gali (1994) is used, with relative output, the real effective exchange rate and the domestic money supply as the variables of the study. Empirically, unit root tests confirm the presence of unit roots, and cointegration tests confirm that there is no cointegrating relationship. The approach of Blanchard and Quah (1989) is then used to impose long-run restrictions and construct a structural VAR model, from which impulse response functions and variance decompositions are derived. The empirical results show that, for Taiwan relative to North America and Europe, the source of long-run real exchange rate fluctuations is demand shocks, in line with Lastrapes (1992), Clarida and Gali (1994) and Chen and Wu (1997). For the Asian countries, which are mainly developing economies, the source of long-run real exchange rate fluctuations is supply shocks, which underlines the importance of output effects. In addition, long-run monetary neutrality holds empirically in each region.
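The sketch below works through Blanchard-Quah long-run identification on a reduced-form VAR: the long-run covariance of the cumulated responses is Cholesky-factorized so that the long-run impact matrix is lower triangular. The simulated data, variable ordering and lag choice are assumptions for illustration, not the thesis's actual estimates.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Placeholder stand-ins for relative output, the real effective exchange rate
# and relative money, all in first differences (92 quarters ~ 1981:1-2003:4).
rng = np.random.default_rng(7)
data = pd.DataFrame(rng.normal(size=(92, 3)),
                    columns=["d_output", "d_reer", "d_money"])

res = VAR(data).fit(maxlags=4, ic="aic")
k = data.shape[1]

# Long-run (cumulative) effect of reduced-form shocks: Psi(1) = (I - A(1))^-1
A1 = sum(res.coefs)                               # sum of estimated lag matrices
psi1 = np.linalg.inv(np.eye(k) - A1)
sigma_u = np.asarray(res.sigma_u)

# Choose the structural impact matrix B0 so that the long-run impact matrix
# Theta(1) = Psi(1) @ B0 is lower triangular (the Blanchard-Quah restriction).
lr_cov = psi1 @ sigma_u @ psi1.T
theta1 = np.linalg.cholesky(lr_cov)
B0 = np.linalg.solve(psi1, theta1)

print("long-run impact matrix Theta(1):\n", np.round(theta1, 3))
print("contemporaneous impact matrix B0:\n", np.round(B0, 3))
```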
159

Applying RAROC, Value-at-Risk and Extreme Value Theory to Performance Measurement of Financial Holding Companies.

Chou, Cheng-Yi 07 July 2006 (has links)
none
160

A Study on SPAN's Risk-Measuring Methodology for Portfolios That Include Options

Hung, Ching-Hwa 27 June 2000 (has links)
None
