  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Essays in financial intermediation, monetary policy, and macroeconomic activity

Dressler, Scott James 28 August 2008 (has links)
Not available / text
12

Macrodinâmica à Keynesiana = uma travessia com consistência entre fluxos e estoques a partir do encadeamento de curtos períodos do multiplicador / Keynesian-style macrodynamics: a stock-flow consistent traverse through the chaining of short multiplier periods

Leite, Fabricio Pitombo, 1980- 07 August 2008 (has links)
Advisor: Antonio Carlos Macedo e Silva / Doctoral thesis (doutorado), Universidade Estadual de Campinas, Instituto de Economia / Abstract: This thesis is divided into three parts.
The first part discusses the short period behind the traditional autonomous expenditure multiplier, in a way that exposes the major macroeconomic aggregates and their interconnections; at this point the separation between the autonomous and induced components of income is crucial. A second chapter, completing the first part, estimates the parameters involved in a given specification of the multiplier for Brazil, also trying to capture the chronological period of time inherent to the theoretical period of the multiplier. Since the first part treats investment as autonomous with respect to income, the second part justifies this hypothesis by appealing to the inter-sectoral relationships existing in an economic system. Once more, the part is divided into two chapters, one outlining the theoretical basis in a multi-sectoral scheme, and another presenting empirical results based on input-output matrices. Finally, the third part employs stock-flow consistent models to derive some dynamic implications beyond the short period of the multiplier, resulting from a given observed structure. To this end, the estimates of the parameters and of the multiplier period itself are used, moving from a theoretical simulation model to an applied strategy that incorporates the role of stocks in a simple framework for the analysis of macroeconomic aggregates / Doctorate / Economic Theory / Doctor of Economic Sciences
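The chained short multiplier periods described in this abstract can be sketched numerically: income in each period equals autonomous spending plus consumption induced by the previous period's income, converging to the textbook multiplier limit. The propensity and spending values below are hypothetical illustrations, not the thesis's Brazilian estimates.

```python
# Sketch of chaining short multiplier periods (hypothetical parameters).

def multiplier_traverse(A, c, periods):
    """Chain short multiplier periods: income in each period equals
    autonomous expenditure A plus consumption induced, with a one-period
    lag, by last period's income (marginal propensity to consume c)."""
    path = []
    y = 0.0
    for _ in range(periods):
        y = A + c * y          # induced consumption responds with a lag
        path.append(y)
    return path

path = multiplier_traverse(A=100.0, c=0.6, periods=50)
long_run = 100.0 / (1.0 - 0.6)   # textbook multiplier limit A / (1 - c)
```

After enough short periods the traverse settles at `A / (1 - c)`, which is where the stock-flow dynamics of the third part pick up.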
13

A no-arbitrage macro finance approach to the term structure of interest rates

Thafeni, Phumza 03 1900 (has links)
Thesis (MSc)--Stellenbosch University, 2014. / ENGLISH ABSTRACT: This work analyses the main macro-finance models of the term structure of interest rates, which determine the joint dynamics of the term structure and the macroeconomic fundamentals under a no-arbitrage approach. For decades researchers have sought to relate the term structure of interest rates to the economy, to the extent that much recent research combines elements of finance, monetary economics, and macroeconomics to analyse the term structure. The central interest of the thesis rests on two important notions. Firstly, it builds on the important model of Ang and Piazzesi (2003), which proposed a joint macro-finance strategy in a discrete-time affine setting, imposing the classical Taylor (1993) rule to determine the association between yields and macroeconomic variables through monetary policy. There is a strong intuition from the Taylor rule literature that macroeconomic variables such as inflation and real activity should matter for the interest rate, which is the monetary policy instrument. Since this framework appeared, the no-arbitrage macro-finance approach to the term structure of interest rates has become an active field of cross-disciplinary research between financial economics and macroeconomics. Secondly, the importance of forecasting the yield curve using variations on the Nelson and Siegel (1987) exponential-components framework, which captures the dynamics of the entire yield curve in three parameters evolving dynamically. The Nelson-Siegel approach is a convenient and parsimonious approximation method that has been trusted to work well for fitting and forecasting the yield curve. The work that has attracted much interest under this framework is the generalized arbitrage-free Nelson-Siegel macro-finance term structure model with macroeconomic fundamentals (Li et al.
(2012)), which characterises the joint dynamic interaction between yields and the macroeconomy and the dynamic relationship between bond risk premia and the economy. According to Li et al. (2012), risk premia are found to be closely linked to macroeconomic activity, and their variation can be analysed. The approach improves the estimation of, and addresses the identification challenges for, risk parameters that recent macro-finance literature has faced.
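The Nelson-Siegel exponential-components framework the abstract refers to summarises the whole yield curve with three factors (level, slope, curvature) and a decay parameter. A minimal sketch of the standard three-factor form follows; the parameter values are hypothetical, chosen only to illustrate an upward-sloping curve, and maturities are assumed strictly positive.

```python
import math

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel (1987) yield at maturity tau > 0:
    beta0 is the long-run level, beta1 loads on the slope factor,
    beta2 on the curvature factor, with decay rate lam."""
    x = lam * tau
    slope = (1.0 - math.exp(-x)) / x
    curvature = slope - math.exp(-x)
    return beta0 + beta1 * slope + beta2 * curvature

# Hypothetical parameters, for illustration only.
curve = [nelson_siegel(t, beta0=0.05, beta1=-0.02, beta2=0.01, lam=0.5)
         for t in (0.25, 1, 2, 5, 10, 30)]
```

As maturity grows the slope and curvature loadings die out, so the yield approaches the level `beta0`; with a negative `beta1` the short end sits below the long end, giving the usual upward-sloping curve.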
14

Combining structural and reduced-form models for macroeconomic forecasting and policy analysis

Monti, Francesca 08 February 2011 (has links)
Can we fruitfully use the same macroeconomic model to forecast and to perform policy analysis? There is a tension between a model's ability to forecast accurately and its ability to tell a theoretically consistent story. The aim of this dissertation is to propose ways to ease this tension, combining structural and reduced-form models to obtain models that can do both effectively. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
15

A study of the existence of equilibrium in mathematical economics

Xotyeni, Zukisa Gqabi January 2008 (has links)
In this thesis we define and study the existence of an equilibrium situation in which producers maximize their profits relative to the production vectors in their production sets, consumers satisfy their preferences in their consumption sets under a budget constraint, and for every commodity total demand equals total supply. This competitive equilibrium situation is referred to as the Walrasian equilibrium. The existence of this equilibrium is investigated from various mathematical points of view, including microeconomic theory, simplicial spaces, global analysis and lattice theory.
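The equilibrium notion described above can be made concrete in a toy setting not taken from the thesis: a two-consumer pure exchange economy with Cobb-Douglas preferences, where a price is sought at which total demand equals total supply. The sketch below finds that price by bisection on the excess-demand function; all numbers are hypothetical.

```python
def excess_demand_good1(p1, consumers):
    """Aggregate excess demand for good 1 at prices (p1, 1).
    Each consumer has Cobb-Douglas utility with budget share a on good 1
    and endowment (e1, e2); demand for good 1 is a * wealth / p1."""
    z = 0.0
    for a, e1, e2 in consumers:
        wealth = p1 * e1 + 1.0 * e2
        z += a * wealth / p1 - e1
    return z

def walras_equilibrium_price(consumers, lo=1e-6, hi=1e6, tol=1e-10):
    """Bisection on excess demand: by Walras' law, clearing the market
    for good 1 (with good 2 as numeraire) clears good 2 as well."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if excess_demand_good1(mid, consumers) > 0:
            lo = mid   # good 1 over-demanded: raise its price
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Two hypothetical consumers: (share on good 1, endowment of 1, of 2).
consumers = [(0.5, 1.0, 0.0), (0.5, 0.0, 1.0)]
p_star = walras_equilibrium_price(consumers)
```

In this symmetric example the relative price settles at 1; the existence results the thesis studies guarantee that such a clearing price exists under far more general preferences, where no closed form or simple search is available.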
16

Un modèle trimestriel de l'économie belge / A quarterly model of the Belgian economy

Ginsburgh, Victor January 1971 (has links)
Doctorat en sciences sociales, politiques et économiques / info:eu-repo/semantics/nonPublished
17

Essays on the econometrics of macroeconomic survey data

Conflitti, Cristina 11 September 2012 (has links)
This thesis contains three essays covering different topics in the field of statistics and econometrics of survey data. Chapters one and two analyse two aspects of the Survey of Professional Forecasters (SPF hereafter) dataset. This survey provides rich information on the macroeconomic expectations formed by professional forecasters and thus offers the opportunity to exploit a rich information set, but it poses the challenge of extracting the relevant information in a proper way. The last chapter addresses the issue of analyzing the opinions on the euro reported in the Flash Eurobarometer dataset.

The first chapter, Measuring Uncertainty and Disagreement in the European Survey of Professional Forecasters, proposes a density forecast methodology based on the piecewise linear approximation of the individual forecasting histograms to measure the uncertainty and disagreement of professional forecasters. Since the introduction of the SPF in the US in 1968, it has been clear that such surveys are a useful source of information for measuring disagreement and uncertainty without relying on macroeconomic or time series models. Direct measures of uncertainty are seldom available, whereas many surveys report point forecasts from a number of individual respondents, and there is a long tradition of using the dispersion of individual respondents' point forecasts (disagreement or consensus) as a proxy for uncertainty. The SPF is an exception: it asks directly for the point forecast and for the probability distribution, in the form of a histogram, associated with the macro variables of interest. An important issue is how to approximate individual probability densities and obtain accurate individual measures of disagreement and uncertainty before computing the aggregate measures. In contrast to Zarnowitz and Lambros (1987) and Giordani and Soderlind (2003), we overcome the problem of distributional assumptions on probability density forecasts with a nonparametric approach that, instead of assuming a functional form for the individual probability law, approximates the histogram by a piecewise linear function. In addition, unlike earlier works that focus on US data, we employ European data on gross domestic product (GDP), inflation and unemployment.

The second chapter, Optimal Combination of Survey Forecasts, is based on joint work with Christine De Mol and Domenico Giannone. It proposes an approach to optimally combine survey forecasts, exploiting the whole covariance structure among forecasters. There is a vast literature on forecast combination methods advocating their usefulness from both theoretical and empirical points of view (see e.g. the recent review by Timmermann (2006)). Surprisingly, simple methods tend to outperform more sophisticated ones, as shown for example by Genre et al. (2010) for the combination of forecasts in the SPF conducted by the European Central Bank (ECB). The main conclusion of several studies is that the simple equal-weighted average constitutes a benchmark that is hard to improve upon. In contrast to a large part of the literature, which does not exploit the correlation among forecasters, we take the full covariance structure into account and determine the optimal weights for the combination of point forecasts as the minimizers of the mean squared forecast error (MSFE), under the constraint that the weights are nonnegative and sum to one. We compare our combination scheme with other methodologies in terms of forecasting performance. Results show that the proposed optimal combination scheme is an appropriate methodology for combining survey forecasts.

The literature on point forecast combination is well developed, but fewer studies analyze the combination of density forecasts. We extend our work to density forecast combination: starting from the main results in Hall and Mitchell (2007), we propose an iterative algorithm for computing the density weights that maximize the average logarithmic score over the sample period. The empirical application covers European GDP and inflation forecasts. Results suggest that the optimal weights obtained via the iterative algorithm outperform the equal-weighted density combinations used by the ECB.

The third chapter, Opinion surveys on the euro: a multilevel multinomial logistic analysis, outlines the multilevel aspects of public attitudes toward the euro. This work was motivated by the ongoing debate on whether the perception of the euro among European citizens, ten years after its introduction, was positive or negative. The aim is therefore to disentangle public attitudes by considering both individual socio-demographic characteristics and macroeconomic features of each country, treating them as two separate levels in a single analysis. A hierarchical structure has the advantage of modelling within-country as well as between-country relations in a single analysis. The multilevel analysis allows for dependence between individuals within countries induced by unobserved heterogeneity between countries, i.e. we include in the estimation country-specific characteristics that are not directly observable. We empirically investigate which individual characteristics and country specificities matter most for the perception of the euro. Attitudes toward the euro vary across individuals and countries, and are driven by personal considerations based on the benefits and costs of using the single currency. Individual features, such as a high level of education or living in a metropolitan area, have a positive impact on the perception of the euro. Moreover, country-specific economic conditions can influence individuals' attitudes. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
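The MSFE-minimizing combination under nonnegativity and sum-to-one constraints that the second chapter describes has a well-known closed form in the two-forecaster case, going back to Bates and Granger (1969); the sketch below uses that special case with hypothetical error variances, not the thesis's estimates or its full n-forecaster algorithm.

```python
def bates_granger_weight(var1, var2, cov12):
    """MSFE-minimizing weight on forecaster 1 when combining two point
    forecasts as w*f1 + (1-w)*f2, clipped so that 0 <= w <= 1 (the
    nonnegativity and sum-to-one constraints in the two-forecaster case)."""
    denom = var1 + var2 - 2.0 * cov12
    if denom == 0.0:
        return 0.5
    w = (var2 - cov12) / denom
    return min(1.0, max(0.0, w))

# Hypothetical error moments: forecaster 2 is noisier, errors mildly correlated.
var1, var2, cov12 = 1.0, 4.0, 0.5
w = bates_granger_weight(var1, var2, cov12)
combined_var = w**2 * var1 + (1 - w)**2 * var2 + 2 * w * (1 - w) * cov12
```

Because the weights exploit the covariance between forecast errors, the combined error variance is below that of the better individual forecaster and below the equal-weighted average, which is exactly the margin over the equal-weight benchmark that the chapter investigates.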
18

Essays in real-time forecasting

Liebermann, Joëlle 12 September 2012 (has links)
This thesis contains three essays in the field of real-time econometrics, and more particularly forecasting. The issue of using data as available in real time to forecasters, policymakers or financial markets is an important one which has only recently been taken on board in the empirical literature. Data available and used in real time are preliminary and differ from ex-post revised data, and given that data revisions may be quite substantial, using the latest available data instead of real-time data can substantially affect empirical findings (see, among others, Croushore's (2011) survey). Furthermore, as variables are released on different dates and with varying publication lags, datasets are characterized by the so-called "ragged-edge" structure; in order not to disregard timely information, special econometric frameworks, such as that developed by Giannone, Reichlin and Small (2008), must be used.

The first chapter, "The impact of macroeconomic news on bond yields: (in)stabilities over time and relative importance", studies the reaction of U.S. Treasury bond yields to real-time market-based news in the daily flow of macroeconomic releases, which provide most of the relevant information on their fundamentals, i.e. the state of the economy and inflation. We find that yields react systematically to a set of news consisting of the soft data, which have very short publication lags, and the most timely hard data, with the employment report being the most important release. However, sub-sample evidence reveals parameter instability in the absolute and relative size, as well as the significance, of the yield response to news. In particular, the often-cited dominance of the employment report for markets has been evolving over time, as the size of the yield reaction to it was steadily increasing. Moreover, over the recent crisis period there has been an overall switch in the relative importance of soft and hard data compared to the pre-crisis period, with the latter becoming more important even if less timely, and the scope of hard data to which markets react has increased and is more balanced, being less concentrated on the employment report. Markets have become more reactive to news over the recent crisis period, particularly to hard data. This is a consequence of the fact that in periods of high uncertainty (a bad state), markets are starved for information and attach a higher value to the marginal information content of these releases.

The second and third chapters focus on the real-time ability of models to nowcast and forecast in a data-rich environment. They use an econometric framework that can deal with large panels with a "ragged-edge" structure; to evaluate the models in real time, we constructed a database of vintages for US variables reproducing the exact information that was available to a real-time forecaster.

The second chapter, "Real-time nowcasting of GDP: a factor model versus professional forecasters", performs a fully real-time nowcasting (forecasting) exercise for US real GDP growth using the Giannone, Reichlin and Small (2008), henceforth GRS, dynamic factor model (DFM) framework, which can handle the large unbalanced datasets available in real time. We track the daily evolution of the model's nowcasting performance throughout the current and next quarter. Similarly to GRS's pseudo-real-time results, we find that the precision of the nowcasts increases with information releases. Moreover, the Survey of Professional Forecasters does not carry additional information with respect to the model, suggesting that the often-cited superiority of the former, attributable to judgment, is weak over our sample. As one moves forward along the real-time data flow, the continuous updating of the model provides a more precise estimate of current-quarter GDP growth and the Survey of Professional Forecasters becomes stale. These results are robust to the recent recession period.

The last chapter, "Real-time forecasting in a data-rich environment", evaluates the ability of different models to forecast key real and nominal U.S. monthly macroeconomic variables in a data-rich environment and from the perspective of a real-time forecaster. Among the approaches to forecasting in a data-rich environment, we use pooling of bivariate forecasts, an indirect way to exploit a large cross-section, and direct pooling of information using high-dimensional models (a DFM and a Bayesian VAR). Furthermore, forecast combination schemes are used to overcome the model-specification choices faced by the practitioner (e.g. which criteria to use to select the parametrization of the model), as we seek evidence on the performance of a model that is robust across specifications and combination schemes. Our findings show that predictability of the real variables is confined to the recent recession/crisis period. This is in line with the findings of D'Agostino and Giannone (2012) over an earlier period, namely that the gains in relative performance of models using large datasets over univariate models are driven by downturn periods, which are characterized by higher comovements. These results are robust to the combination schemes or models used. A point worth mentioning is that for nowcasting GDP, exploiting cross-sectional information along the real-time data flow also helps at the end of the great moderation period. Since GDP is a quarterly aggregate proxying the state of the economy, monthly variables carry information content for it. But similarly to the findings for the monthly variables, predictability, as measured by the gains relative to the naive random walk model, is higher during the crisis/recession period than during tranquil times. Regarding inflation, results are stable across time, but predictability is mainly found at nowcasting and forecasting one month ahead, with the BVAR standing out at nowcasting. The forecasting gains at these short horizons stem mainly from exploiting timely information. The results also show that direct pooling of information using a high-dimensional model (DFM or BVAR), which takes into account the cross-correlation between the variables and efficiently deals with the "ragged-edge" structure of the dataset, yields more accurate forecasts than the indirect pooling of bivariate forecasts/models. / Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
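The naive random walk benchmark used throughout the last chapter can be sketched as a relative RMSE: the model's forecast errors measured against the errors of simply predicting each value with the previous observation. The toy series and "model" forecasts below are hypothetical, purely to show the yardstick.

```python
import math

def rmse(errors):
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def relative_rmse(actuals, model_forecasts):
    """RMSE of a model's one-step forecasts relative to the naive random
    walk that predicts each value with the previous observation.
    Values below 1 mean the model beats the naive benchmark."""
    naive_errors = [a - p for p, a in zip(actuals, actuals[1:])]
    model_errors = [a - f for a, f in zip(actuals[1:], model_forecasts)]
    return rmse(model_errors) / rmse(naive_errors)

# Toy series and hypothetical model forecasts, for illustration only.
actuals = [1.0, 1.2, 0.9, 1.1, 1.0, 1.3]
model = [1.1, 1.0, 1.0, 1.05, 1.2]
ratio = relative_rmse(actuals, model)
```

The chapter's finding that predictability is concentrated in downturns corresponds to this ratio dropping well below one in crisis sub-samples while hovering near one in tranquil times.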
19

Structural models for macroeconomics and forecasting

De Antonio Liedo, David 03 May 2010 (has links)
This thesis is composed of three independent papers that investigate central debates in empirical macroeconomic modeling.

Chapter 1, entitled "A Model for Real-Time Data Assessment with an Application to GDP Growth Rates", provides a model for the data revisions of macroeconomic variables that distinguishes between rational-expectation updates and noise corrections. The model thus encompasses the two polar views regarding the publication process of statistical agencies: noise versus news. Most previous studies that analyze data revisions are based on the classical noise-and-news regression approach introduced by Mankiw, Runkle and Shapiro (1984). The problem is that the available statistical tests do not formulate both extreme hypotheses as collectively exhaustive, as recognized by Aruoba (2008): it is possible to reject or accept both of them simultaneously. In turn, the model for the DPP presented here allows for the simultaneous presence of both noise and news. While the "regression approach" followed by Faust et al. (2005), along the lines of Mankiw et al. (1984), identifies noise in the preliminary figures, it cannot quantify it, as our model does.

The second and third chapters acknowledge the possibility that macroeconomic data are measured with error, but the approach followed to model the mismeasurement is extremely stylized and does not capture the complexity of the revision process described in the first chapter.

Chapter 2, entitled "Revisiting the Success of the RBC model", proposes the use of dynamic factor models as an alternative to VAR-based tools for the empirical validation of dynamic stochastic general equilibrium (DSGE) theories. Along the lines of Giannone et al. (2006), we use the state-space parameterisation of the factor models proposed by Forni et al. (2007) as a competitive benchmark that is able to capture weak statistical restrictions that DSGE models impose on the data. Our empirical illustration compares the out-of-sample forecasting performance of a simple RBC model augmented with a serially correlated noise component against several specifications belonging to classes of dynamic factor and VAR models. Although the performance of the RBC model is comparable to that of the reduced-form models, a formal test of predictive accuracy reveals that the weak restrictions are more useful for forecasting than the strong behavioral assumptions imposed by the microfoundations of the model economy.

The last chapter, "What are Shocks Capturing in DSGE modeling", contributes to current debates on the use and interpretation of larger DSGE models. A recent tendency in academic work and at central banks is to develop and estimate large DSGE models for policy analysis and forecasting. These models typically have many shocks (e.g. Smets and Wouters, 2003 and Adolfson, Laseen, Linde and Villani, 2005). On the other hand, empirical studies point out that a few large shocks are sufficient to capture the covariance structure of macro data (Giannone, Reichlin and Sala, 2005, Uhlig, 2004). In this chapter, we propose to reconcile both views by considering an alternative DSGE estimation approach which models the statistical agency explicitly, along the lines of Sargent (1989). This enables us to distinguish whether the exogenous shocks in DSGE modeling are structural or instead serve the purpose of fitting the data in the presence of misspecification and measurement problems. When applied to the original Smets and Wouters (2007) model, we find that the explanatory power of the structural shocks decreases at high frequencies. This allows us to back out a smoother measure of the natural output gap than that resulting from the original specification.
/ Doctorat en Sciences économiques et de gestion / info:eu-repo/semantics/nonPublished
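The noise-versus-news regression approach of Mankiw, Runkle and Shapiro (1984) that Chapter 1 builds on can be sketched with two simple regressions of the revision: under pure "news" (a rational preliminary forecast) the revision is unpredictable from the preliminary figure, while under pure "noise" it is uncorrelated with the final figure. The vintages below are hypothetical numbers for illustration, not real GDP data.

```python
def ols_slope(x, y):
    """OLS slope of y on x (with intercept), via centered moments."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

# Hypothetical preliminary and final vintages of a growth rate.
prelim = [0.5, 1.0, 0.3, 0.8, 1.2, 0.6]
final = [0.6, 0.9, 0.5, 0.7, 1.4, 0.5]
revision = [f - p for f, p in zip(final, prelim)]

# News test: a nonzero slope on the preliminary figure rejects pure news.
slope_on_prelim = ols_slope(prelim, revision)
# Noise test: a nonzero slope on the final figure rejects pure noise.
slope_on_final = ols_slope(final, revision)
```

As the abstract notes, these two tests are not collectively exhaustive: a sample can reject (or fail to reject) both at once, which is the gap the chapter's model closes by allowing noise and news simultaneously.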
20

Etude économétrique des effets sur l'économie belge de l'introduction de la T.V.A. et de l'harmonisation des taux au sein du Marché commun / An econometric study of the effects on the Belgian economy of the introduction of VAT and of rate harmonisation within the Common Market

Guillaume, Y. January 1971 (has links)
Doctorat en sciences sociales, politiques et économiques / info:eu-repo/semantics/nonPublished
