  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Four essays in dynamic macroeconomics

Sun, Qi January 2010 (has links)
The dissertation contains essays on the linkages between the macroeconomy and financial markets, and on the conduct of monetary policy, via DSGE modelling. It addresses the question of fitting macroeconomic models to the data, and so contributes to our understanding of the driving forces behind fluctuations in macroeconomic and financial variables. Chapter one offers an introduction to my thesis and outlines in detail the main results and methodologies. In Chapter two I introduce a statistical measure for model evaluation and selection based on the full information in sample second moments of the data. A model is said to outperform its counterpart if the variance-covariance matrix of its simulated data is closer to that of the actual data. The "distance method" is generally feasible and simple to conduct. A flexible-price, two-sector open economy model is studied to match observed puzzles in international finance data. The statistical distance approach favours a model in which a dominant role is played by expectational errors in the foreign exchange market, which break international interest rate parity. Chapter three applies the distance approach to a New Keynesian model augmented with habit formation and a backward-looking component of pricing behaviour. A macro-finance model of the yield curve is developed to showcase the dynamics of implied forward yields. This exercise, using the distance approach, reiterates the inability of the macro model to explain yield curve dynamics. The method also reveals a remarkable interconnection between real quantities and the slope of the bond yield curve. In Chapter four I study a general equilibrium business cycle model with sticky prices and labour market rigidities.
With costly matching in the labour market, output responds in a hump-shaped and persistent manner to monetary shocks, and the resulting Phillips curve seems to radically change the scope for monetary policy because (i) there are speed limit effects for policy and (ii) there is a cost channel for monetary policy. Labour market reforms, such as those in the UK in the mid-1980s, can make monetary policy more effective. Research on monetary policy should pay greater attention to output when labour market adjustments are persistent. Chapter five analyzes the link between money and financial spreads, which is often missed in specifications of monetary policy analysis. When liquidity provision by banks dominates the demand for money from the real economy, money may contain information about future output and inflation due to its impact on financial spreads. I use a sign-restricted Bayesian VAR estimation to separate the liquidity provision impact from money market equilibrium. The decomposition exercise shows that supply shocks dominate the money-price nexus in the short to medium term. It also uncovers the distinctive policy stances of the two central banks. Finally, Chapter six concludes, providing a brief summary of the research work as well as a discussion of potential limitations and possible directions for future research.
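The "distance method" described in this abstract, ranking models by how closely the variance-covariance matrix of their simulated data matches that of the actual data, could be sketched roughly as follows. This is an illustrative reading only: the choice of the Frobenius norm as the metric, and the toy data, are assumptions, not the thesis's exact statistic.

```python
import numpy as np

def moment_distance(actual, simulated):
    """Distance between the variance-covariance matrices of actual
    and model-simulated data.

    actual, simulated: (T, k) arrays of k observed series over T periods.
    Returns the Frobenius norm of the difference of the two sample
    covariance matrices (an illustrative choice of metric).
    """
    cov_actual = np.cov(actual, rowvar=False)
    cov_sim = np.cov(simulated, rowvar=False)
    return np.linalg.norm(cov_actual - cov_sim, ord="fro")

def rank_models(actual, simulations):
    """Rank candidate models by distance: smaller = better fit."""
    scores = {name: moment_distance(actual, sim)
              for name, sim in simulations.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Toy example: model_B's simulated data is built to share the actual
# data's second moments, so it should rank first.
rng = np.random.default_rng(0)
data = rng.standard_normal((200, 3))
models = {"model_A": rng.standard_normal((200, 3)),
          "model_B": data + 0.1 * rng.standard_normal((200, 3))}
ranking = rank_models(data, models)
print(ranking[0][0])  # model_B tracks the data's moments more closely
```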
22

Implementation of Taylor-type rules in nascent money and capital markets under managed exchange rates

Birchwood, Anthony January 2011 (has links)
We investigate the practical use of Taylor-type rules in Trinidad and Tobago, which is in the process of implementing market-based monetary policy and seeks to implement flexible inflation targeting in the presence of a managed exchange rate. This is motivated by the idea that normative Taylor rules can be shaped by the practical experience of developing countries. We find that the inflation-exchange rate nexus is strong, so the country may be unwilling to allow the exchange rate to float freely. We contend that despite weak market development the Taylor rule can still be applied, as the central bank is able to use moral suasion to achieve full pass-through of the policy rate to the market rate. Our evidence rejects Galí and Monacelli's (2005) argument that the optimal monetary policy rule for the open economy is isomorphic to that for a closed economy. Rather, our evidence suggests that the rule for the open economy allows for lower variability when it is augmented with the real exchange rate, as in Taylor (2001). We also reject Galí and Monacelli's (2005) hypothesis that domestic inflation is optimal for inclusion in the Taylor-type rule. Instead, we find that core CPI inflation leads to lower variability. Additionally, our evidence suggests that the monetary rule, when applied to Trinidad and Tobago, is accommodating to the US Federal Reserve rate. Further, we extend the work of Martin and Milas (2010), which considered the pass-through of the policy rate to the interbank rate in the presence of risk and liquidity. By extending the transmission to the market lending rate, we are able to go beyond those disruptive factors by considering excess liquidity and spillovers from international economic disturbances. We find that these shocks are significant for Trinidad and Tobago, but not large enough to disrupt the pass-through. As a result, full pass-through is robust to the presence of these disruptive factors.
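A Taylor-type rule augmented with the real exchange rate, in the spirit of Taylor (2001) as cited in this abstract, can be sketched as a simple linear policy function. The coefficient values below are conventional illustrative choices, not estimates from the thesis.

```python
def taylor_rule_open(inflation, output_gap, real_fx_gap,
                     neutral_rate=0.04, inflation_target=0.02,
                     phi_pi=1.5, phi_y=0.5, phi_e=0.25):
    """Open-economy Taylor-type rule (illustrative coefficients).

    Returns the policy rate as the neutral rate plus responses to
    the inflation gap, the output gap, and the deviation of the
    real exchange rate from trend.
    """
    return (neutral_rate
            + phi_pi * (inflation - inflation_target)
            + phi_y * output_gap
            + phi_e * real_fx_gap)

# Inflation 1 pp above target, zero output gap, no FX deviation:
rate = taylor_rule_open(0.03, 0.0, 0.0)
print(round(rate, 4))  # 0.055
```

Setting `phi_e = 0` recovers the closed-economy rule, which is the restriction the thesis's evidence argues against.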
23

Impacts of the Euro Adoption in the Czech Republic

Svačina, David January 2015 (has links)
DSGE models, as structural models, are capable of estimating what would have happened if some part of the economy, or the shocks hitting it, had been different. We consider three such counterfactuals in recent Czech history: no financial shocks during the 2008-2009 crisis; eurozone membership during the 2008-2009 crisis; and no foreign exchange interventions by the Czech National Bank in November 2013. For this purpose, we employ a small open economy DSGE model with financial frictions and estimate it with Bayesian inference. Our results show that the impact of financial shocks on GDP growth was negligible. Further, eurozone membership would have made the crisis more severe: GDP growth in 2009Q1 would have been -6% instead of -3%, and the economy would have been in deflation for five consecutive periods. The difference is explained by the strong depreciation of the exchange rate during the crisis, which would not have occurred under a fixed exchange rate. Lastly, the Czech National Bank's foreign exchange interventions increased GDP growth by as much as 0.8 percentage points and saved the economy from deflation in all following quarters. They worked through the depreciation of the exchange rate and the consequent improvement in the trade balance and increase in the price of imported goods.
24

Essays in macroeconomics

Trabandt, Mathias. January 2007 (has links) (PDF)
Humboldt University Berlin, dissertation, 2007 (not for exchange).
25

The effect of monetary policy on income inequality: the case of Taiwan

范文俞, Fan, Wen Yu Unknown Date (has links)
The main purpose of this paper is to discuss the effect of monetary policy on income inequality using a micro-founded dynamic stochastic general equilibrium model with a credit channel and two groups of households. Following Kumhof, Rancière and Winant (2015), households are divided into two groups; moreover, we follow Benigno and Eggertsson (2016) in adding the credit channel, and Lansing and Markiewicz (2016) in making households supply labor endogenously and in characterizing differences in income sources. We thus build a closed-economy model with two groups of households, goods-producing firms, a credit channel, a taxation policy implemented by the government, and monetary policy implemented by the central bank. We find that contractionary monetary policy and a positive technology shock temporarily worsen income inequality. In the long run, the economy returns to its initial steady state.
26

Competing currencies as an alternative scenario to legal tender clause: Mathematical proof

Gawthorpe, Kateřina January 2013 (has links)
Previous literature examining the scenario without the constraint of legal tender law offers a rather theoretical analysis of the subject. Beyond that theoretical examination of currency competition, this paper offers a dynamic structural macroeconomic model based on money in the utility function. The model compares current monetary conditions with a potential situation in which multiple currencies circulate side by side. The key assumption, that individuals prefer stable currencies, underpins the whole paper and especially the mathematical model. The uniqueness of this model lies in incorporating the variables affecting the respective money demand functions into the utility function of the DSGE model, and in its purpose and setup: the representative agent is a household owning a bank rather than a firm. Overall, the results favor the idea that legal tender law could be abolished in a developed country without severe turmoil. In particular, rising competition among currencies leads to lower inflation than under the present scenario. However, final simulations of the model in Matlab temper this so far "unambiguous" view with skepticism, owing to possible difficulties during the discovery process in such a scenario.
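For illustration only (this is a generic money-in-the-utility-function sketch, not the thesis's exact specification), a setup with n competing currencies might take a form like:

```latex
% Illustrative period utility with n competing currencies:
% consumption c_t plus real balances of each currency i,
% weighted by preference parameters \theta_i.
U_t = \ln c_t + \sum_{i=1}^{n} \theta_i \ln \frac{M_{i,t}}{P_{i,t}},
\qquad \theta_i \ge 0.
```

Under the paper's stability-preference assumption, the weights \(\theta_i\) shift toward currencies with more stable purchasing power, so demand for each currency depends on its expected inflation and competition disciplines issuers toward lower inflation.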
27

ASSESSMENT OF A DERIVATIVE MANAGEMENT POLICY FOR RISK-AVERSE CORPORATIONS: A STOCHASTIC DYNAMIC PROGRAMMING APPROACH

RODRIGO FERREIRA INOCENCIO SILVA 16 June 2020 (has links)
Corporate finance comprises investment, financing and dividend policies aimed at maximizing shareholder value. In particular, the results of commodity producers, and consequently the value to their shareholders, are subject to high volatility resulting from the variation of these products' prices in the global market. However, the risk of this variation can be mitigated by exploiting the broad derivatives market that is generally available for commodities. This work calculates the increase in value that a commodity-producing company can provide to its shareholders through an optimal derivatives management policy based on buying or selling forward contracts. To this end, it seeks to maximize shareholder returns via dividends in a risk-averse setting. The model assumes that the commodity price follows a discrete-state Markov process. Since the model spans several stages, the problem becomes quite complex and a decomposition method is needed to obtain the solution; we therefore used stochastic dual dynamic programming. The results show that by trading forward contracts a company increases the value perceived by shareholders, measured by dividend payments, at any level of risk aversion. The average value increase, considering different levels of risk aversion and an unbiased pricing assumption, exceeds 320 per cent compared with companies that lack access to such instruments. In addition to measuring the value increase, we also analyzed which factors determine the optimal derivatives management policy. That policy turns out to be driven largely by prices, which in turn are associated with the state of the Markov chain at each stage.
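The assumed price process, a discrete-state Markov chain for the commodity price, can be sketched as follows. The three price states and the transition matrix are illustrative assumptions, not the thesis's calibration.

```python
import numpy as np

# Illustrative three-state Markov chain for a commodity price:
# low, medium and high price levels, with a persistence-heavy
# transition matrix (each row sums to 1).
prices = np.array([40.0, 60.0, 80.0])
P = np.array([[0.70, 0.20, 0.10],
              [0.15, 0.70, 0.15],
              [0.10, 0.20, 0.70]])

def simulate_prices(T, start_state=1, seed=0):
    """Simulate T periods of the commodity price chain."""
    rng = np.random.default_rng(seed)
    state = start_state
    path = []
    for _ in range(T):
        path.append(prices[state])
        state = rng.choice(3, p=P[state])
    return np.array(path)

path = simulate_prices(12)
print(path[0])  # starts in the medium-price state, 60.0
```

Scenario trees built from such a chain are what a stochastic dual dynamic programming solver would decompose stage by stage when optimizing the forward-contract position.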
28

Essays on House Prices and Consumption

Song, In Ho 27 July 2011 (has links)
No description available.
29

History of recent developments in macroeconomic modeling: from Robert Lucas to dynamic stochastic general equilibrium (DSGE) models

Sergi, Francesco 24 March 2017 (has links)
This dissertation provides a history of macroeconomic modeling practices from Robert E. Lucas's work in the 1970s up to today's dynamic stochastic general equilibrium (DSGE) approach. Working from a historical perspective, I suggest that the recent rise of DSGE models should be characterized as a compromise between opposing views of modeling methodology: on the one hand, the real business cycle (RBC) view; on the other, the new Keynesian view. To justify this claim, my work provides an epistemological reconstruction of the recent history of macroeconomics, building on an analysis of the criteria defining the validity and pertinence of a model.
My assumption is that recent macroeconomic modeling practices can be described by three distinctive methodological criteria: the internal validity criterion (which establishes the consistency between a model's assumptions and the concepts and formalisms of a theory), the external validity criterion (which establishes the consistency between a model's assumptions and results and the real world, as well as the quantitative methods needed to assess such consistency) and the hierarchization criterion (which establishes the preference for internal over external validity, or vice versa). This epistemological reconstruction draws primarily from the literature about models in the philosophy of science. My work aims to make four contributions to the history of recent macroeconomics. (1) To understand the rise of DSGE models without relying on the explanation provided by macroeconomists themselves, who tend to think that macroeconomics evolved through theoretical consensus and exogenous technical progress. By distancing itself from this perspective, my work draws attention to the disruptive character of methodological controversies and to the interdependence between theoretical activity and the development of statistical and econometric methods. (2) To overcome the existing divide between the history of macroeconomic theories and the history of quantitative methods. Through its epistemological perspective, my work reconciles these two historiographies and specifies the basis for a comprehensive understanding of recent developments in macroeconomics. (3) To put the accent on the external validity condition as the main controversial issue separating different views of macro-modeling methodology. Furthermore, I illustrate how the debate about external validity is closely related to the problem of causal explanation and, finally, to the conditions for providing economic policy evaluation. (4) To characterize the DSGE approach: although DSGE models are often presented as a "synthesis" or a "consensus", they are better described as a shaky compromise between two opposing methodological visions.
30

DSGE Estimation using Generalized Empirical Likelihood and Generalized Minimum Contrast

Boaretto, Gilberto Oliveira 05 March 2018 (has links)
The objective of this work is to investigate the performance of moment-based estimators from the generalized empirical likelihood (GEL) and generalized minimum contrast (GMC) families in the estimation of dynamic stochastic general equilibrium (DSGE) models, focusing on robustness under misspecification, which is recurrent in this type of model. As benchmarks we used the generalized method of moments (GMM), maximum likelihood (ML) and Bayesian inference (BI). We work with a real business cycle (RBC) model that can be considered the core of DSGE models: it presents similar difficulties and, having fewer parameters, makes the results easier to analyze. Via Monte Carlo experiments, we verify whether the studied estimators deliver satisfactory results in terms of mean, median, bias, mean squared error and mean absolute error, and we examine the distribution of the estimates generated by each estimator. Among the main results: (i) the empirical likelihood (EL) estimator (as well as its version with smoothed moment conditions, SEL) and Bayesian inference (BI) obtained, in that order, the best performances, including under misspecification; (ii) the continuous updating empirical likelihood (CUE), minimum Hellinger distance (HD) and exponential tilting (ET) estimators, and their smoothed versions, showed intermediate comparative performance; (iii) the performance of the exponentially tilted empirical likelihood (ETEL) and exponential tilting Hellinger distance (ETHD) estimators, and their smoothed versions, was seriously compromised by the occurrence of atypical estimates; (iv) the smoothed and non-smoothed versions of the GEL/GMC estimators performed very similarly; (v) GMM, especially in the over-identified case, and ML performed considerably below most of their competitors.
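As a point of reference for the benchmark method, here is a minimal GMM estimation on a toy problem (estimating a mean and variance from two moment conditions with an identity weighting matrix). This is a generic sketch of the method, not the thesis's DSGE setup or its moment conditions.

```python
import numpy as np
from scipy.optimize import minimize

def gmm_objective(theta, data):
    """Quadratic-form GMM objective with identity weighting.

    Moment conditions for theta = (mu, sigma2):
      E[x - mu] = 0,  E[(x - mu)^2 - sigma2] = 0.
    """
    mu, sigma2 = theta
    g = np.array([np.mean(data - mu),
                  np.mean((data - mu) ** 2 - sigma2)])
    return g @ g

# Toy data with known mean 1.0 and variance 4.0.
rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=5000)
result = minimize(gmm_objective, x0=[0.0, 1.0], args=(data,),
                  method="Nelder-Mead")
mu_hat, sigma2_hat = result.x
print(round(mu_hat, 2), round(sigma2_hat, 2))  # close to 1.0 and 4.0
```

GEL estimators such as EL and ET replace this quadratic form with a divergence between the empirical distribution and a reweighted one satisfying the moment conditions, which is what drives their differing robustness in the thesis's experiments.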
