291

Monetary policy coordination between the United States and the Euro Area : an application of indirect inference to a two-country DSGE model

Hong, Yuqun January 2013 (has links)
Calls for monetary policy coordination have increased as intensified macroeconomic interdependence heightens conflicts of interest between economies, especially following the recent crisis. Yet the literature has not reached a consensus on whether monetary policy coordination is welfare-improving. This thesis takes another perspective: it assesses the real-world existence and extent of monetary policy coordination associated with economic interdependence between the United States (US) and the Euro Area (EA), and investigates how international transmission changes in the presence of coordination. Monetary policy coordination is represented by direct responses of monetary policy instruments to contemporaneous and lagged values of the real exchange rate. By using the method of indirect inference, this research also incorporates historical data into in-sample evaluation and estimation of the Dynamic Stochastic General Equilibrium (DSGE) model. Beginning with indirect inference evaluations of a two-country DSGE model of the US and EA, it is found that models with coordination generally outperform their non-coordination counterparts, indicating the existence of coordination. The real exchange rate is at the heart of this improvement in the model's efficacy: coordination models replicate real exchange rate dynamics and volatility far better than the non-coordination model, even though the real exchange rate remains a source of relatively poor model performance overall. An extensive indirect inference estimation confirms the existence of monetary coordination, since a partial-coordination model outperforms the non-coordination model markedly. Both the US and EA economies exhibit moderate to high levels of monetary coordination. These features improve the model's performance, particularly in terms of the dynamics of US time series, the volatility of EA time series, and both the dynamics and volatility of the real exchange rate. Impulse responses and variance decompositions reveal substantial cross-country spillovers relative to the non-coordination case.
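The coordination mechanism described above, policy instruments responding directly to the real exchange rate, can be pictured with a stylised interest-rate rule. This is only a sketch for orientation; the coefficients, lag structure and shock term are illustrative assumptions, not the thesis's actual specification.

```latex
% Stylised coordination-augmented policy rule (illustrative assumptions only):
% i_t = policy rate, pi_t = inflation, y_t = output gap,
% q_t = US-EA real exchange rate, eps_t = monetary policy shock.
\[
  i_t = \rho\, i_{t-1} + \phi_\pi \pi_t + \phi_y y_t
        + \phi_q q_t + \phi_{q,1} q_{t-1} + \varepsilon_t
\]
% Setting \phi_q = \phi_{q,1} = 0 recovers the non-coordination benchmark.
```

Each country's rule would carry its own coefficients, and comparing the restricted (no exchange-rate response) and unrestricted versions of the model is, at a model-wide level, what the indirect inference evaluation described in the abstract does.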
292

The Taylor principle and the Fisher relation in general equilibrium

Davies, Ceri Rees January 2013 (has links)
This thesis presents a structural framework which accounts for two key empirical phenomena in monetary economics: the ‘Taylor principle’ and the ‘Fisher relation’. The former suggests that there exists a greater-than-proportional relationship between the nominal interest rate and inflation in the short run, and the latter implies that a one-for-one relationship holds at lower frequencies. Although these relationships do feature in the ubiquitous, ‘cashless’ New Keynesian framework, it has been suggested that monetary variables are required in order to render this model ‘complete’ (e.g. Nelson, 2008a). Chapter I demonstrates that an ‘implicit’ interest rate rule can be derived as a general equilibrium condition of models in which the central bank adheres to a money growth rule. Chapter II compares the equilibrium condition of a standard cash-in-advance model to the interest rate rule of Taylor (1993) for a post-war sample of U.S. data. However, we demonstrate that in order to replicate the Taylor principle, the underlying model must be generalised to allow the velocity of money to vary. We use the model of Benk et al. (2008, 2010) to do so and show analytically that the resulting ‘implicit rule’ features the requisite greater-than-proportional relationship. Chapter III applies standard econometric techniques to simulated data obtained from the Benk et al. model, and the resulting estimates support this theoretical prediction. Chapter IV establishes that the Fisher relation emerges when low frequency trends in the simulated data are retained and under a related ‘long-run’ implicit rule. Chapter IV also considers the post-war sample of U.S. data analysed in Chapter II. While disparate empirical literatures have obtained evidence for both the Taylor principle and the Fisher relation, we show that these results can be obtained from a unified theoretical framework. Several restricted empirical specifications further suggest that standard interest rate rules which omit monetary variables might provide biased coefficient estimates.
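For orientation, the two relationships the abstract contrasts can be written in textbook form. This is a generic sketch, not the implicit rule derived in the thesis.

```latex
% Taylor principle (short run): the policy rate responds more than
% one-for-one to inflation in a rule of the form
\[
  i_t = r^{*} + \phi_\pi \pi_t + \phi_y \tilde{y}_t , \qquad \phi_\pi > 1 .
\]
% Fisher relation (low frequency): with a roughly constant real rate r,
% the nominal rate moves one-for-one with expected inflation.
\[
  i_t \approx r + \mathbb{E}_t \pi_{t+1} .
\]
```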
293

Financial development and growth : testing a dynamic stochastic general equilibrium model via indirect inference

Raoukka, Katerina January 2013 (has links)
Macroeconomic research has made a quantum leap in the past decade in establishing a new workhorse model for open-economy analysis. The distinctive feature of this literature is the introduction of the financial system into a dynamic general equilibrium (DGE) model based on microfoundations. Its introduction in a DGE model is essential to explain empirical facts such as growth differences across countries. The aim of this thesis is to show whether the behavior of growth can be explained by financial development within a classical approach. The model's ability to explain growth by setting financial development as a causal factor is tested against its ability to explain growth by setting human capital as a causal factor. The question proposed and answered in this thesis is the following: can an increase in productivity be produced by a development in the financial system or in the educational system, and if so, is growth determined by this increase in productivity? The empirical performance of DSGE models is under scrutiny by researchers. This thesis introduces the reader to a fairly new and unfamiliar testing procedure, indirect inference, which is fully explained and applied. The idea of the thesis is to provide a better-identified model than the established econometric models of the financial development and growth nexus. The procedure is first to set up a well-established microfounded model, and then to connect it to the data by establishing the time-series properties of various macroeconomic variables. The results, based on 10 sample countries, indicate that setting financial development as a causal factor explains the data behavior of macroeconomic variables better than a model which considers human capital as the driver of economic growth.
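For readers unfamiliar with indirect inference, the core of the test is to estimate a simple auxiliary model on the actual data and ask whether that estimate is plausible under repeated simulation of the structural model. The sketch below illustrates the logic with an AR(1) auxiliary model and a placeholder simulator; all names are illustrative assumptions, and a real application would use a richer auxiliary VAR and the calibrated or estimated DSGE model rather than this toy.

```python
import numpy as np

def ar1_coeffs(y):
    """Auxiliary model: OLS intercept and slope of an AR(1) fitted to series y."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta

def indirect_inference_wald(actual, simulate_model, n_sims=500, seed=0):
    """Wald test: is the auxiliary estimate on the actual data plausible under
    the distribution of auxiliary estimates from model-simulated samples?"""
    rng = np.random.default_rng(seed)
    a_hat = ar1_coeffs(actual)
    sims = np.array([ar1_coeffs(simulate_model(rng)) for _ in range(n_sims)])
    mean = sims.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(sims.T))
    wald = (a_hat - mean) @ cov_inv @ (a_hat - mean)
    sim_walds = np.array([(s - mean) @ cov_inv @ (s - mean) for s in sims])
    return wald, float(np.mean(sim_walds >= wald))   # statistic, bootstrap p-value

def simulate_model(rng, n=200, rho=0.9):
    """Placeholder 'structural model': here simply an AR(1) simulator."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + rng.normal()
    return y

actual_data = simulate_model(np.random.default_rng(42))   # stand-in for real data
wald_stat, p_value = indirect_inference_wald(actual_data, simulate_model)
print(wald_stat, p_value)
```

A small bootstrap p-value would indicate that the structural model cannot reproduce the time-series properties of the data as summarised by the auxiliary model.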
294

Examining the interrelationships between the four stages of customer loyalty : a mixed method approach

El-Manstrly, Dahlia January 2010 (has links)
Customer loyalty has received enormous attention from academics and practitioners alike. There is growing evidence that keeping existing customers costs less than attracting new ones. Furthermore, loyal customers are expected to pay more, spend more and act as advocates for a particular service organisation. However, despite the universal agreement on the benefits of customer loyalty, no consensus has yet been reached on how or why this phenomenon occurs. This lack of clarity in conceptualising, and in turn operationalising, the concept has led to an incomplete understanding of the phenomenon and an inability to answer the question of why satisfied customers switch. Therefore, research is needed to shed more light on service loyalty development. This study will assist service managers and marketers in focusing on effective marketing strategies for building and sustaining service loyalty to maximise the return on their investment. Particularly in this time of recession, service managers need to take greater care with regard to decision making and expenditure. To achieve the aim of this study, an embedded three-sequential mixed-method research design was adopted. First, in-depth interviews were used to understand service loyalty in the UK retail service industry and to justify the inclusion or exclusion of loyalty variables. Second, two factorial experiments were conducted to test the cause-and-effect links between cognitive and affective loyalty and between affective and conative loyalty; the moderating effect of switching costs on these two links was then examined. Third, two surveys were distributed to a random sample of retail customers to test the full conceptual model and to assess the generalisability of the results across high- versus low-employee-contact, customisation and personalisation service types. Multigroup analysis was also used to test the moderating effect of customer characteristics (e.g. age, gender and education) and market characteristics (e.g. switching costs) on the links between the loyalty stages. The main findings of the study are: first, service loyalty develops in a sequential manner: cognitive loyalty → trust → affective loyalty → commitment → conative loyalty → action loyalty. Second, trust partially mediates the link between cognitive and affective loyalty. Third, unexpectedly, a significant mediating effect of calculative commitment on the link between affective and conative loyalty is found only when the moderating effect of education is examined. Fourth, the links between the loyalty stages are equally strong for high- and low-contact service customers as well as for younger and older customers. Fifth, education, gender and switching costs have a significant moderating effect on the early, but not the later, stages of loyalty. This study offers a rare test of the Theory of Planned Behaviour within a non-contractual customer-service context. It augments existing research validating Oliver's (1997) loyalty model by proposing a mediating effect of trust and commitment and a moderating effect of service type, switching costs, gender, age and education. The main conclusions of this study are: first, building and sustaining service loyalty is a complex strategic management objective. Second, understanding the key factors that drive service loyalty and the conditions that either enhance or hinder its development is essential for a complete understanding of the phenomenon.
The overarching managerial implication of this study is therefore to provide service managers with a segmentation tool and a framework for building attitudinal loyalty and reaching behavioural loyalty goals. Guided by the results of this study, service managers will be better informed in selecting the most effective marketing strategy for a particular group of customers and in identifying and moving customers with varying degrees of loyalty along the loyalty ladder. Fruitful areas for future research include examining the conceptual model across different service, culture and industry contexts. Moreover, although this study strives to capture the dynamic nature of the construct, future research using a longitudinal design would be needed to test the relationships in the full conceptual model over time. Finally, collecting objective data or data from different sources would also be useful to control for the possible effect of common method variance on coefficient estimates in survey research.
295

Large data sets and nonlinearity : essays in international finance and macroeconomics

Kim, Hyeyoen January 2009 (has links)
This thesis investigates whether the information in large macroeconomic data sets is relevant for resolving some of the puzzling and questionable aspects of international finance and macroeconomics. In particular, we employ diffusion index (DI) analysis in order to condense very large data sets into a small number of factors. Applying these factors within conventional model specifications addresses the following main issues. Using factor-augmented vector autoregressive (FAVAR) models, we measure the impact of UK and US monetary policy. This approach notably mitigates the ‘price puzzle’ for both economies, whereby a monetary tightening appears to have perverse effects on price movements. We also estimate structural FAVARs and examine the impact of aggregate-demand and aggregate-supply shocks using a recursive long-run multiplier identification procedure. This method is applied to examine the evidence for increased UK macroeconomic flexibility following the UK labour market reforms of the 1980s. For forecasting purposes, factors are employed as ‘unobserved’ fundamentals which direct the movement of exchange rates. From the long-run relationship between factor-based fundamentals and the exchange rate, the deviation from the fundamental level of the exchange rate is exploited to improve the predictive performance of the fundamental model of exchange rates. Our empirical results provide strong evidence that factors help to predict exchange rates as the horizon lengthens, outperforming both the random walk and the standard monetary fundamental models. Finally, we explore whether allowing for a wide range of influences on the real exchange rate in a nonlinear framework can help to resolve the ‘PPP puzzle’. Factors, as determinants of the time-varying equilibrium of real exchange rates, are incorporated into a nonlinear framework. Allowing for the effects of macroeconomic factors dramatically increases the measured speed of adjustment of the real exchange rate.
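As a concrete illustration of the diffusion-index approach described above, the sketch below extracts principal-component factors from a large standardised panel and uses them in a direct h-step predictive regression. The data, dimensions and function names are toy assumptions, not the thesis's data or code.

```python
import numpy as np

def extract_factors(X, n_factors):
    """Principal-component factor estimates from a T x N panel of series."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)           # standardise each series
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return U[:, :n_factors] * S[:n_factors]             # T x n_factors factors

def factor_forecast(X, y, n_factors=3, horizon=12):
    """Direct h-step forecast: regress y_{t+h} on current factors, then project."""
    F = extract_factors(X, n_factors)
    Fh, yh = F[:-horizon], y[horizon:]
    W = np.column_stack([np.ones(len(Fh)), Fh])
    beta, *_ = np.linalg.lstsq(W, yh, rcond=None)
    latest = np.concatenate([[1.0], F[-1]])
    return latest @ beta                                 # point forecast of y_{T+h}

# Toy example: 200 periods, 120 series driven by 3 common factors plus noise.
rng = np.random.default_rng(1)
common = rng.normal(size=(200, 3))
panel = common @ rng.normal(size=(3, 120)) + rng.normal(scale=0.5, size=(200, 120))
target = common[:, 0] + rng.normal(scale=0.2, size=200)  # stand-in for FX changes
print(factor_forecast(panel, target, n_factors=3, horizon=12))
```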
296

Essays in time series analysis : modelling stochastic volatility and forecast evaluation

Malik, Sheheryar January 2009 (has links)
No description available.
297

Economistic fallacies in contemporary capitalism : a Polanyian analysis of regimes of marketised social protection

Holmes, Christopher January 2010 (has links)
Karl Polanyi used the notion of economistic fallacy in order to flag up the way in which formal definitions of the economy – rooted in the assumption of economising, self-interested market behaviour – were routinely applied as universal and rational by economists, political scientists, policy makers and in general public discourse. This thesis is a critical re-application of the notion of economistic fallacy in theoretical, historical and contemporary perspective. I argue that, although Polanyi’s broad generalisations are unsuitable for contemporary analysis, the same basic type of fallacy can be observed in various specific policy settings. Roughly speaking, the thesis comprises two halves. In the first, I focus on theoretical matters, arguing for a consideration of Polanyi specifically as a political economist of ideas. This, I argue, gets us closer to some of Polanyi’s most interesting analytical intentions whilst freeing us from some of the apparent ontological contradictions latent in his various texts. From there, I develop Polanyi’s insights on the role of ideas in capitalist development, foregrounding the notion of economistic fallacy as a key conceptual device. In the second half of the thesis, I apply this analysis over three case studies, one on global financial regulation, one on climate change and one on welfare provision in the UK. These areas are chosen as contemporary reflections of the three ‘fictitious commodities’ that Polanyi identified as uniquely important loci of economistically fallacious logics, namely money, land and labour. In each case, I note how specific versions of economistic fallacy have guided policies that aim to deliver forms of social protection via market mechanisms and market actors – what I call ‘marketised social protection’. This is in distinction to the straightforward (often state-led) societal self-protection that Polanyi and latter-day Polanyians have typically focused upon. I argue that the policies discussed are economistically fallacious to the extent that they rely on unrealistic, overly rationalist assumptions about the nature of society, the natural environment and people, respectively. I show instead how the dynamics of capital accumulation that such regimes serve to legitimate and protect – dynamics that I refer to as forms of ‘market self-protection’ – act to continually undermine the success of such policy programmes. This, I argue, is a distinctive tension in the ideational and material landscape of contemporary capitalism.
298

Public procurement auctions in Brazil

Szerman, Dimitri January 2012 (has links)
This thesis provides an empirical analysis of data generated by ComprasNet, the online procurement bidding platform developed and used by the Brazilian federal government. ComprasNet is a large bidding platform used since 2001 by more than 2,200 public purchasing units, which list around one million lots each year. Over 70,000 unique bidders have participated in these auctions. In 2010, 46 percent of all procurement for the federal government was conducted through ComprasNet, totaling R$ 27 billion, or 0.7 percent of Brazil’s GDP. In short, these auctions represent a large share of federal tenders and a substantial amount is contracted through them each year. Chapter 1 provides an overview of ComprasNet. After reviewing the literature on the various topics to which this dissertation contributes, I describe the institutional background surrounding ComprasNet. I then present the baseline data used throughout the remainder of this dissertation. Chapter 2 addresses one important aspect of designing an online ascending auction, namely how to end the auction. ComprasNet varied its ending rules over time, providing a unique opportunity to test theories of bidder behaviour and to assess the impact of ending rules on auction outcomes. Chapter 3 analyses a two-stage auction format which ComprasNet uses. Two-stage designs have long been proposed by the theoretical literature, but there are virtually no empirical works apart from experimental studies. Finally, chapter 4 analyses a bid preference programme targeted at small and micro enterprises (SMEs). The programme consists of setting aside eligible lots for SMEs. We first use eligibility rules as a source of exogenous variation in the treatment assignment to estimate the effects of the programme on auction outcomes. We then set up an open auction model with endogenous entry and asymmetric bidders and estimate the model’s primitives. In particular, we estimate entry costs, which we interpret as red tape costs.
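To give a flavour of why ending rules matter for the question studied in Chapter 2, the toy simulation below contrasts a known (hard) close with a random close in a simple reverse (procurement) auction where some bidders plan to bid only near the scheduled end. It is a deliberately crude illustration under invented assumptions, not a model from the thesis.

```python
import numpy as np

def run_reverse_auction(costs, end_round, snipers, n_rounds=100, start_price=100.0):
    """Toy reverse (procurement) auction: each round, an active bidder undercuts
    the standing price by 1 whenever the new price still covers its cost.
    'Snipers' plan to bid only in the last 5 scheduled rounds, a plan that
    backfires when the auction actually closes at an earlier random round."""
    price, winner = start_price, None
    for t in range(min(end_round, n_rounds)):
        for i, c in enumerate(costs):
            active = (not snipers[i]) or t >= n_rounds - 5
            if active and price - 1.0 >= c:
                price -= 1.0
                winner = i
    return price, winner

rng = np.random.default_rng(7)
costs = rng.uniform(40, 80, size=5)          # bidders' private costs
snipers = rng.random(5) < 0.6                # which bidders plan to bid late

hard_close = run_reverse_auction(costs, end_round=100, snipers=snipers)
random_close = run_reverse_auction(costs, end_round=int(rng.integers(60, 95)),
                                   snipers=snipers)
print("hard close (price, winner):", hard_close)
print("random close (price, winner):", random_close)
```

Under the hard close the price is driven down towards the lowest cost, while under the random close late bidders may never bid, leaving a higher procurement price; comparing such outcomes across ending-rule regimes is the spirit of the empirical exercise.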
299

Essays on markets with frictions : applications to the housing, labour and financial markets

Ungerer, Christoph January 2012 (has links)
The classical treatment of market transactions in economics presumes that buyers and sellers engage in transactions instantly and at no cost. In a series of applications in the housing market, the labour market and the market for corporate bonds, this thesis shows that relaxing this assumption has important implications for Macroeconomics and Finance. The first chapter combines theory and empirical evidence to show that search frictions in the housing market imply a housing liquidity channel of monetary policy transmission. Expansionary monetary policy attracts buyers to the housing market, raising housing liquidity. Higher housing sale rates in turn allow lenders to threaten foreclosure more effectively, because the expected carrying costs on foreclosure inventory are lower. Ex-ante, this makes banks willing to offer larger loans, stimulating aggregate demand. The second chapter uses a heterogeneous firm industry model to explore how the macroeconomic response to a temporary employer payroll tax cut depends on the hiring and firing costs faced by firms. Controversially, the presence of non-convex labour adjustment costs suggests that tax cuts create fewer jobs in recessions. When firms hoard labour during downturns, they do not respond to marginal tax cuts by hiring additional workers. The third chapter develops a theory in which trader career concerns generate an endogenous transaction friction. Traders are reluctant to sell assets below historical purchase price, since realizing a loss signals to the employer that the trader is incompetent. The chapter documents empirically several properties of corporate bond transaction data consistent with this theory of career-concerned traders.
300

Essays in applied macroeconomic theory : volatility, spreads, and unconventional monetary policy tools

Vega, Hugo January 2012 (has links)
This thesis contains three essays that employ macroeconomic theory to study the implications of volatility, financial frictions and reserve requirements. The first essay uses an imperfect information model, in which agents solve a signal extraction problem, to study the effect of volatility on the economy. A real business cycle model where the agent faces imperfect information regarding productivity is used to address the question. The main finding is that the variance of the productivity process components has a small negative short-run impact on the economy's real variables. However, imperfect information dampens the effects of volatility associated with permanent components of productivity and amplifies the effects of volatility associated with transitory components. The second essay presents a partial equilibrium characterization of the credit market in an economy with partial financial dollarization. Financial frictions (costly state verification and banking regulation restrictions) are introduced, and their impact on lending and deposit interest rates denominated in domestic and foreign currency is studied. The analysis shows that reserve requirements act as a tax that leads banks to decrease deposit rates, while the wedge between foreign and domestic currency lending rates is decreasing in exchange rate volatility and increasing in the degree of correlation between entrepreneurs' returns and the exchange rate. The third essay introduces an interbank market with two types of private banks and a central bank into a New Keynesian DSGE model. The model is used to analyse the general equilibrium effects of changes to reserve requirements, while the central bank follows a Taylor rule to set the policy interest rate. The essay shows that changes to reserve requirements have similar effects to interest rate hikes and that both monetary policy tools can be used jointly in order to avoid large swings in the policy rate or hitting the zero bound.
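The claim that reserve requirements act like a tax can be illustrated with a stylised zero-profit condition for a competitive bank that must hold an unremunerated fraction θ of deposits as reserves. This is a back-of-the-envelope sketch under simplifying assumptions, not the model in the essay.

```latex
% A competitive bank lends only the non-reserved share (1 - theta) of each
% deposit unit at rate r_L and pays depositors r_D; zero profits then imply
\[
  (1-\theta)\, r_L = r_D \quad\Longrightarrow\quad r_D = r_L - \theta\, r_L .
\]
```

A higher reserve ratio θ thus lowers the deposit rate for a given lending rate, widening the intermediation wedge, which is consistent with the abstract's result that raising reserve requirements tightens conditions much as an interest-rate hike would.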
