341
An empirical examination of bilateral seaborne trade flows in the world economy. Kavussanos, Manolis George, January 1992.
The aim of this thesis is to construct a disaggregated econometric model of the pattern of bilateral seaborne trade flows. Commodities are classified into 5 categories according to the type of ship used in their transportation. Exports and imports are classified into 30 regions, according to the major sea-lanes used by ships. An understanding of the determinants of trade flows at this level of disaggregation is important for shipowners. The use of disaggregated data also helps in the estimation of the price elasticities of traded goods, an issue of more general interest to exporters and policy makers. Our theoretical model borrows the ideas of multistage budgeting from consumer demand theory. The total imports of each importing region are allocated amongst their trade partners, depending on relative prices and trends in tastes. Our econometric implementation of the model uses the very general Constant Ratio of Elasticities of Substitution Homogeneous (CRESH) functional form. This encompasses the CES, LES, Cobb-Douglas and Leontief forms more commonly used in trade models. Empirical implementation of the model has resulted in elasticity estimates which are much higher than those estimated in earlier trade models. This indicates a high degree of competition in international markets. The pattern of these elasticities suggests that importing regions establish a few trade partners internationally for the main bulk of their imports, while the proportion of their imports allocated to the remaining trade partners is highly sensitive to relative prices.
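As an illustrative aside, not drawn from the thesis itself, the CES aggregator that CRESH generalises can be written in standard textbook notation; the symbols below are generic, and the limiting-case remarks are standard results rather than claims about the thesis's specification:

```latex
% CES aggregation of imports m_i over trade partners i = 1..n (illustrative).
% sigma = 1/(1 - rho) is the common elasticity of substitution between partners.
% Cobb-Douglas arises in the limit rho -> 0, Leontief as rho -> -infinity.
\begin{equation}
  M = \Bigl( \sum_{i=1}^{n} a_i \, m_i^{\rho} \Bigr)^{1/\rho},
  \qquad \sigma = \frac{1}{1 - \rho} .
\end{equation}
```

Broadly, CRESH relaxes the single common elasticity of substitution implied by this form, which is what allows the estimated sensitivity to relative prices to differ across trade partners.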
342
Marx and modern value theory. Wilson, David William, January 2000.
Marx's value theory is standardly interpreted as a theory of price: qualitatively, as a kind of materialist sociology, concerned to show how money, markets and prices, in a developed form, are the mere reflux of a particular mode of production, i.e. capitalism; and quantitatively, how the configuration of money prices may be related to the proportions of social labour-time expended in the various sectors of the economy. According to this view, Marx's value-project is foundational for the understanding of the laws of motion of capitalist society, a motion which is thence shown to rely on the performance and appropriation of surplus labour under capitalist relations of production. As such, human activity is taken to serve an alien will and intelligence, namely, that of capital. Apart from the last sentence (though, admittedly, this in itself is an important caveat) there is little in this standard reading of Marx's value-project to distinguish its orientation from that of the value-moderns. Moreover, in making the wage-contract and its performance the very site of the deformation of human purpose - in intimating that the imposition of a purpose alien to a properly human form of activity comes with the wage contract - any distinction between Marx and value-modernity is all but obliterated. I will argue below that, to avoid doing violence to his value-project, Marx's labour theory of value needs to be interpreted in such a way that labour is explicitly understood as a systematically constituted form of self-limiting activity. My aim is to show the various forms that this 'activity of alienation' takes and how the key value-phenomena, organised markets, prices and money, are implicated in this process of self-limitation.
343
The impact of the internal organisational environment on NSD knowledge management and NSD performance. Kelly, David T., January 2000.
Building on a diverse stream of literature, knowledge strategy and NSD in particular, the theory investigated in this research is that service firms can use knowledge strategy to improve the innovation performance of their business. Although scholars are now beginning to organise their research agendas around a set of explicit hypotheses concerning the causes and effects of knowledge-intensive environments, knowledge management has only recently emerged as an explicit area of pursuit for managing firms. This research empirically investigated knowledge management activity in the context of NSD, and as such provides insight into a subject area previously lacking in rigorous empirical studies. Previous research in the field of NSD has tended to concentrate its focus on the financial services market. In contrast, this research drew its sample from a wider population and identified that many of the findings appear to be generalisable across a number of business markets, in both a consumer and a business context. In adopting a knowledge-based view of a service business, this research conceptualised the notion of an NSD Knowledge Environment (NKE) to represent the way in which knowledge supports the business's ability to develop new services across an NSD programme. The nine unique bundles of resources which comprise the NKE (knowledge depth; knowledge dispersion; NSD memory; personal interaction; climate of learning; creative climate; entrepreneurial climate; collaborative climate; goal climate) were found to be capable of yielding sustainable, above-normal business performance. The NKE was discovered to have a significant impact on the service business's overall NSD programme performance across four distinct measures: financial; new opportunities; customer responsiveness; innovation. Whilst service firms were found to be aware of the importance of knowledge resources to their business, few had embraced a business-wide framework for managing particular knowledge assets. This research therefore indicates the importance of addressing the need for a knowledge management framework targeted specifically at NSD success. Whilst the discovery that the NKE has a significant impact on the service business's overall ability to develop innovative new products supports the findings of previous research, interestingly, this research identified a multi-dimensional concept of innovativeness, comprising both measures of innovative outcomes and innovative processes. The more innovative NSD programmes were found to be more successful on many other performance dimensions (financial and non-financial) than their counterparts, i.e. innovative NSD programmes were more successful than their competition on all dimensions, as well as having a very high percentage of sales and profits originating from new services introduced in the last three years. The overall implication of this research is that if a firm's scarce resources are the source of improved economic performance, it follows that supportive knowledge practices and a supportive internal knowledge environment must be created to ensure these assets are leveraged successfully.
344
International joint venture negotiation behaviour outcome: the role of bargaining power, culture and trust: qualitative case studies. Skuna, Jiraphan, January 2000.
Most of the literature on joint ventures (JVs) in developing countries has been viewed from the perspective of the foreign partners, ignoring the strategic imperatives and goals of the host country partners. Additionally, there has been very little research on international joint ventures (IJVs) in Thailand. Therefore, a study of the relationships between bargaining power, trust and culture affecting negotiation behaviour and outcomes (JV performance) could clarify and complement the results of past studies. It could lead foreign and Thai investors to better understand what they should do before and after entering into JVs, so as to achieve an effective performance (outcome), the success of the IJV, cost minimization and profit maximization. This study examines the variables mentioned above in the context of service industries (e.g. construction, leasing, gas distribution), within the confines of joint venture theory and negotiation theory. The data was gathered using both questionnaires and in-depth interviews with a number of MDs and senior managers of JV firms. Both Thai and foreign parents were interviewed where possible. The result of the study shows that relationships between bargaining power, trust, culture, negotiation behaviour and performance (outcome) exist. A significant external factor affecting JV performance was also identified. Case studies were used as a research strategy for this study. 'Pattern matching logic' and 'explanation building' techniques were used for the analysis of data. In addition, a data display technique was added to offer a clear understanding and picture of the results of the study. Regarding JV management and negotiation, this study demonstrated that each JV partner should devote attention and time to supporting the development of mutual trust and cultural understanding, in order to avoid conflict and enhance successful JV performance. This study also revealed the effect of bargaining power, trust and culture on JV performance, mediated by negotiation tactics. This has received little attention in previous studies.
345
Ten years of challenge: the impact of the external environment on charity fundraising and marketing 1989-1999. Kay-Williams, Susan Margaret, January 2001.
Change happens constantly, but sometimes several factors coincide, becoming a catalyst for major change. At the beginning of the 1990s there were indications that this was the case for fundraisers. This research begins by going behind the headlines to examine the external environment between 1989 and 1994 and its impact on voluntary fundraising. The research uses a qualitative approach and a grounded theory methodology to examine the changing environment as experienced by 30 heads of fundraising from charities of all sizes. Was the external environment making fundraising more difficult? If so, how were fundraisers responding, and what strategies and structures were they adopting? Did these include marketing? From the research, one expectation, that charities with large voluntary incomes would have replica fundraising departments, was found not to be the case. Further investigation showed that there was a life cycle for fundraising which was not necessarily in step with the charity as a whole. This discovery through the grounded theory approach led to the five stages of fundraising, a framework for the development of charity fundraising. The framework identified a number of variables and criteria but also used organisation culture theory to contextualise fundraisers' comments. The framework helped to make sense of some charities' unexpected responses to the questions. Writing up was finished in 1999. A short longitudinal study was then added to compare the impact of the external environment in the second half of the 1990s with that of the first half. The longitudinal study also enabled further testing of the five stages theory to assess its ongoing validity and wider relevance to the sector. Therefore, this research comprises three elements: the original study, the five stages framework and the longitudinal study. It explores the full impact of external changes on fundraising and how fundraisers have responded through the decade.
346
An examination into the structure of freight rates in the shipping freight markets. van Dellen, Stefan, January 2011.
This thesis investigates three salient areas of interest in the structure of freight rates in the shipping market, with a particular focus on the tanker and dry-bulk sectors, using recent econometric and time series techniques. The questions asked are: 1) do spot freight rate levels follow a fractionally integrated process, as opposed to being stationary or non-stationary, as had previously been proposed; 2) does spot freight rate volatility also follow a fractionally integrated process; and 3) do freight rates exhibit conditional skewness and kurtosis? It then evaluates the impact that these factors have on the risk exposure of market participants. These concepts are further tested in terms of their respective forecasting performance, relative to other more standard econometric techniques. An ongoing issue in the shipping literature is whether spot freight rate levels follow a stationary or non-stationary process. This thesis provides another dimension to this discussion by arguing that spot freight rate levels follow a fractionally integrated process. The rationale behind this argument is that the supply and demand dynamics in this market mean that although freight rates are mean-reverting overall, the process of mean-reversion occurs with a delay, which is exactly how one would expect a fractionally integrated process to behave. Although in-sample results were promising, in that fractionally integrated models are found to outperform their stationary and non-stationary counterparts across sectors and vessel sizes, out-of-sample forecasts indicate that models that assumed stationarity or non-stationarity outperformed these models, depending on the sector and vessel size. Additionally, the thesis extends this debate to the volatility of these spot freight rate levels, where it is proposed that volatility also follows a fractionally integrated process. In-sample results from the estimation of Generalised Autoregressive Conditional Heteroscedasticity (GARCH), Integrated GARCH (IGARCH) and Fractionally Integrated GARCH (FIGARCH) models indicate that FIGARCH models outperformed the other two across all sectors and vessel sizes; however, when calculating the respective out-of-sample Values-at-Risk for each vessel type, non-parametric models are found, in most cases, to outperform their parametric counterparts across sectors and vessel sizes. This thesis finally examines whether freight rates exhibit conditional skewness and kurtosis, where the shape of the supply function in the shipping freight markets indicates that these would not be constant over time, as is assumed by other standard models. Results for the in-sample period indicate that the Generalised Autoregressive Conditional Heteroscedasticity with Skewness and Kurtosis (GARCHSK) models outperformed the GARCH and FIGARCH models. This being said, when calculating the respective out-of-sample Values-at-Risk for each vessel type, non-parametric models are found, in most cases, to outperform their parametric counterparts across sectors and vessel sizes.
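As a hedged illustration of the kind of long-memory analysis described above, a sketch rather than the thesis's own code, the fractional integration order d of a series can be estimated with the standard Geweke and Porter-Hudak (GPH) log-periodogram regression; the simulated ARFIMA(0,d,0) series, bandwidth choice and parameter values below are purely illustrative assumptions:

```python
# A minimal GPH sketch: estimate the fractional integration order d of a
# series by regressing log periodogram ordinates on a log frequency term.
# All data here are simulated for illustration, not real freight rates.
import numpy as np

def simulate_arfima(d, T, rng):
    """Simulate fractionally integrated noise, (1 - L)^d x_t = e_t."""
    psi = np.ones(T)
    for k in range(1, T):
        psi[k] = psi[k - 1] * (k - 1 + d) / k   # MA(infinity) weights of (1-L)^(-d)
    e = rng.standard_normal(T)
    return np.convolve(e, psi)[:T]

def gph_estimate(x, bandwidth_power=0.5):
    """Log-periodogram (GPH) estimate of d using the first m = T^0.5 frequencies."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    m = int(T ** bandwidth_power)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / T          # low Fourier frequencies
    periodogram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2.0 * np.pi * T)
    X = np.log(4.0 * np.sin(lam / 2.0) ** 2)
    slope = np.polyfit(X, np.log(periodogram), 1)[0]
    return -slope                                        # 0 < d < 0.5: stationary long memory

rng = np.random.default_rng(0)
x = simulate_arfima(d=0.4, T=2000, rng=rng)
print(f"true d = 0.40, GPH estimate = {gph_estimate(x):.2f}")
```

An estimated d strictly between 0 and 1 would be consistent with the thesis's argument that freight rates mean-revert, but only with a delay.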
347
Essays on price rigidity in the UK: evidence from micro data and implications for macro models. Tian, Kun, January 2012.
This study consists of three individual essays which all shed light on assessing price rigidity using micro price data in the UK. The relevant implications for macro models are also discussed in each essay. The first essay gives a unified framework, à la Dixon (2012), to gauge price rigidity from three perspectives: frequency, the hazard function and the distribution across firms. On average, the monthly frequency of consumer price change is 19% between 1996 and 2007. Sales and substitutions significantly affect the frequency of consumer price change. The frequency of consumer price change varies considerably across sectors. The fraction of price changes which are decreases is about 40%. The hazard function is downward sloping with a 12-month spike. The censoring and sampling issues in the estimation of the hazard function are discussed thoroughly. The distribution across firms is derived from the estimated hazard function, which is consistent with the frequency of price changes. Two benchmark sticky price models are calibrated and simulated. Furthermore, a multiple-Calvo and a multiple-menu-costs model are also simulated, based on the empirical findings in the micro data. The simulation results suggest that introducing heterogeneity into sticky price models can improve the models' fit with respect to matching the micro evidence. The second essay mainly focuses on 'the monthly frequency of price changes', which is a prominent feature of many studies of the CPI micro-data. In this essay, we see how much the frequency ties down the behavior of price-setters ('firms') in steady-state in terms of the average length of price-spells across firms. We are able to derive an upper and a lower bound for the mean duration of price-spells averaged across firms. We use the UK CPI data at the aggregate and sectoral level and find that the actual mean is about twice the theoretical minimum consistent with the observed frequency. We estimate the distribution using the hazard function and find that although the estimated hazard differs significantly from the Calvo distribution, the means and medians are similar. However, despite the micro differences, we find that the artificial Calvo distributions generated using the sectoral frequencies result in very similar impulse responses to the estimated hazards when used in the Smets-Wouters (2003) model. The third essay examines the behavior of individual producer prices in the UK. A number of stylized facts about price setting behavior are uncovered. A time-varying Ss model is set up in a way that is consistent with the stylized facts obtained from the UK PPI data. A duration model (a semiparametric survival analysis model) is built in line with the time-varying Ss model. This duration model is estimated controlling for observed and unobserved heterogeneity across firms. The estimation results suggest that an increase in the inflation rate significantly increases the hazard rate of price change. The other factors considered in the model also affect the hazard rate of price change, though with different magnitudes.
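To make the link between the frequency of price changes and the mean duration of price-spells concrete, here is a minimal sketch assuming only the 19% monthly frequency quoted above; the constant-hazard (Calvo) formulas are standard textbook results, not taken from the essays themselves:

```python
# Implied mean price-spell duration under a constant hazard of price change.
# The 19% monthly frequency is the figure quoted in the abstract; both
# duration formulas are textbook constant-hazard (Calvo) results.
import numpy as np

freq = 0.19                                       # monthly frequency of price change
duration_discrete = 1.0 / freq                    # discrete-time mean duration (months)
duration_continuous = -1.0 / np.log(1.0 - freq)   # continuous-time Calvo duration (months)

print(f"discrete-time mean duration:   {duration_discrete:.1f} months")
print(f"continuous-time mean duration: {duration_continuous:.1f} months")
```

The second essay's bounds can be read against this benchmark: with heterogeneous firms, the mean duration averaged across firms can lie well above the minimum implied by the aggregate frequency, and the thesis finds the actual mean is roughly twice that minimum.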
348
Monetary policy coordination between the United States and the Euro Area: an application of indirect inference to a two-country DSGE model. Hong, Yuqun, January 2013.
Calls for monetary policy coordination have increased as intensified macroeconomic interdependence cultivates conflicts of interest between economies, especially following the current crisis. Yet the literature has not reached a consensus on whether monetary policy coordination is welfare-improving. This thesis, taking another perspective, assesses the real-world existence and extent of monetary policy coordination associated with economic interdependence between the United States (US) and the Euro Area (EA), and investigates the changes in international transmission in the presence of coordination. Monetary policy coordination is represented by direct responses of monetary policy instruments to contemporaneous and lagged values of the real exchange rate. By using the method of indirect inference, this research also incorporates historical data into the in-sample evaluation and estimation of the Dynamic Stochastic General Equilibrium (DSGE) model. Beginning with indirect inference evaluations of a two-country DSGE model of the US and EA, it is found that models with coordination generally outperform their non-coordination counterparts, indicating the existence of coordination. The real exchange rate is the essence of such improvement in the model's efficacy; and it is shown that coordination models have an excellent ability to replicate real exchange rate dynamics and volatility relative to a non-coordination model, even though it still remains a source of relatively poor performance of the model. By applying an extensive indirect inference estimation, the existence of monetary coordination is ascertained, since a partial-coordination model outstrips the non-coordination model remarkably. Both the US and EA economies exhibit moderate to high levels of monetary coordination. Such features improve the model's performance, particularly in terms of the dynamics of US time series, the volatility of EA time series, and both the dynamics and volatility of the real exchange rate. Impulse responses and variance decomposition reveal substantial cross-country spillovers in contrast to the non-coordination model case.
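For readers unfamiliar with indirect inference, the following is a minimal sketch of the bootstrap Wald procedure in the spirit described above, with a toy AR(1) standing in for the two-country DSGE model; the model, auxiliary statistics and all parameter values are illustrative assumptions, not the thesis's actual setup:

```python
# Indirect inference sketch: compare auxiliary-model estimates from the data
# with their bootstrap distribution implied by a structural model. A toy
# AR(1) stands in for the DSGE model; everything here is illustrative.
import numpy as np

rng = np.random.default_rng(1)

def simulate_structural(rho, T=200):
    """Toy structural model (AR(1)); a DSGE model would replace this."""
    x = np.zeros(T)
    e = rng.standard_normal(T)
    for t in range(1, T):
        x[t] = rho * x[t - 1] + e[t]
    return x

def auxiliary(x):
    """Auxiliary model: AR(1) slope and residual variance of the series."""
    b = np.polyfit(x[:-1], x[1:], 1)[0]
    resid = x[1:] - b * x[:-1]
    return np.array([b, resid.var()])

data = simulate_structural(0.9)                  # stand-in for historical data
a_data = auxiliary(data)

# Bootstrap the auxiliary estimates from repeated model simulations.
sims = np.array([auxiliary(simulate_structural(0.9)) for _ in range(500)])
mean, cov_inv = sims.mean(axis=0), np.linalg.inv(np.cov(sims.T))
wald = (a_data - mean) @ cov_inv @ (a_data - mean)
wald_dist = [(s - mean) @ cov_inv @ (s - mean) for s in sims]
print(f"Wald = {wald:.2f}, 95% critical value = {np.quantile(wald_dist, 0.95):.2f}")
# The model is rejected if the data Wald exceeds the bootstrap critical value.
```

Model variants (here, coordination versus non-coordination) can then be ranked by how far inside or outside the bootstrap distribution their Wald statistics fall.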
349
The Taylor principle and the Fisher relation in general equilibrium. Davies, Ceri Rees, January 2013.
This thesis presents a structural framework which accounts for two key empirical phenomena in monetary economics: the ‘Taylor principle’ and the ‘Fisher relation’. The former suggests that there exists a greater-than-proportional relationship between the nominal interest rate and inflation in the short-run and the latter implies that a one-for-one relationship holds at lower frequencies. Although these relationships do feature in the ubiquitous, ‘cashless’ New Keynesian framework, it has been suggested that monetary variables are required in order to render this model ‘complete’ (e.g. Nelson, 2008a). Chapter-I demonstrates that an ‘implicit’ interest rate rule can be derived as a general equilibrium condition of models in which the central bank adheres to a money growth rule. Chapter-II compares the equilibrium condition of a standard cash-in-advance model to the interest rate rule of Taylor (1993) for a post-war sample of U.S. data. However, we demonstrate that in order to replicate the Taylor principle, the underlying model must be generalised to allow the velocity of money to vary. We use the model of Benk et al. (2008, 2010) to do so and show analytically that the resulting ‘implicit rule’ features the requisite greater-than-proportional relationship. Chapter-III applies standard econometric techniques to simulated data obtained from the Benk et al. model and the estimates obtained offer support for this theoretical prediction. Chapter-IV establishes that the Fisher relation emerges when low frequency trends in the simulated data are retained and under a related ‘long-run’ implicit rule. Chapter-IV also considers the post-war sample of U.S. data analysed in Chapter-II. While disparate empirical literatures have obtained evidence for both the Taylor principle and the Fisher relation, we show that these results can be obtained from a unified theoretical framework. Several restricted empirical specifications further suggest that standard interest rate rules which omit monetary variables might provide biased coefficient estimates.
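For reference, the two empirical relationships named above can be stated in standard textbook notation; this is generic notation rather than the thesis's own derivation, and the coefficient values in the rule are those of Taylor (1993):

```latex
% Taylor (1993) rule and the Fisher relation in standard notation (illustrative).
% The Taylor principle is the requirement that the total response of i_t to
% inflation exceed one; in Taylor's original calibration it is 1 + 0.5 = 1.5.
\begin{align}
  i_t &= r^{\ast} + \pi_t + 0.5\,(\pi_t - \pi^{\ast}) + 0.5\,y_t ,\\
  i_t &= r_t + E_t\,\pi_{t+1} .
\end{align}
```

The first relation embodies the short-run, greater-than-proportional response; the second, the one-for-one relationship that holds at lower frequencies, which the thesis shows can emerge jointly from a single monetary framework.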
350
Financial development and growth: testing a dynamic stochastic general equilibrium model via indirect inference. Raoukka, Katerina, January 2013.
Macroeconomic research has made a quantum leap in the past decade in establishing a new workhorse model for open economy analysis. The unique characteristic of this literature is the introduction of the financial system into a dynamic general equilibrium (DGE) model which is based on microfoundations. Its introduction in a DGE model is essential to explain empirical facts such as growth differences across countries. The aim of this thesis is to show whether the behavior of growth can be explained by financial development within a classical approach. The model's ability to explain growth by setting financial development as a causal factor is tested against the model's performance in explaining growth by setting human capital as a causal factor. The question proposed and answered in this thesis is the following: can an increase in productivity be produced by a development in the financial system or in the educational system, and, if so, is growth determined by this increase in productivity? The empirical performance of DSGE models is under scrutiny by researchers. This thesis introduces the reader to a fairly new and unfamiliar testing procedure, indirect inference, which is fully explained and applied. The idea of the thesis is to provide a better-identified model compared to the already established econometric models of the financial development and growth nexus. The procedure followed is firstly to set up a well-established microfounded model and then to connect it to the theory via an establishment of the time series properties of various macroeconomic variables. The results, based on 10 sample countries, indicate that setting financial development as a causal factor explains the data behavior of macroeconomic variables better than a model which considers human capital as the driver of economic growth.