351 |
Four Essays Analyzing the Impacts of Policy and System Changes on Power Sector Emissions. Kindle, Andrew, 02 July 2015 (has links)
The Regional Greenhouse Gas Initiative (RGGI) is a regionally based carbon dioxide (CO2) cap-and-trade policy. A potential weakness of regional emissions trading policies is that they can incur "leakage": emission reductions in the targeted area are achieved by relying more heavily on imports, causing offsetting emission increases in the regions supplying those imports. The member state of New York shares a long, electrically interconnected border with non-member state Pennsylvania. Pennsylvania hosts many coal plants, and its statewide emissions may increase if coal power is exported to New York. RGGI leakage is tested empirically using several models.
A method is demonstrated to empirically estimate emission and fuel-use functions for fuel-burning electric generation units in Texas. Emission functions are necessary for estimating emissions and fuel use when measurements are not available, such as in power system simulation scenarios, in unit commitment and dispatch decisions, and when measurement equipment is absent, turned off, or malfunctioning. Commonly, the "functions" used assume that a generation unit's emissions are simply a constant multiple of its output. The functions estimated here instead include the impacts of ramping, startup, and shutdown on emissions. The estimation method is described and can be extended to any fuel-burning generator in the U.S. that reports hourly generation and emissions via the EPA's Continuous Emissions Monitoring System (CEMS). The accuracy of the emission functions in predicting in-sample and forecasting out-of-sample is shown.
The regulations governing the reporting requirements for emissions under various EPA mandates offer a possible loophole by way of a calibration exemption. Generators that report emissions from CEMS equipment must calibrate the equipment once every 24 hours. During the hour of calibration, generators can take advantage of different emission rates to under-report emissions. Under-reporting offers potential cost savings because generators must hold allowances for NOx and SO2 emissions. CEMS data containing the additional information of the hour in which generators calibrate is analyzed to determine whether generators are exploiting this loophole.
The emission functions, which can estimate the impact of calibration on reported emissions, are then used to determine the magnitude of unreported emissions. The emission functions are also used to address a controversy about the emission effects of wind power. Because wind power increases the frequency of startups, shutdowns, and ramping by fuel-burning generators, some have claimed that wind power actually increases emissions; others have claimed that emissions reductions may not be as large as constant emission rates would indicate. Emission functions are calculated for all of the combustion-based generators in Texas and applied to generator output under differing wind power penetration scenarios to estimate the emission impacts of increased wind power penetration.
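The emission-function estimation described above lends itself to a simple illustration. The sketch below is a minimal, hedged example only: it assumes hourly CEMS-style columns (hour, gross_load_mw, co2_tons) and a linear specification with ramping and startup/shutdown terms, neither of which is necessarily the dissertation's actual model.

```python
# Minimal sketch, not the dissertation's specification: fit a per-unit emission
# function from hourly CEMS-style data that lets ramping and startup/shutdown
# affect emissions, instead of assuming a constant emission rate per MWh.
# Column names are illustrative assumptions.
import pandas as pd
import statsmodels.api as sm

def fit_emission_function(unit_hours: pd.DataFrame):
    df = unit_hours.sort_values("hour").copy()
    df["ramp_mw"] = df["gross_load_mw"].diff().abs()      # hour-to-hour output change
    online = (df["gross_load_mw"] > 0).astype(int)
    prev_online = online.shift(fill_value=0)
    df["startup"] = ((online == 1) & (prev_online == 0)).astype(int)
    df["shutdown"] = ((online == 0) & (prev_online == 1)).astype(int)
    df = df.dropna(subset=["ramp_mw"])                    # first hour has no ramp

    X = sm.add_constant(df[["gross_load_mw", "ramp_mw", "startup", "shutdown"]])
    return sm.OLS(df["co2_tons"], X).fit()
```

A constant-emission-rate benchmark (emissions regressed on output alone) fit to the same data provides the natural comparison for the in-sample and out-of-sample accuracy checks mentioned above.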
|
352 |
Three essays on investments and time series econometrics. Brooks, Joshua Andrew, 23 July 2015 (has links)
This dissertation includes three essays on investments and time series econometrics. This work gives new insight into the behavior of implied marginal tax rates, implied volatility, and option pricing models.
The first essay examines the movement of implied marginal tax rates. A body of research points to the existence of implied marginal tax rates that can be extracted from security or derivative prices. We use the LIBOR-based interest rate swap curve and the MSI-based interest rate swap curve to examine changes in the implied tax rate. We document multiple statistically and economically significant structural breaks in the long-run implied marginal tax rate that are not exclusively located in the financial crisis (one as recent as October 2010). These breaks represent persistent divergence from long-run averages and indicate that mean-reversion models may not accurately describe the stochastic process of implied marginal tax rates.
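As a rough sketch of how such an implied rate is commonly backed out (the essay's exact construction may differ), a marginal investor indifferent between a taxable LIBOR-based swap and a tax-exempt MSI-based swap of the same maturity implies

```latex
r^{\mathrm{MSI}}_{t,m} = \bigl(1 - \tau_{t,m}\bigr)\, r^{\mathrm{LIBOR}}_{t,m}
\quad\Longrightarrow\quad
\tau_{t,m} = 1 - \frac{r^{\mathrm{MSI}}_{t,m}}{r^{\mathrm{LIBOR}}_{t,m}},
```

where m indexes swap maturity and the implied marginal tax rate tau is the series whose long-run behaviour is tested for structural breaks.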
In the second essay, I develop an asymmetric time series model of the VIX. I show that the VIX and realized volatility display significant nonlinear effects, which I approximate with a smooth-transition autoregressive model. I find that under certain regimes the VIX depends almost exclusively on previous realized volatility; under other regimes, it depends on both its own lags and previous realized volatility. Since the VIX has become a popular hedging instrument, this finding has important implications for risk managers who elect to use the VIX and its related investment vehicles. It also has implications for the use of implied volatility in value-at-risk forecasting.
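The abstract does not give the exact specification; a generic two-regime logistic smooth-transition autoregression of the kind described might be written as follows, where the transition variable s_t, the single realized-volatility lag, and the logistic transition function are illustrative assumptions:

```latex
\mathrm{VIX}_t = \Bigl(\phi_0 + \sum_{i=1}^{p}\phi_i\,\mathrm{VIX}_{t-i} + \beta\,\mathrm{RV}_{t-1}\Bigr)\bigl[1 - G(s_t)\bigr]
+ \Bigl(\theta_0 + \sum_{i=1}^{p}\theta_i\,\mathrm{VIX}_{t-i} + \gamma\,\mathrm{RV}_{t-1}\Bigr)\,G(s_t) + \varepsilon_t,
\qquad
G(s_t) = \frac{1}{1 + \exp\{-\kappa(s_t - c)\}},
```

where RV_{t-1} is lagged realized volatility. A regime in which G(s_t) is near zero and the autoregressive coefficients are negligible corresponds to the finding that the VIX depends almost exclusively on previous realized volatility; regimes with G(s_t) near one capture dependence on both own lags and realized volatility.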
The third essay presents a new approach to option pricing model selection. There is a significant performativity issue intrinsic to much of the option pricing literature: once an option-pricing model (OPM) gains widespread acceptance, volatilities tend to move so that the OPM fits well with observed prices. This often leads to systematic mispricing based purely on model results, and a number of systematic issues, such as the volatility smile, are present in OPMs. To remedy this, I propose a new method for ranking OPMs based on one-step-ahead forecasts. The method transforms the data to build a sample distribution of the stochastic term present in each OPM. This distribution is then tested for normality so that OPMs can be ranked, in a Bayesian-like framework, by their closeness to a normal distribution.
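A minimal sketch of the ranking idea, under assumptions the essay may not share: for each OPM, collect the standardized one-step-ahead stochastic term it implies, test the sample for normality (the Jarque-Bera statistic is used here as the "closeness" measure), and rank models accordingly.

```python
# Illustrative sketch only: rank option-pricing models by how close the sample
# distribution of their implied stochastic term is to a normal distribution.
# The Jarque-Bera statistic as the distance measure is an assumption.
import numpy as np
from scipy import stats

def rank_models_by_normality(residuals_by_model):
    """Map of model name -> array of model-implied standardized residuals.
    Returns (name, JB statistic) pairs, closest to normal first."""
    scores = {name: stats.jarque_bera(np.asarray(r)).statistic
              for name, r in residuals_by_model.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Hypothetical example with simulated residuals for two made-up models:
rng = np.random.default_rng(0)
print(rank_models_by_normality({
    "model_A": rng.standard_normal(500),         # close to normal
    "model_B": rng.standard_t(df=3, size=500),   # fat-tailed
}))
```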
|
353 |
Examining the interrelationships between the four stages of customer loyalty : a mixed method approach. El-Manstrly, Dahlia, January 2010 (has links)
Customer loyalty has received enormous attention from academics and practitioners alike. There is growing evidence that keeping existing customers costs less than attracting new ones. Furthermore, loyal customers are expected to pay more, spend more and act as advocates for a particular service organisation. However, despite the universal agreement on the benefits of customer loyalty, no consensus has yet been reached on how or why this phenomenon occurs. This lack of clarity in conceptualising, and in turn operationalising, the concept has led to an incomplete understanding of the phenomenon and an inability to answer the question of why satisfied customers switch. Therefore, research is needed to shed more light upon service loyalty development. This study will assist service managers and marketers in focusing on effective marketing strategies for building and sustaining service loyalty to maximise the return on their investment. Particularly in this time of recession, service managers need to take greater care with regard to decision making and expenditure. To achieve the aim of this study, an embedded three-sequential mixed method research design was adopted. First, in-depth interviews were used to understand UK service loyalty in the retail service industry and to justify the inclusion or exclusion of loyalty variables. Second, two factorial experiments were conducted to test the cause-and-effect links between cognitive and affective loyalty and between affective and conative loyalty; the moderating effect of switching costs on these two links was then examined. Third, two surveys were distributed to a random sample of retail customers to test the full conceptual model and to assess the generalisability of the results across high versus low employee-contact, customisation and personalisation service types. Multigroup analysis was also used to test the moderating effect of customer characteristics (e.g. age, gender and education) and market characteristics (e.g. switching costs) on the links between the loyalty stages. The main findings of the study are: first, service loyalty develops in a sequential manner: cognitive loyalty → trust → affective loyalty → commitment → conative loyalty → action loyalty. Second, trust partially mediates the link between cognitive and affective loyalty. Third, unexpectedly, a significant mediating effect of calculative commitment on the link between affective and conative loyalty is found only when the moderating effect of education is examined. Fourth, the links between loyalty stages are equally strong for customers of high- and low-contact services as well as for younger and older customers. Fifth, education, gender and switching costs have a significant moderating effect on the early, but not the later, stages of loyalty. This study offers a rare test of the Theory of Planned Behaviour within a non-contractual customer-service context. It augments existing research validating Oliver’s (1997) loyalty model by proposing a mediating effect of trust and commitment and a moderating effect of service type, switching costs, gender, age and education. The main conclusions of this study are: first, building and sustaining service loyalty is a complex strategic management objective. Second, understanding the key factors that drive service loyalty and the conditions that either enhance or hinder its development is essential for a complete understanding of the phenomenon.
Therefore, the overarching managerial implication of this study is to provide service managers with a segmentation tool and a framework to build attitudinal loyalty and to reach behavioural loyalty goals. Guided by the results of this study, service managers will be better informed in selecting the most effective marketing strategy for a particular group of customers and in identifying and moving customers with varying degrees of loyalty along the loyalty ladder. Fruitful areas for future research include examining the conceptual model across different service, culture and industry contexts. Moreover, although this study strives to capture the dynamic nature of the construct, future research using a longitudinal design is needed to test the relationships in the full conceptual model over time. Finally, collecting objective data or data from different sources would also be useful to control for the possible effect of common method variance on coefficient estimates in survey research.
|
354 |
Large data sets and nonlinearity : essays in international finance and macroeconomics. Kim, Hyeyoen, January 2009 (has links)
This thesis investigates whether the information in large macroeconomic data sets is relevant for resolving some of the puzzling and questionable aspects of international finance and macroeconomics. In particular, we employ diffusion index (DI) analysis to condense very large data sets into a small number of factors. Applying these factors within conventional model specifications addresses the following main issues. Using factor-augmented vector autoregressive (FAVAR) models, we measure the impact of UK and US monetary policy. This approach notably mitigates the ‘price puzzle’ for both economies, whereby a monetary tightening appears to have perverse effects on price movements. We also estimate structural FAVARs and examine the impact of aggregate-demand and aggregate-supply shocks using a recursive long-run multiplier identification procedure. This method is applied to examine the evidence for increased UK macroeconomic flexibility following the UK labour market reforms of the 1980s. For forecasting purposes, factors are employed as ‘unobserved’ fundamentals, which direct the movement of exchange rates. From the long-run relationship between factor-based fundamentals and the exchange rate, the deviation from the fundamental level of the exchange rate is exploited to improve the predictive performance of the fundamental model of exchange rates. Our empirical results provide strong evidence that factors help to predict exchange rates as the horizon lengthens, outperforming both the random walk and the standard monetary fundamental models. Finally, we explore whether allowing for a wide range of influences on the real exchange rate in a nonlinear framework can help to resolve the ‘PPP puzzle’. Factors, as determinants of the time-varying equilibrium of real exchange rates, are incorporated into a nonlinear framework. Allowing for the effects of macroeconomic factors dramatically increases the measured speed of adjustment of the real exchange rate.
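As a rough illustration of the diffusion-index / FAVAR approach described above, the sketch below compresses a large standardized macro panel into principal-component factors and estimates a VAR in the factors alongside observed policy variables. The variable names, number of factors and lag length are illustrative assumptions, not the thesis's actual choices.

```python
# Minimal sketch of the diffusion-index / FAVAR idea, not the thesis's exact
# procedure: extract principal-component factors from a large macro panel and
# fit a VAR in [factors, observed policy variables].
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

def extract_factors(panel: pd.DataFrame, n_factors: int = 3) -> pd.DataFrame:
    """Principal-component factors from a (T x N) panel of macro series."""
    X = (panel - panel.mean()) / panel.std()                # standardize each series
    U, S, _ = np.linalg.svd(X.values, full_matrices=False)  # PCA via SVD
    factors = U[:, :n_factors] * S[:n_factors]
    return pd.DataFrame(factors, index=panel.index,
                        columns=[f"factor_{i + 1}" for i in range(n_factors)])

def fit_favar(panel: pd.DataFrame, observed: pd.DataFrame, lags: int = 2):
    """VAR in [factors, observed variables such as the policy rate]; impulse
    responses to a policy shock can then be traced with results.irf()."""
    data = pd.concat([extract_factors(panel), observed], axis=1).dropna()
    return VAR(data).fit(lags)
```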
|
355 |
Essays in time series analysis : modelling stochastic volatility and forecast evaluation. Malik, Sheheryar, January 2009 (has links)
No description available.
|
356 |
Economistic fallacies in contemporary capitalism : a Polanyian analysis of regimes of marketised social protection. Holmes, Christopher, January 2010 (has links)
Karl Polanyi used the notion of economistic fallacy in order to flag up the way in which formal definitions of the economy – rooted in the assumption of economising, self-interested market behaviour – were routinely applied as universal and rational by economists, political scientists, policy makers and in general public discourse. This thesis is a critical re-application of the notion of economistic fallacy in theoretical, historical and contemporary perspective. I argue that, although Polanyi’s broad generalisations are unsuitable for contemporary analysis, the same basic type of fallacy can be observed in various specific policy settings. Roughly speaking, the thesis comprises two halves. In the first, I focus on theoretical matters, arguing for a consideration of Polanyi specifically as a political economist of ideas. This, I argue, gets us closer to some of Polanyi’s most interesting analytical intentions whilst freeing us from some of the apparent ontological contradictions latent in his various texts. From there, I develop Polanyi’s insights on the role of ideas in capitalist development, foregrounding the notion of economistic fallacy as a key conceptual device. In the second half of the thesis, I apply this analysis over three case studies, one on global financial regulation, one on climate change and one on welfare provision in the UK. These areas are chosen as contemporary reflections of the three ‘fictitious commodities’ that Polanyi identified as uniquely important loci of economistically fallacious logics, namely money, land and labour. In each case, I note how specific versions of economistic fallacy have guided policies that aim to deliver forms of social protection via market mechanisms and market actors – what I call ‘marketised social protection’. This is in distinction to the straightforward (often state-led) societal self-protection that Polanyi and latter-day Polanyians have typically focused upon. I argue that the policies discussed are economistically fallacious to the extent that they rely on unrealistic, overly rationalist assumptions about the nature of society, the natural environment and people, respectively. I show instead how the dynamics of capital accumulation that such regimes serve to legitimate and protect – dynamics that I refer to as forms of ‘market self-protection’ – act to continually undermine the success of such policy programmes. This, I argue, is a distinctive tension in the ideational and material landscape of contemporary capitalism.
|
357 |
Public procurement auctions in Brazil. Szerman, Dimitri, January 2012 (has links)
This thesis provides an empirical analysis of data generated by ComprasNet, the online procurement bidding platform developed and used by the Brazilian federal government. ComprasNet is a large bidding platform used since 2001 by more than 2,200 public purchasing units, which list around one million lots each year. Over 70,000 unique bidders have participated in these auctions. In 2010, 46 percent of all procurement for the federal government was conducted through ComprasNet, totaling R$ 27 billion, or 0.7 percent of Brazil’s GDP. In short, these auctions represent a large share of federal tenders, and a substantial amount is contracted through them each year. Chapter 1 provides an overview of ComprasNet. After reviewing the literature on the various topics to which this dissertation contributes, I describe the institutional background surrounding ComprasNet. I then present the baseline data used throughout the remainder of this dissertation. Chapter 2 addresses one important aspect of designing an online ascending auction, namely how to end the auction. ComprasNet varied its ending rules over time, providing a unique opportunity to test theories of bidder behaviour as well as to assess the impact of ending rules on auction outcomes. Chapter 3 analyses a two-stage auction format which ComprasNet uses. Two-stage designs have long been proposed by the theoretical literature, but there is virtually no empirical work apart from experimental studies. Finally, chapter 4 analyses a bid preference programme targeted at small and micro enterprises (SMEs). The programme consists of setting aside eligible lots for SMEs. We first use eligibility rules as a source of exogenous variation in the treatment assignment to estimate the effects of the programme on auction outcomes. We then set up an open auction model with endogenous entry and asymmetric bidders and estimate the model’s primitives. In particular, we estimate entry costs, which we interpret as red tape costs.
|
358 |
Essays on markets with frictions : applications to the housing, labour and financial markets. Ungerer, Christoph, January 2012 (has links)
The classical treatment of market transactions in economics presumes that buyers and sellers engage in transactions instantly and at no cost. In a series of applications in the housing market, the labour market and the market for corporate bonds, this thesis shows that relaxing this assumption has important implications for Macroeconomics and Finance. The first chapter combines theory and empirical evidence to show that search frictions in the housing market imply a housing liquidity channel of monetary policy transmission. Expansionary monetary policy attracts buyers to the housing market, raising housing liquidity. Higher housing sale rates in turn allow lenders to threaten foreclosure more effectively, because the expected carrying costs on foreclosure inventory are lower. Ex-ante, this makes banks willing to offer larger loans, stimulating aggregate demand. The second chapter uses a heterogeneous firm industry model to explore how the macroeconomic response to a temporary employer payroll tax cut depends on the hiring and firing costs faced by firms. Controversially, the presence of non-convex labour adjustment costs suggests that tax cuts create fewer jobs in recessions. When firms hoard labour during downturns, they do not respond to marginal tax cuts by hiring additional workers. The third chapter develops a theory in which trader career concerns generate an endogenous transaction friction. Traders are reluctant to sell assets below historical purchase price, since realizing a loss signals to the employer that the trader is incompetent. The chapter documents empirically several properties of corporate bond transaction data consistent with this theory of career-concerned traders.
|
359 |
Essays in applied macroeconomic theory : volatility, spreads, and unconventional monetary policy tools. Vega, Hugo, January 2012 (has links)
This thesis contains three essays that employ macroeconomic theory to study the implications of volatility, financial frictions and reserve requirements. The first essay uses an imperfect information model in which agents solve a signal extraction problem to study the effect of volatility on the economy. A real business cycle model where the agent faces imperfect information regarding productivity is used to address the question. The main finding is that the variance of the productivity process components has a small negative short-run impact on the economy's real variables. However, imperfect information dampens the effects of volatility associated with permanent components of productivity and amplifies the effects of volatility associated with transitory components. The second essay presents a partial equilibrium characterization of the credit market in an economy with partial financial dollarization. Financial frictions (costly state verification and banking regulation restrictions) are introduced, and their impact on lending and deposit interest rates denominated in domestic and foreign currency is studied. The analysis shows that reserve requirements act as a tax that leads banks to decrease deposit rates, while the wedge between foreign and domestic currency lending rates is decreasing in exchange rate volatility and increasing in the degree of correlation between entrepreneurs' returns and the exchange rate. The third essay introduces an interbank market with two types of private banks and a central bank into a New-Keynesian DSGE model. The model is used to analyse the general equilibrium effects of changes to reserve requirements, while the central bank follows a Taylor rule to set the policy interest rate. The essay shows that changes to reserve requirements have effects similar to interest rate hikes and that the two monetary policy tools can be used jointly in order to avoid large swings in the policy rate or hitting the zero bound.
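The first essay's signal extraction problem can be illustrated with a canonical permanent-transitory decomposition of productivity; the AR(1) transitory component and Gaussian innovations below are illustrative assumptions, not necessarily the essay's exact process:

```latex
\log A_t = x_t + z_t, \qquad
x_t = x_{t-1} + \varepsilon^{x}_t, \quad \varepsilon^{x}_t \sim N(0, \sigma_x^2), \qquad
z_t = \rho z_{t-1} + \varepsilon^{z}_t, \quad \varepsilon^{z}_t \sim N(0, \sigma_z^2),
```

where agents observe only log A_t and use the Kalman filter to infer the permanent component x_t and the transitory component z_t. The volatilities sigma_x and sigma_z are the objects whose short-run impact on real variables the essay studies, with imperfect information dampening the effects of the former and amplifying the effects of the latter.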
|
360 |
Marketing's role in successful new product development in commercial, investment and merchant banks. Pavlidis, Panayiotis M., January 1993 (has links)
This thesis investigates marketing's role in new product development (NPD) in commercial, investment and merchant banks. It examines how marketing inputs contribute to new product development success. NPD success can be measured at two levels of analysis: the program level and the project level. Our study concentrates on the program level, at which sustained product development success is examined, rather than one-off project success. Successful product developers are identified as those banks with a better record of being first to market with new products. This measure of product development success is important in the financial risk management market, in which commercial, investment and merchant banks compete fiercely. Based on peer evaluation, seventeen banks were identified as innovative, that is to say, active new product developers in the financial risk management market, from a universe of almost 130 U.K. and foreign banks with established operations in London. Of these seventeen, eight participated in our research study. Data were collected in two stages. First, personal interviews were conducted with the heads of the treasury divisions or the heads of derivatives desks to collect background information for control purposes. Second, detailed questionnaires were administered to two further members of each bank who were involved with the development of financial risk management products. The questionnaires consisted of statements with which respondents were invited to indicate agreement or disagreement on 5-point Likert-type scales. Our findings show that it is not the trappings but the quality of marketing inputs that contributes to program success. Quality of marketing inputs comprises the quality of the approach adopted and the quality of execution. The most important finding is that successful product developers adopt higher-quality marketing than do less successful product developers. Successful product developers place great emphasis on getting both their approach and their execution right. It was found that successful product developers adopt a market-based approach in identifying new opportunities. They not only adopt a strategy which selects markets on the basis of benefits sought (instead of determining strategy primarily on the basis of internal strengths), but they also use internal marketing to promote this cause. Further, successful product developers possess the appropriate implementation skills to exploit selected opportunities. While we cannot claim that program success will be guaranteed by a market-based approach, our evidence lends strong support to the view that the absence of a market-based approach is likely to lead to considerably less success in the type of product development investigated in this research study.
|