21

Jump-diffusion based-simulated expected shortfall (SES) method of correcting value-at-risk (VaR) under-prediction tendencies in stressed economic climate

Magagula, Sibusiso Vusi 05 1900 (has links)
The Value-at-Risk (VaR) model fails to predict financial risk accurately, especially during financial crises. This is mainly due to the model’s inability to calibrate to new market information and to its poor quantification of tail risk. An alternative approach, comprising the Expected Shortfall measure and the Lognormal Jump-Diffusion (LJD) model, has been developed to address these shortcomings of VaR. This model is called the Simulated-Expected-Shortfall (SES) model. The Maximum Likelihood Estimation (MLE) approach is used to determine the parameters of the LJD model, since it is more reliable and verifiable than the non-conventional parameter estimation approaches mentioned in other studies. These parameters are then plugged into the LJD model, which is simulated multiple times to generate the new loss dataset used in the developed model. The SES model is statistically conservative compared to its peers, which means it is more reliable in predicting financial risk, especially during a financial crisis. / Statistics / M.Sc. (Statistics)
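For intuition only — a minimal sketch, not the author's code — the snippet below simulates one-day losses from a Merton-type lognormal jump-diffusion (whose parameters would, per the abstract, come from MLE) and reads off VaR and a simulated Expected Shortfall. All parameter values and the 99% level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) daily parameters: drift, diffusion vol, jump intensity, jump mean/vol
mu, sigma, lam, mu_j, sigma_j, dt = 0.0002, 0.012, 0.05, -0.02, 0.03, 1.0
n_sims = 100_000

# One-day log-return: Brownian part plus the sum of a Poisson number of normal jumps
n_jumps = rng.poisson(lam * dt, n_sims)
diffusion = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_sims)
jump_sum = rng.normal(n_jumps * mu_j, np.sqrt(n_jumps) * sigma_j)  # sum of N i.i.d. normal jumps
log_returns = diffusion + jump_sum

losses = -(np.exp(log_returns) - 1.0)      # losses expressed as positive numbers
alpha = 0.99
var = np.quantile(losses, alpha)           # Value-at-Risk at the 99% level
es = losses[losses >= var].mean()          # simulated Expected Shortfall: mean loss beyond VaR
print(f"VaR(99%) = {var:.4f}, SES(99%) = {es:.4f}")
```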
22

Predicting extreme losses in the South African equity derivatives market

Lourens, Karina 11 June 2014 (has links)
M.Com. (Financial Economics) / This study investigates the best measure of extreme losses in the South African equity derivatives market, and applies it to estimate the size of a default fund for Safcom, the central counterparty (CCP) for exchange-traded derivatives in South Africa. The predictive abilities of historical simulation Value at Risk (VaR), Conditional VaR (CVaR), Extreme VaR (EVaR) calculated using a Generalised Extreme Value (GEV) distribution, and stress testing are compared during historical periods of stress in this market. The iterative cumulative sum of squares (ICSS) algorithm of Inclan and Tiao (1994) is applied to identify significant and large positive shifts in the volatility of returns, thus indicating the start of a stress period. The FTSE/JSE Top 40 Index Future (known as the ALSI future) is used as a proxy for this market. Two key periods of stress are identified, namely the 1997 Asian crisis and the 2008 global financial crisis. The maximum daily losses in the ALSI during these stress periods were observed on 28 October 1997 and 6 October 2008. For the VaR-based loss estimates, 2500 trading days’ returns up to 28 October 1997 and 2750 trading days’ returns up to 6 October 2008 are used. The study finds that Extreme VaR predicts extreme losses during these two historical periods of stress most accurately and is consequently applied to the quantification of a default fund for Safcom, using 2500 daily returns from 5 June 2003 to 31 May 2013. The EVaR-based estimation of a default fund shows that the current Safcom default fund is sufficient to provide for market losses equivalent to those suffered during the 2008 global financial crisis, but not for losses of the magnitude suffered during the 1997 Asian crisis.
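For orientation, here is a minimal sketch (synthetic data, not the study's calculations) of the three tail measures being compared: historical-simulation VaR, CVaR, and a GEV-based extreme VaR fitted to block maxima of losses. The block size, confidence level, and data-generating process are all assumptions.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
losses = 0.01 * rng.standard_t(df=4, size=2500)     # illustrative fat-tailed daily losses

alpha = 0.99
hist_var = np.quantile(losses, alpha)                # historical-simulation VaR
cvar = losses[losses >= hist_var].mean()             # Conditional VaR: average loss beyond VaR

block = 25                                           # assumed block size (~monthly)
maxima = losses[: len(losses) // block * block].reshape(-1, block).max(axis=1)
c, loc, scale = genextreme.fit(maxima)               # GEV fit; scipy's shape c corresponds to -xi
evar = genextreme.ppf(alpha, c, loc=loc, scale=scale)  # "extreme VaR": high quantile of block maxima

print(f"Historical VaR: {hist_var:.4f}  CVaR: {cvar:.4f}  GEV-based EVaR: {evar:.4f}")
```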
23

The influence of risk stakeholder personality on risk framing: an exploratory study

Grobbelaar, Jan January 2016 (has links)
Corporate governance models segregate the role of risk manager and risk taker to allow for independent challenge of risk-related decisions. Numerous studies have demonstrated that broad personality traits predict risk-related behaviour. While prospect theory revealed a natural preference towards risk-taking in a negative risk frame, studies have also shown the influence of personality traits on risk preference. We investigated the less reported subject of the potential influence of risk stakeholder personality on risk decision making in the corporate environment. We expected to observe that the personality traits of risk takers and risk managers will differ as a consequence of occupational self-selection. Further, we expected that such personality differences will produce disparate risk preferences between risk takers and risk managers, supporting the governance expectation of independent challenge of risk-related decisions. A sample of investment banking risk stakeholders (n = 100) completed the HEXACO–PI–R as well as a vignette-based risky choice questionnaire involving positively and negatively framed financial risk scenarios. We found homogeneity in personality traits between risk takers and risk managers but observed a noticeable bias toward risk-taking in the negative frame by risk managers. High Honesty–Humility and Conscientiousness scores in both groups may negate the risk of irresponsible risk-taking or undesirable risk behaviour. The results of this study confirm the importance of personality screening for job applicants and should also alert risk practitioners to potential weaknesses in the independent challenge of risk-related decisions as a result of personality homogeneity among risk stakeholders.
25

Completion of an incomplete market by quadratic variation assets.

Mgobhozi, S. W. January 2011 (has links)
It is well known that general geometric Lévy market models are incomplete, except for the geometric Brownian and geometric Poissonian cases, but such a market can be completed by enlarging it with power-jump assets, as Corcuera and Nualart [12] did in their paper. Knowing that a market that is incomplete due to jumps can be completed, we look at other causes of incompleteness. We consider incompleteness due to more sources of randomness than tradable assets, to transaction costs, and to stochastic volatility; we show that such markets are incomplete and propose ways to complete them. In the case of incompleteness due to more sources of randomness than tradable assets, we enlarge the market using the market’s underlying quadratic variation assets and show that the enlarged market is complete. Looking at a market with transaction costs, which is also incomplete because buyers’ and sellers’ prices differ, we show that a market with transaction costs such as the one given by Cvitanic and Karatzas [13] can be completed. Empirical findings have shown that the Black-Scholes assumption of constant volatility is inaccurate (see Tompkins [40] for empirical evidence). Volatility is in some sense stochastic, and stochastic volatility models fall into two broad classes: single-factor models, which have only one source of randomness and yield complete market models, and multi-factor models, in which other random elements are introduced and which are therefore incomplete market models. In this project we look at some commonly used multi-factor models and attempt to complete one of them. / Thesis (M.Sc.)-University of KwaZulu-Natal, Durban, 2011.
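For readers unfamiliar with the completion technique referenced above, these are the standard definitions of the power-jump processes and the quadratic variation of a Lévy process L (as in the Corcuera–Nualart construction), not formulas quoted from the thesis:

```latex
% Power-jump processes of a Levy process L (X^{(1)} is L itself)
\[
  X^{(k)}_t = \sum_{0 < s \le t} (\Delta L_s)^k , \qquad k \ge 2 ,
\]
% Quadratic variation: continuous (Brownian) part plus the second power-jump process
\[
  [L, L]_t = \sigma^2 t + \sum_{0 < s \le t} (\Delta L_s)^2 = \sigma^2 t + X^{(2)}_t .
\]
```

Completion then proceeds by adding, for each order k, a traded asset driven by the compensated power-jump (or quadratic-variation-type) process, alongside the bond and the stock.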
26

Risk Measures Extracted from Option Market Data Using Massively Parallel Computing

Zhao, Min 27 April 2011 (has links)
The famous Black-Scholes formula provided the first mathematically sound mechanism to price financial options. It is based on the assumption that daily random stock returns are identically normally distributed and hence that stock prices follow a stochastic process with constant volatility. Observed prices at which options trade on the markets do not fully support this hypothesis. Options corresponding to different strike prices trade as if they were driven by different volatilities. To capture this so-called volatility smile, we need a more sophisticated option-pricing model in which the volatility itself is a random process. The price we have to pay for this stochastic volatility model is that such models are computationally extremely intensive to simulate and hence difficult to fit to observed market prices. This difficulty has severely limited the use of stochastic volatility models in practice. In this project we propose to overcome the obstacle of computational complexity by executing the simulations in a massively parallel fashion on the graphics processing unit (GPU) of the computer, utilizing its hundreds of parallel processors. We succeed in generating the trillions of random numbers needed to fit a monthly options contract in 3 hours on a desktop computer with a Tesla GPU. This enables us to accurately price any derivative security based on the same underlying stock. In addition, our method also allows the extraction of quantitative measures of the riskiness of the underlying stock that are implied by the views of forward-looking traders on the option markets.
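As a rough illustration of the computation being parallelised — a sketch under assumed parameters, not the thesis's CUDA code — the following prices a European call by Monte Carlo under a Heston-type stochastic-volatility model; vectorised NumPy stands in here for the one-path-per-GPU-thread structure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative (assumed) inputs: spot, initial variance, rate, 1-month horizon
S0, v0, r, T = 100.0, 0.04, 0.01, 30 / 365
kappa, theta, xi, rho = 2.0, 0.04, 0.5, -0.7        # assumed Heston-type parameters
K, n_paths, n_steps = 100.0, 200_000, 30
dt = T / n_steps

S = np.full(n_paths, S0)
v = np.full(n_paths, v0)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)  # correlated shocks
    sqrt_v_dt = np.sqrt(v * dt)
    S *= np.exp((r - 0.5 * v) * dt + sqrt_v_dt * z1)                    # Euler step for log-price
    v = np.maximum(v + kappa * (theta - v) * dt + xi * sqrt_v_dt * z2, 0.0)  # variance, truncated at 0

call_price = np.exp(-r * T) * np.maximum(S - K, 0.0).mean()
print(f"Monte Carlo call price: {call_price:.4f}")
```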
27

Agency costs and accounting quality within an all-equity setting: the role of free cash flows and growth opportunities

Unknown Date (has links)
I investigate whether all-equity firms are a heterogeneous group with respect to agency costs and accounting quality. All-equity firms are a unique group of firms that choose a “corner solution” as their capital structure. Extant research, supported by well-established theories such as trade-off theory, free cash flow theory, and Jensen’s (1986) control hypothesis, generally concludes that agency conflicts motivate such a structure. Research also supports the alternative argument that poor accounting quality makes debt so prohibitive that such firms are driven to this capital structure. I propose that an all-equity structure is not necessarily symptomatic of agency conflicts and poor accounting quality overall. I investigate whether different motivations within an all-equity setting, reflected by free cash flows and growth opportunities, result in different levels of agency cost and accounting quality. By anchoring on theories that link implicit costs of debt to free cash flow levels and growth opportunities, I hypothesize that free cash flows and growth opportunities are strongly linked to the justification, or lack thereof, for pursuing such a strategy. I hypothesize and show that firms at the extremes of the free-cash-flow-to-growth-rate spectrum exhibit significantly different levels of agency cost and accounting quality within the all-equity setting. These results support my main prediction that agency cost and accounting quality differences exist within the all-equity setting, that they are associated with free cash flow levels and growth opportunities, and that the pessimistic conclusions about pursuing an all-equity strategy reached by prior research should not be generalized to all such firms. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2015 / FAU Electronic Theses and Dissertations Collection
28

Risk dynamics, growth options, and financial leverage: evidence from mergers and acquisitions

Unknown Date (has links)
In Essay I, I empirically examine theoretical inferences of real options models regarding the effects of business risk on the pricing of firms engaged in corporate control transactions. This study shows that the risk differential between the merging firms has a significant effect on the risk dynamic of bidding firms around control transactions and that the at-announcement risk dynamic is negatively related to that in the preannouncement period. In addition, the relative size of the target, the volatility of bidder cash flows, and the relative growth rate of the bidder have significant explanatory power in the cross-section of announcement returns to bidding firm shareholders, as does the change in the cost of capital resulting from the transaction. Essay II provides an empirical analysis of a second set of real options models that theoretically examine the dynamics of financial risk around control transactions as well as the link between financial leverage and the probability of acquisition. In addition, I present a comparison of the financial risk dynamics of firms that choose an external growth strategy, through acquisition, and those that pursue an internal growth strategy through capital expenditures that are unrelated to acquisition. / by Jeffrey M. Coy. / Thesis (Ph.D.)--Florida Atlantic University, 2013. / Includes bibliography. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
29

Revisiting the methodology and application of Value-at-Risk

Unknown Date (has links)
The main objective of this thesis is to simulate, evaluate and discuss three standard methodologies for calculating Value-at-Risk (VaR): historical simulation, the variance-covariance method and Monte Carlo simulation. Historical simulation is the most common nonparametric method. The variance-covariance and Monte Carlo methods are widely used parametric methods. This thesis defines the three aforementioned VaR methodologies, and uses each to calculate 1-day VaR for a hypothetical portfolio through MATLAB simulations. The evaluation of the results shows that historical simulation yields the most reliable 1-day VaR for the hypothetical portfolio under extreme market conditions. Finally, this paper concludes with a suggestion for further studies: a heavy-tailed distribution should be used in order to improve the accuracy of the results for the two parametric methods used in this study. / by Kyong Chung. / Thesis (M.S.)--Florida Atlantic University, 2012. / Includes bibliography. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
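A minimal sketch of the three methodologies on a hypothetical two-asset portfolio follows (the thesis used MATLAB; this Python version and all of its inputs are illustrative assumptions, not the thesis's code or data):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Assumed daily returns for a hypothetical two-asset portfolio (in practice: observed data)
mean = [0.0003, 0.0002]
cov = [[1.5e-4, 0.6e-4], [0.6e-4, 1.0e-4]]
returns = rng.multivariate_normal(mean, cov, size=1000)
weights = np.array([0.6, 0.4])
alpha = 0.99

# 1. Historical simulation: empirical quantile of past portfolio losses
port_losses = -(returns @ weights)
var_hist = np.quantile(port_losses, alpha)

# 2. Variance-covariance (delta-normal): z-quantile of the fitted normal loss distribution
mu_p = weights @ returns.mean(axis=0)
sigma_p = np.sqrt(weights @ np.cov(returns.T) @ weights)
var_param = norm.ppf(alpha) * sigma_p - mu_p

# 3. Monte Carlo: resimulate returns from the fitted model and take the empirical quantile
sims = rng.multivariate_normal(returns.mean(axis=0), np.cov(returns.T), size=100_000)
var_mc = np.quantile(-(sims @ weights), alpha)

print(f"Historical: {var_hist:.4%}  Variance-covariance: {var_param:.4%}  Monte Carlo: {var_mc:.4%}")
```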
30

Governing the Economy at the Limits of Neoliberalism: The Genealogy of Systemic Risk Regulation in the United States, 1922-2012

Ozgode, Onur January 2015 (has links)
This dissertation traces the genealogy of systemic risk as a pathology of the monetary government of the economy, and of systemic risk regulation as a regulatory regime to govern this governmental problem, as instituted under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Using resilience and vulnerability as diagnostic categories, it reconstructs the history of economic government since the New Deal as a recursive problem-solving process plagued with negative feedback loops. It shows how different groups of experts, acting as policy entrepreneurs, problematized and framed the economy as a crisis-prone system and how they tried to reduce the catastrophe risk in the economy without restricting economic activity and growth. In doing so, the dissertation details the proposals as well as the actual governmental apparatuses set up to represent and format the economy. It argues that systemic risk regulation emerges at the intersection of two distinct but historically interrelated genealogical threads: systemic risk and vulnerability reduction. It shows that while systemic risk has been articulated in different ways since the 1920s, its emergence in its contemporary form took place with the rise of monetary government in the 1970s. Under monetary government, the financial system was reformatted as a vital credit-supply infrastructure that functioned as a monetary policy transmission mechanism. A critical aspect of this reformatting was the cultivation of an increasingly leveraged financial system that relied on short-term lending markets for operational liquidity. The outcome of this development, in turn, was the reframing of systemic risk as the catastrophe risk that the failure of a firm participating in these markets would trigger a system-wide collapse and thereby a depression. Vulnerability reduction, in contrast, was conceived by a group of experts working in New Deal resource planning agencies between the early 1930s and the mid-1950s. This governmental technology was concerned with the resilience of the economic system to low-probability but high-impact macroeconomic shocks. Within this governmental strategy, the primary objective was to reduce the vulnerability of certain points of interdependence that were considered critical and strategic nodes within the economic system. The dissertation argues that the rise of systemic risk regulation signifies the convergence of systemic risk and vulnerability reduction for the first time since these two genealogical threads were separated in the post-Truman period. In this respect, this development points to the remapping of vulnerability reduction onto the financial ontology of substantive credit flows, and thus to the rearticulation of monetary government with systemic tools such as network and catastrophe modeling in a substantive form.
