  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

Value-add in technical analysis on the JSE Bond Market

Haddad, Zavier January 2017 (has links)
Trading on the JSE Bond Market is still done in an archaic fashion when compared to the highly digitalised trading within the equities markets in South Africa, indicating that there is less market efficiency in bond trading. Technical analysis relies on market inefficiencies to achieve an informational advantage, so there could be technical-analysis-based trading opportunities within bond trading. Bollinger Bands are one of the more prominent technical analysis methods. In this dissertation they are used in trading simulations to generate buy and sell signals in order to test whether there is any value-add in their implementation. The dissertation attempts to improve Bollinger Band based trading in two ways. The first attempts to estimate the underlying distribution of the time series more accurately; the standard methodology assumes it to be normal. It is shown that no additional benefit is derived from the alternative distribution estimation methods. Bollinger Bands assume stationarity of the time series on which they are implemented, and so the second attempt at improved accuracy addresses this assumption. Cointegration is used to generate linear combinations of bonds that are stationary, leading to more accurate application of the Bollinger Bands. The stationary combinations of bonds produce positive results in the trading simulations, primarily for combinations generated from fewer bonds and possessing larger variation. Setting aside the liquidity assumptions, the positive results show that there is value-add in specific technical-analysis-based trading strategies.
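As a point of reference for the trading rule described above, the following Python sketch computes rolling Bollinger Bands and flags simple mean-reversion signals. It is a minimal illustration on simulated data; the window length, band width and the simulated series are assumptions, not the dissertation's settings or actual JSE bond data.

```python
import numpy as np
import pandas as pd

def bollinger_signals(prices: pd.Series, window: int = 20, k: float = 2.0) -> pd.DataFrame:
    """Rolling Bollinger Bands plus a simple mean-reversion rule:
    buy when the series closes below the lower band, sell when above the upper band."""
    mid = prices.rolling(window).mean()
    sd = prices.rolling(window).std()
    upper, lower = mid + k * sd, mid - k * sd
    signal = pd.Series(0, index=prices.index)
    signal[prices < lower] = 1     # buy / long
    signal[prices > upper] = -1    # sell / short
    return pd.DataFrame({"price": prices, "mid": mid,
                         "upper": upper, "lower": lower, "signal": signal})

# Illustrative use on a simulated mean-reverting series (not real bond data)
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.95 * x[t - 1] + rng.normal(scale=0.1)
bands = bollinger_signals(pd.Series(100 + x))
print(bands.dropna().tail())
```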
112

A review of current Rough Volatility Methods

Beelders, Noah 31 January 2022 (has links)
Recent literature has provided empirical evidence that the behaviour of volatility in financial markets is rough. Given the complicated nature of rough dynamics, a review of these methods is presented with the intention of ensuring tractability for those wishing to implement these techniques. Models of rough dynamics are built upon fractional Brownian motion and its associated power-law kernel. One such model, the Rough Heston, is an extension of the classical Heston model and is the main model of focus for this dissertation. To implement the Rough Heston, fractional Riccati ordinary differential equations (ODEs) must be solved, and this requires numerical methods. Three such methods, in order of increasing complexity, are considered. Using the fractional Adams numerical method, the Rough Heston model can be used to produce realistic volatility smiles comparable to those of market data. Lastly, a quick and easy approximation of the Rough Heston model, called the Poor Man's Heston, is discussed and implemented.
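The fractional Adams scheme mentioned above is a predictor-corrector method for Caputo fractional ODEs of the form D^alpha y(t) = f(t, y(t)). The sketch below follows the classical Diethelm-Ford-Freed construction and is checked on a toy equation with a known solution; to use it for the Rough Heston one would pass the model's fractional Riccati right-hand side (with complex-valued arrays) as `f`, which is not reproduced here. Everything in the demo is an illustrative assumption.

```python
import numpy as np
from math import gamma

def fractional_adams(f, y0, alpha, T, n):
    """Predictor-corrector (Adams-Bashforth-Moulton) scheme for the Caputo
    fractional ODE D^alpha y(t) = f(t, y(t)), y(0) = y0, with 0 < alpha < 1."""
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    y = np.zeros(n + 1)
    fy = np.zeros(n + 1)
    y[0] = y0
    fy[0] = f(t[0], y[0])
    for m in range(1, n + 1):
        j = np.arange(m)
        # predictor (fractional Adams-Bashforth) weights
        b = (h**alpha / alpha) * ((m - j)**alpha - (m - 1 - j)**alpha)
        y_pred = y0 + (1.0 / gamma(alpha)) * np.sum(b * fy[:m])
        # corrector (fractional Adams-Moulton) weights
        a = np.empty(m)
        a[0] = (m - 1)**(alpha + 1) - (m - 1 - alpha) * m**alpha
        jj = np.arange(1, m)
        a[1:] = ((m - jj + 1)**(alpha + 1) + (m - 1 - jj)**(alpha + 1)
                 - 2 * (m - jj)**(alpha + 1))
        y[m] = y0 + (h**alpha / gamma(alpha + 2)) * (f(t[m], y_pred) + np.sum(a * fy[:m]))
        fy[m] = f(t[m], y[m])
    return t, y

# Toy check: D^alpha y = 2 / Gamma(3 - alpha) * t^(2 - alpha) has exact solution y = t^2
alpha = 0.6
t, y = fractional_adams(lambda t, y: 2.0 / gamma(3 - alpha) * t**(2 - alpha), 0.0, alpha, 1.0, 200)
print(y[-1])   # should be close to 1.0
```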
113

Calibrating the Hurst Parameter for Rough Volatility Models with Application in the South African Market

Pettit, Paul 14 April 2023 (has links) (PDF)
It is known that accurate and efficient calibration of any fractional stochastic volatility model is important for trading and risk management purposes. Under the rough Heston model proposed by El Euch et al. (2019), the Hurst parameter governs the roughness of the volatility process. This dissertation explores the different calibration methods used to obtain an estimate of the Hurst parameter under the rough Heston model. Three calibration methods are presented, namely a Brute Force minimisation procedure, a Neural Network calibration and a Linear Regression procedure. European option prices are simulated from the rough Heston model using the characteristic function pricing approach of El Euch and Rosenbaum (2019) and numerical techniques, such as the fractional Adams method, implemented in MATLAB. These simulated prices are then used to test and compare the three proposed calibration methods in terms of accuracy and efficiency. Thereafter, additional experiments are conducted on South African market data from traded options and the fitted models are compared across the calibration methods used. The results of our numerical experiments are used to justify the nature of rough volatility in the South African options market, and recommendations are made on the appropriateness of each calibration scheme in practice. Overall, we find that, measured by accuracy on our simulated data, the performance of the Neural Network method is similar to that of the Brute Force minimisation method, whereas the Linear Regression method is the least accurate. When calibrating on the market data, the fitted models show that both the Neural Network and Brute Force methods resemble the market behaviour. All three methods are shown to be suitable for estimating the Hurst parameter and support the presence of rough volatility in the South African market.
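For flavour, one common regression-style estimate of roughness (in the spirit of the rough volatility literature, not the dissertation's option-based Linear Regression calibration) regresses the log of the q-th absolute moment of log-volatility increments on the log of the lag; the slope is approximately q times H. The sketch below applies this to a simulated fractional Brownian motion path; the Cholesky simulator, sample size and true H are assumptions.

```python
import numpy as np

def fgn_cholesky(n, H, rng):
    """Fractional Gaussian noise of length n via a Cholesky factor of its covariance."""
    k = np.arange(n)
    acf = 0.5 * (np.abs(k + 1)**(2*H) - 2*np.abs(k)**(2*H) + np.abs(k - 1)**(2*H))
    cov = acf[np.abs(k[:, None] - k[None, :])]
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

rng = np.random.default_rng(0)
H_true, n = 0.12, 2000
log_vol = np.cumsum(fgn_cholesky(n, H_true, rng))   # fBm path, a proxy for log volatility

# E|log_vol(t + lag) - log_vol(t)|^q scales like lag^(q*H): regress logs and read off H.
q, lags = 2.0, np.arange(1, 30)
m = [np.mean(np.abs(log_vol[lag:] - log_vol[:-lag])**q) for lag in lags]
slope, _ = np.polyfit(np.log(lags), np.log(m), 1)
print("estimated H:", slope / q)                    # should be close to H_true
```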
114

Gaussian Process Regression for Option Pricing and Hedging

Patel, Ishani 14 April 2023 (has links) (PDF)
Recent literature in the field of quantitative finance has employed machine learning methods to speed up typical numerical calculations, including derivative pricing, fitting Greek profiles, constructing volatility surfaces and modelling counterparty credit risk, to name a few. This dissertation aims to investigate the accuracy and efficiency of Gaussian process regression (GPR) compared to traditional quantitative pricing algorithms. The GPR algorithm is applied to pricing a down-and-out barrier call option. Notably, Crépey and Dixon (2019) propose an alternative method for computing the Gaussian process Greeks by directly differentiating the GPR option pricing model. Based on their approach, the GPR algorithm is further extended to compute the delta and vega of the option. Numerical experiments show that option pricing accuracy is within a tolerable range and demonstrate considerable speed improvements, with speed-up factors in the thousands. Computing the Greeks in this way has favourable computational properties; however, the GPR model struggles to obtain accurate predictions for the delta and vega. The trade-off between accuracy and speed is further investigated, where the inclusion of additional GPR input parameters hinders performance metrics whilst a larger training data set improves model accuracy.
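The idea of differentiating a fitted GPR pricer can be illustrated in one dimension: fit a GPR with an RBF kernel to option prices as a function of spot, then differentiate the posterior mean analytically to obtain a "GPR delta". The sketch below trains on vanilla Black-Scholes call prices rather than the dissertation's barrier option, and uses scikit-learn's GaussianProcessRegressor; it is a simplified stand-in inspired by Crépey and Dixon (2019), not a reproduction of their method, and all parameter values are assumptions.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price, used here only to generate training labels."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# One-dimensional training set: spot is the only GPR input, everything else fixed.
K, T, r, sigma = 100.0, 1.0, 0.05, 0.2
S_train = np.linspace(60.0, 140.0, 40).reshape(-1, 1)
y_train = bs_call(S_train.ravel(), K, T, r, sigma)

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), alpha=1e-8)
gpr.fit(S_train, y_train)
alpha_dual = np.ravel(gpr.alpha_)                 # dual coefficients K^{-1} y

# Posterior mean at a test spot: m(s) = k(s, S_train) @ alpha_dual
S_test = np.array([[105.0]])
k_star = gpr.kernel_(S_test, S_train)             # shape (1, n_train)
price = float(k_star @ alpha_dual)

# Analytic derivative of the RBF posterior mean w.r.t. the test spot (a "GPR delta"):
# d/ds k(s, s_i) = -(s - s_i) / l^2 * k(s, s_i)
l = gpr.kernel_.length_scale                      # fitted RBF length scale
delta = float((-(S_test - S_train.T) / l**2 * k_star) @ alpha_dual)

# Sanity check against a central finite difference of the GPR prediction
delta_fd = float(gpr.predict(S_test + 0.5) - gpr.predict(S_test - 0.5))
print(price, delta, delta_fd)
```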
115

An Application of Deep Hedging in Pricing and Hedging Caplets on the Prime Lending Rate

Patel, Keyur 14 April 2023 (has links) (PDF)
Derivatives in South Africa are traded via an exchange, such as the JSE's derivatives markets, or over-the-counter (OTC). This dissertation focuses on the pricing and hedging of caplets written on the South African prime lending rate. In a complete market, caplets can be continuously hedged with zero risk. However, in the particular case of caplets written on the prime lending rate, market completeness ceases to exist, because the prime lending rate is a benchmark for retail lending and is not, in general, tradeable. Since parametric models may not be specified and calibrated for such incomplete markets, the aim of this dissertation is to consider the deep hedging approach of Buehler et al. (2019) for pricing and hedging such a derivative. First, a model-dependent approach is taken to set a benchmark level of performance. This approach is derived using techniques outlined in West (2008), which rely heavily on interest rate pairs being cointegrated in order to use the market-standard Black (1976) model. Thereafter, the deep hedging approach is considered, in which a neural network is set up and used to price and hedge the caplets. The deep hedging approach performs at least as well as the model-dependent approach. Furthermore, the deep hedging approach can also be used to recover a volatility skew, which is in fact needed as an input in the model-dependent approach. The approach has certain downsides: a rich set of historical data is required, and it is more time-consuming than the model-dependent approach. The deep hedging approach, in this specific implementation, also has the limitation that only one hedge instrument is used. When this limitation is also applied to the model-dependent approach, the deep hedging approach performs better in all cases. Therefore, deep hedging proves to be a sufficient alternative for pricing and hedging caplets on the prime lending rate in an incomplete market setting.
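To make the deep hedging idea concrete, the sketch below trains a small feed-forward network to hedge a claim under a quadratic hedging criterion, with the initial price learned as a parameter, in the spirit of Buehler et al. (2019). It hedges a European call on simulated geometric Brownian motion with a single hedge instrument; the prime-rate caplet setting, the network architecture and the training choices are all illustrative assumptions, not the dissertation's implementation.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Simulated GBM paths (zero rates) for a single tradeable hedge instrument.
n_paths, n_steps, T = 5000, 30, 0.25
S0, sigma, K = 100.0, 0.2, 100.0
dt = T / n_steps
z = torch.randn(n_paths, n_steps)
S = S0 * torch.exp(torch.cumsum((-0.5 * sigma**2) * dt + sigma * dt**0.5 * z, dim=1))
S = torch.cat([torch.full((n_paths, 1), S0), S], dim=1)   # shape (n_paths, n_steps + 1)

# Small network: (time to maturity, spot / S0) -> hedge ratio in the single instrument.
net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(),
                    nn.Linear(32, 32), nn.ReLU(),
                    nn.Linear(32, 1))
p0 = nn.Parameter(torch.tensor(4.0))                      # learnable initial price of the claim
opt = torch.optim.Adam(list(net.parameters()) + [p0], lr=1e-3)

payoff = torch.clamp(S[:, -1] - K, min=0.0)               # European call as the target claim

for epoch in range(100):
    pnl = p0 * torch.ones(n_paths)
    for j in range(n_steps):
        ttm = torch.full((n_paths, 1), T - j * dt)
        delta = net(torch.cat([ttm, S[:, j:j + 1] / S0], dim=1)).squeeze(-1)
        pnl = pnl + delta * (S[:, j + 1] - S[:, j])       # self-financing hedge P&L
    loss = ((pnl - payoff) ** 2).mean()                   # quadratic hedging criterion
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(p0.detach()), float(loss.detach()))
```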
116

South African Inflation Modelling Under the HJM Framework

Rizzo, Massimo 20 April 2023 (has links) (PDF)
Inflation modelling is typically done following an econometric approach; however, this results in models that are not consistent with the observable bond market, and as such they cannot be used to hedge market instruments or to price inflation-linked derivatives. Jarrow and Yildirim (2003) were among the first to propose a framework under which nominal and real forward rates and an inflation index could be jointly modelled in a consistent manner, based on the Heath-Jarrow-Morton (HJM) framework as first developed by Heath et al. (1992). They showed that under this framework it is possible to recover observed nominal and inflation-linked bond prices, hedge these instruments, and price related inflation-linked derivatives. A shortfall of this framework, however, as critiqued by Mercurio (2005) and Belgrade et al. (2004), is that it depends entirely on non-observable parameters. As such, estimating the parameters of a model constructed under this framework is non-trivial. This dissertation applies the approach detailed by Jarrow and Yildirim (2003) to construct a model that fits the South African context, and makes use of the Kalman filter, as originally documented by Kalman (1960), to overcome the issues that arise in parameter estimation. Using the constructed model, forecasts of future inflation in South Africa are produced.
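Since the parameter estimation leans on the Kalman filter, a generic linear-Gaussian filter is sketched below; the log-likelihood it returns is what a numerical optimiser would maximise over model parameters. The state-space matrices and the toy AR(1) demo are placeholders, not the Jarrow-Yildirim dynamics used in the dissertation.

```python
import numpy as np

def kalman_filter(y, A, C, Q, R, x0, P0):
    """Kalman filter for x_t = A x_{t-1} + w_t (w ~ N(0,Q)),
    y_t = C x_t + v_t (v ~ N(0,R)). Returns filtered means and the log-likelihood."""
    n, k = len(y), len(x0)
    x, P, loglik = x0.copy(), P0.copy(), 0.0
    xs = np.zeros((n, k))
    for t in range(n):
        # predict
        x, P = A @ x, A @ P @ A.T + Q
        # update
        S = C @ P @ C.T + R
        Kg = P @ C.T @ np.linalg.inv(S)
        e = y[t] - C @ x
        x, P = x + Kg @ e, (np.eye(k) - Kg @ C) @ P
        loglik += -0.5 * (np.log(np.linalg.det(2 * np.pi * S)) + e @ np.linalg.inv(S) @ e)
        xs[t] = x
    return xs, loglik

# Toy demo: latent AR(1) observed with noise (illustrative only)
rng = np.random.default_rng(0)
x_true = np.zeros(200)
for t in range(1, 200):
    x_true[t] = 0.9 * x_true[t - 1] + 0.1 * rng.standard_normal()
y = x_true[:, None] + 0.2 * rng.standard_normal((200, 1))
xs, ll = kalman_filter(y, np.array([[0.9]]), np.array([[1.0]]),
                       np.array([[0.01]]), np.array([[0.04]]),
                       np.zeros(1), np.eye(1))
print(ll)
```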
117

A study on the effect of dilutions and buybacks on the pricing of equity and stock based claims using a finite difference mesh

Boynton, Matthew 27 June 2023 (has links) (PDF)
We study a model of the firm with perpetual debt and a continuously payable coupon, as well as the possibility of raising cash via equity issuance. Excess cash is paid back to shareholders either via dividends or via buybacks. The number of shares changes when equity is issued and when the firm buys back shares. Using this model we track the total number of shares in issue. We then use finite difference methods to investigate the differences in pricing options on a fixed portion of equity and options linked to the share price, as well as the implications for American options on equity.
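For readers unfamiliar with finite difference meshes, the sketch below prices a European call on an explicit Black-Scholes grid. It illustrates only the numerical machinery, not the firm-value model with dilution and buybacks studied in the dissertation, and all parameter values are illustrative.

```python
import numpy as np

# Explicit finite-difference mesh for the Black-Scholes PDE (European call).
sigma, r, K, T = 0.2, 0.05, 100.0, 1.0
S_max, M, N = 300.0, 150, 6000               # asset grid points and time steps
dS, dt = S_max / M, T / N                     # dt chosen small enough for explicit stability
S = np.linspace(0.0, S_max, M + 1)
V = np.maximum(S - K, 0.0)                    # terminal payoff

for step in range(N):
    tau = (step + 1) * dt                     # time to maturity after this step
    dV = np.zeros_like(V)
    i = np.arange(1, M)
    dV[i] = (0.5 * sigma**2 * S[i]**2 * (V[i + 1] - 2 * V[i] + V[i - 1]) / dS**2
             + r * S[i] * (V[i + 1] - V[i - 1]) / (2 * dS) - r * V[i])
    V = V + dt * dV
    V[0], V[M] = 0.0, S_max - K * np.exp(-r * tau)   # boundary conditions

print(np.interp(100.0, S, V))                 # ~ the Black-Scholes price at S0 = 100
```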
118

Analysis of equity and interest rate returns in South Africa under the context of jump diffusion processes

Mongwe, Wilson Tsakane January 2015 (has links)
Includes bibliographical references / Over the last few decades, there has been vast interest in the modelling of asset returns using jump diffusion processes. This was partly a result of the realisation that standard diffusion processes, which do not allow for jumps, were not able to capture the stylized facts that return distributions are leptokurtic and have heavy tails. Although jump diffusion models have been identified as useful for capturing these stylized facts, there has been no consensus as to how these models should be calibrated. This dissertation tackles the calibration issue by considering the basic jump diffusion model of Merton (1976) applied to South African equity and interest rate market data. As there is little access to frequently updated volatility surfaces and option price data in South Africa, the calibration methods used in this dissertation are those that require historical returns data only. The methods used are the standard Maximum Likelihood Estimation (MLE) approach, the likelihood profiling method of Honore (1998), the Method of Moments Estimation (MME) technique and the Expectation Maximisation (EM) algorithm. The calibration methods are applied to both simulated and empirical returns data. The simulation and empirical studies show that the standard MLE approach sometimes produces estimators which are not reliable, as they are biased and have wide confidence intervals. This is because the likelihood function required for the implementation of the MLE method is not bounded. In the simulation studies, the MME approach produces results which do not make statistical sense, such as negative variances, and is thus not used in the empirical analysis. The best method for calibrating the jump diffusion model to the empirical data is chosen by comparing the width of the bootstrap confidence intervals of the estimators produced by the methods. The empirical analysis indicates that the best method for calibrating equity returns is the EM approach and the best method for calibrating interest rate returns is the likelihood profiling method of Honore (1998).
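As an illustration of one of these calibration routes, the sketch below writes the Merton (1976) return density as a Poisson mixture of normals and fits it by maximum likelihood to simulated returns. The truncation level, parameterisation, simulated parameters and Nelder-Mead optimiser are assumptions; note that, as the abstract points out, this likelihood is unbounded, so results can be sensitive to starting values.

```python
import numpy as np
from scipy.stats import norm, poisson
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate daily log-returns from a Merton-style jump diffusion:
# r_t = mu*dt + sigma*sqrt(dt)*Z + sum of N_t jumps, N_t ~ Poisson(lam*dt), jumps ~ N(mu_j, sig_j^2)
dt, n = 1 / 252, 2000
mu, sigma, lam, mu_j, sig_j = 0.10, 0.15, 25.0, -0.02, 0.03
k = rng.poisson(lam * dt, n)
r = (mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
     + mu_j * k + sig_j * np.sqrt(k) * rng.standard_normal(n))

def neg_loglik(theta, x, kmax=10):
    """Negative log-likelihood: truncated Poisson mixture of normal densities."""
    m, ls, ll, mj, lsj = theta
    s, lam_, sj = np.exp(ls), np.exp(ll), np.exp(lsj)   # log-transforms enforce positivity
    dens = np.zeros_like(x)
    for j in range(kmax + 1):
        dens += poisson.pmf(j, lam_ * dt) * norm.pdf(
            x, loc=m * dt + j * mj, scale=np.sqrt(s**2 * dt + j * sj**2))
    return -np.sum(np.log(dens))

theta0 = np.array([0.0, np.log(0.2), np.log(10.0), 0.0, np.log(0.05)])
fit = minimize(neg_loglik, theta0, args=(r,), method="Nelder-Mead",
               options={"maxiter": 5000})
m, ls, ll, mj, lsj = fit.x
print(m, np.exp(ls), np.exp(ll), mj, np.exp(lsj))   # recovered parameters
```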
119

A stochastic partial differential equation approach to mortgage backed securities

Ahmad, Ferhana January 2012 (has links)
The market for mortgage backed securities (MBS) was active and fast growing from the issuance of the first MBS in 1981. This enabled financial firms to transform risky individual mortgages into liquid and tradable market instruments. The subprime mortgage crisis of 2007 shows the need for a better understanding and development of mathematical models for these securities. The aim of this thesis is to develop a model for MBS that is flexible enough to capture both regular and subprime MBS. The thesis considers two models, one for a single mortgage in an intensity based framework and the second for mortgage backed securities using a stochastic partial differential equation approach. In the model for a single mortgage, we capture the prepayment and default incentives of the borrower using intensity processes. Using the minimum of the two intensity processes, we develop a nonlinear equation for the mortgage rate, solve it numerically and present some case studies. In modelling an MBS in a structural framework using stochastic PDEs (SPDEs), we consider a large number of individuals in a mortgage pool and assume that the wealth of each individual follows a stochastic process driven by two Brownian motions, one capturing the idiosyncratic noise of each individual and the second a common market factor. By defining the empirical measure of a large pool of these individuals we study the evolution of the limit empirical measure and derive an SPDE for the evolution of its density. We numerically solve the SPDE to demonstrate its flexibility in different market environments. The calibration of the model to financial data is the focus of the final part of the thesis. We discuss the different parameters and demonstrate how many of them can be fitted to observed data. Finally, for the key model parameters, we present a strategy to estimate them given observations of the loss function and use this to determine implied model parameters of ABX.HE.
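A finite-pool Monte Carlo picture of the structural setup described above (each borrower driven by one common and one idiosyncratic Brownian motion, with default at a barrier) is sketched below. It only illustrates the empirical measure whose large-pool limit the thesis studies; the SPDE itself and the calibration to ABX.HE data are not reproduced, and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pool of borrowers whose "distance to default" mixes a common market factor
# and idiosyncratic noise; a borrower defaults when the barrier is hit.
n_borrowers, n_steps, T = 5000, 250, 1.0
dt = T / n_steps
mu, sigma, rho, barrier = 0.02, 0.25, 0.4, 0.0
X = np.full(n_borrowers, 1.0)                 # initial distance to the default barrier
alive = np.ones(n_borrowers, dtype=bool)
loss_fraction = np.zeros(n_steps)

for t in range(n_steps):
    dW_common = np.sqrt(dt) * rng.standard_normal()
    dW_idio = np.sqrt(dt) * rng.standard_normal(n_borrowers)
    X[alive] += mu * dt + sigma * (rho * dW_common
                                   + np.sqrt(1 - rho**2) * dW_idio[alive])
    alive &= X > barrier                       # absorb defaulted borrowers
    loss_fraction[t] = 1.0 - alive.mean()      # empirical loss of the pool

print(loss_fraction[-1])
```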
120

Liquidity Modeling Using Order Book Data

Li, Yi 31 August 2009 (has links)
"On a stock exchange, trading activity has an impact on stock prices. Market agents place limit orders, which come in the form of bids and asks. These orders wait in the market to be executed when another agent agrees to fulfill the transaction. We examine an "inventory-based" quoting strategy model developed by Marco Avellaneda and Sasha Stoikov. We expand on their work by developing a method to calibrate the model to market data using limit order data provided by Morgan Stanley. We consider solving a least squares problem which fits the model to the data using a sensitivity parameter."
