  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

Break-even volatility for caps, floors and swaptions

Cresswell, Wade 02 March 2020 (has links)
This dissertation investigates break-even volatility in the context of the South African interest rate market. Introduced by Dupire (2006), break-even volatility is a retrospective measure defined as the volatility that ensures the profit or loss from a delta hedged option position is zero. Break-even volatility sheds light on the inner structure of the market and is a promising investigatory tool. Insurance houses in South Africa are interested in modelling long-dated interest rate derivatives embedded within their liabilities. In pursuit of this goal, some are currently calibrating the Lognormal Forward-LIBOR Market Model to market prices. They rarely directly trade in said derivatives, but merely delta hedge their risk daily. In this case, break-even volatility surfaces become more relevant than recovering market prices (which incorporate the bank's risk premium and profit margin), as they should better represent the historical cost of replicating the option under consideration. This dissertation ultimately assesses the use of the Lognormal Forward-LIBOR Market Model in the South African interest rate market using break-even volatility. It is found that several caps and swaptions are trading at volatilities that differ significantly from their break-even volatility estimates. Furthermore, through an investigation into the calibration of the Lognormal Forward-LIBOR Market Model to break-even volatilities, an argument is developed that the underlying dynamics of the model are incompatible with those of the South African interest rate market.
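As a rough, self-contained illustration of the definition (an equity-style Black-Scholes sketch, not the dissertation's interest-rate setting; all function names and parameters here are illustrative), the break-even volatility of a single realised price path can be found by bisecting on the volatility at which the daily delta-hedged profit or loss of a short call position vanishes:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes value of a European call."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def bs_delta(S, K, r, sigma, T):
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return norm_cdf(d1)

def hedge_pnl(path, K, r, sigma, dt):
    """P&L of selling a call at the Black-Scholes price implied by `sigma`
    and delta hedging it at each step along the realised `path`."""
    n = len(path) - 1
    T = n * dt
    cash = bs_call(path[0], K, r, sigma, T)        # premium received
    delta = bs_delta(path[0], K, r, sigma, T)
    cash -= delta * path[0]                        # buy the initial hedge
    for i in range(1, n):
        cash *= math.exp(r * dt)                   # accrue interest
        new_delta = bs_delta(path[i], K, r, sigma, T - i * dt)
        cash -= (new_delta - delta) * path[i]      # rebalance the hedge
        delta = new_delta
    cash *= math.exp(r * dt)
    cash += delta * path[-1]                       # unwind the share position
    return cash - max(path[-1] - K, 0.0)           # settle the option payoff

def break_even_vol(path, K, r, dt, lo=0.01, hi=1.0, tol=1e-6):
    """Bisect for the volatility at which the hedged P&L is zero,
    relying on the P&L increasing with the volatility sold."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if hedge_pnl(path, K, r, mid, dt) > 0.0:
            hi = mid                               # sold too rich
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

On a path with constant realised volatility, the recovered break-even volatility sits close to that realised level; the hedge instruments and dynamics differ in the interest-rate setting of the dissertation, but the root-finding idea is the same.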
102

Pricing multi-asset options in exponential Lévy models

Endekovski, Jessica 02 March 2020 (has links)
This dissertation looks at implementing exponential Lévy models, whereby the underlyings are driven by Lévy processes, which are able to account for stylised facts that traditional models do not, in order to price basket options more efficiently. In particular, two exponential Lévy models are implemented and tested: the multivariate Variance Gamma (VG) model and the multivariate normal inverse Gaussian (NIG) model. Both models are calibrated to real market data and then used to price basket options, where the underlyings are the constituents of the KBW Bank Index. Two pricing methods are also compared: a closed-form (analytical) approximation of the price, derived by Linders and Stassen (2016), and the standard Monte Carlo method. The convergence of the analytical approximation to Monte Carlo prices was found to improve as the time to maturity of the option increased. In comparison to real market data, the multivariate NIG model was able to fit the data more accurately for shorter maturities and the multivariate VG model for longer maturities. However, when looking at Monte Carlo prices, the multivariate VG model was found to outperform the results of the multivariate NIG model, as it was able to converge to Monte Carlo prices to a greater degree.
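A minimal sketch of the Monte Carlo leg of such a pricer (a two-asset toy, not the dissertation's calibrated model; all parameter values are illustrative) builds dependent Variance Gamma log-returns from a gamma subordinator shared across the assets, with `rho` correlating the Brownian components:

```python
import math
import random

def vg_basket_call_mc(s0, sigma, theta, nu, rho, r, T, K,
                      n_paths=20000, seed=42):
    """Monte Carlo price of a European basket call on two assets whose
    log-returns are Variance Gamma: one gamma subordinator is shared by
    both assets (a simple way to induce joint jumps)."""
    rng = random.Random(seed)
    # drift corrections so each discounted asset is a martingale
    omega = [math.log(1.0 - theta[i] * nu - 0.5 * sigma[i] ** 2 * nu) / nu
             for i in range(2)]
    total = 0.0
    for _ in range(n_paths):
        g = rng.gammavariate(T / nu, nu)           # shared random clock
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        x = (theta[0] * g + sigma[0] * math.sqrt(g) * z1,
             theta[1] * g + sigma[1] * math.sqrt(g) * z2)
        st = [s0[i] * math.exp((r + omega[i]) * T + x[i]) for i in range(2)]
        total += max(0.5 * (st[0] + st[1]) - K, 0.0)   # equal-weight basket
    return math.exp(-r * T) * total / n_paths
```

The `omega` correction is the standard VG martingale adjustment; a quick sanity check is that a deep in-the-money basket call prices close to the discounted forward intrinsic value.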
103

A descriptive analysis of the various sources of portfolio risk on the Namibian Stock Market

Uzera, Nehemia Puuaapo January 2010 (has links)
We conducted a study on the terrestrial small mammal communities (< 1 kg) in the Volcanoes National Park (VNP), Rwanda, to determine species diversity and altitudinal/habitat associations. Data on environmental variables (habitat cover, temperature, wind speed and rainfall) were incorporated into the analysis. Both Sherman live and snap traps were set in transects from 30 September to 8 November 2009 at eight habitats (ranging from 2380 m to 3710 m). Trapping over 4800 trap nights resulted in the capture of 305 individuals (including 4 recaptures), of which 247 were identified to species level. These represented eight species of rodents, three species of shrews and one mongoose. Total numbers of small mammals were high in brush ridge and herbaceous habitats, and low in alpine and bamboo habitats. The mid-altitude zone housed a high number of small mammals. Of the species captured, Praomys degraaffi is vulnerable and Sylvisorex vulcanorum is near threatened (IUCN 2009); six species (Hylomyscus vulcanorum, Mus bufo, Praomys degraaffi, Sylvisorex vulcanorum, Lophuromys woosnami and Tachyoryctes ruandae) are endemic to the Albertine rift; and four species are new to the Park list. Species richness varied significantly among the different habitat types. Species richness and diversity increased with elevation up to the middle altitudes (2860-3255 m) and then declined with increasing elevation. Endemic species were found mainly in low- and middle-altitude habitats, and thus these habitat types are important for conservation of small mammals at VNP. The numbers of known small mammal endemics for VNP will probably increase if trapping is done seasonally and a more diverse regime of trapping techniques is employed. Key words: Rodentia, Soricidae, endemism, Volcanoes NP, species diversity.
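For readers unfamiliar with the diversity figures such studies report, a common measure is the Shannon index computed from capture counts per species; this is a generic sketch, not code from the study, which does not specify its estimator:

```python
import math

def species_richness(counts):
    """Number of species with at least one capture."""
    return sum(1 for c in counts if c > 0)

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i): higher when there are more
    species and their abundances are more evenly spread."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```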
104

Robust portfolio construction controlling the alpha-weight angle

Bailey, Geraldine January 2013 (has links)
Includes abstract. / Includes bibliographical references. / Estimation risk is widely seen to have a significant impact on mean-variance portfolios and is one of the major reasons the standard Markowitz theory has been criticized in practice. While several attempts to incorporate estimation risk have been considered in the past, the approach of Golts and Jones (2009) represents an innovative way to incorporate estimation risk in the sample estimates of the input returns and covariance matrix. In this project we discuss the theory introduced by Golts and Jones (2009), which considers the direction and the magnitude of the vector of optimal weights and investigates them separately, with focus on the former. We demystify the theory of the authors with focus on both mathematical reasoning and practical application. We show that the distortions of the mean-variance optimization process can be quantified by considering the angle between the vector of expected returns and the vector of optimized portfolio positions. Golts and Jones (2009) call this the alpha-weight angle. We show how to control this angle by employing robust optimization techniques, which we also explore as a main focus in this project. We apply this theory to the South African market and show that we can indeed obtain portfolios with lower risk statistics, especially so in times of economic crisis.
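The angle itself is elementary to compute once the two vectors are in hand; a minimal sketch (the function name is illustrative, and the robust-optimization machinery that controls the angle is of course not shown here):

```python
import math

def alpha_weight_angle(alpha, weights):
    """Angle, in degrees, between the expected-return (alpha) vector and
    the optimised weight vector: zero means the optimiser tracks the views
    exactly, while a large angle signals distortion introduced by the
    covariance estimate."""
    dot = sum(a * w for a, w in zip(alpha, weights))
    norm_a = math.sqrt(sum(a * a for a in alpha))
    norm_w = math.sqrt(sum(w * w for w in weights))
    cosine = max(-1.0, min(1.0, dot / (norm_a * norm_w)))  # guard rounding
    return math.degrees(math.acos(cosine))
```

With weights proportional to the alphas the angle is zero; orthogonal positions give 90 degrees, the extreme case of the optimiser ignoring the views entirely.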
105

Modelling illiquid volatility skews

Crowther, Servaas Marcus January 2014 (has links)
Includes bibliographical references. / Most markets trade liquidly in options on the market index; in fact, they often trade at a wide range of strike levels. Thus, using the Black-Scholes model, we can obtain the implied volatilities at the various strike levels, forming the associated implied volatility skew of the respective market under consideration. This, however, is not always feasible when it comes to the individual stocks within the market, as single stock options trade a lot less frequently. This dissertation makes use of data from the Eurozone; in particular, we consider the Euro Stoxx 50 market index and its underlying constituents. Options written on the Euro Stoxx 50 and its constituents are highly liquid, and volatility skews are obtained for the market as well as for most of the single stocks within the market. Three cases of illiquid markets are then artificially created, each with increasing degrees of sparseness mimicking various possible realities. Using principal component analysis, this dissertation aims to find an appropriate model for relating the volatility skew of the index to that of single stocks within the market in order to fill gaps in the data of the skews of the individual stocks. Results indicate that simpler models perform similarly in all scenarios of sparseness, whereas the performance of more complex models decreases as the data becomes sparser. This indicates that basic relationships can be formed between the index and single stocks in cases with relatively low levels of trade in the market, but more accurate estimates are more difficult to achieve. However, if we use the skew data, as is, as an input to the models, their performance remains by and large the same using the full data set and using monthly information. This is encouraging, as it means we can fill gaps in the individual stocks' skew data with as good a fit as if we modelled with a full set of data.
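The principal-component step can be sketched with nothing more than power iteration on the sample covariance of a panel of skews (one skew per row); this is an illustrative stand-in, not the dissertation's implementation:

```python
def leading_pc(rows, iters=500):
    """First principal component of a set of observations (one skew per
    row) via power iteration on the sample covariance matrix -- enough to
    extract the dominant common mode across a panel of volatility skews."""
    n, d = len(rows), len(rows[0])
    means = [sum(row[j] for row in rows) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in rows]
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d                      # arbitrary starting vector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]      # renormalise each iteration
    return v
```

Projecting a sparse single-stock skew onto the leading components of the index skew panel is one way to fill the missing strikes.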
106

Monte Carlo methods for the estimation of value-at-risk and related risk measures

Marks, Dean January 2011 (has links)
Nested Monte Carlo is a computationally expensive exercise. The main contributions we present in this thesis are the formulation of efficient algorithms to perform nested Monte Carlo for the estimation of Value-at-Risk and Expected-Tail-Loss. The algorithms are designed to take advantage of multiprocessing computer architecture by performing computational tasks in parallel. Through numerical experiments we show that our algorithms can improve efficiency in the sense of reducing mean-squared error.
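The structure of such an algorithm can be sketched as follows; this toy (a short put revalued per scenario, with threads standing in for the process-level parallelism of the thesis) is illustrative only, and in CPython a genuine speedup for compute-bound inner loops would need `multiprocessing` rather than threads:

```python
import math
import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def inner_loss(scenario, n_inner, seed):
    """Inner-stage Monte Carlo: conditional expected loss of a short put
    given one outer scenario for the underlying (a toy portfolio)."""
    rng = random.Random(seed)
    payoffs = [max(100.0 - scenario * math.exp(0.2 * rng.gauss(0.0, 1.0)), 0.0)
               for _ in range(n_inner)]
    return statistics.fmean(payoffs)

def nested_var_etl(n_outer, n_inner, alpha=0.95, seed=0, workers=4):
    """Nested Monte Carlo VaR and Expected-Tail-Loss: outer scenarios to
    the risk horizon, one inner revaluation per scenario, with the inner
    jobs farmed out to a worker pool."""
    rng = random.Random(seed)
    scenarios = [100.0 * math.exp(0.3 * rng.gauss(0.0, 1.0))
                 for _ in range(n_outer)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        losses = list(pool.map(
            lambda i: inner_loss(scenarios[i], n_inner, seed + 1 + i),
            range(n_outer)))
    losses.sort()
    k = int(alpha * n_outer)
    var = losses[k]                    # empirical alpha-quantile
    etl = statistics.fmean(losses[k:]) # mean loss beyond VaR
    return var, etl
```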
107

Empirical Analysis of the Top 800 Cryptocurrencies using Machine Learning Techniques

Riedl, Anna Teresa 14 February 2020 (has links)
The International Token Classification (ITC) Framework by the Blockchain Center in Frankfurt classifies 795 cryptocurrency tokens based on their economic, technological, legal and industry categorization. This work analyzes cryptocurrency data to evaluate the categorization with real-world market data. The feature space includes price, volume and market capitalization data. Additional metrics such as the moving average and the relative strength index are added to get a more in-depth understanding of market movements. The data set is used to build supervised and unsupervised machine learning models. The prediction accuracies varied amongst labels and all remained below 90%. The technological label had the highest prediction accuracy at 88.9% using Random Forests. The economic label could be predicted with an accuracy of 81.7% using K-Nearest Neighbors. The classification using machine learning techniques is not yet accurate enough to automate the classification process, but it can be improved by adding additional features. The unsupervised clustering shows that there are more layers to the data that can be added to the ITC. The additional categories are built upon a combination of token mining, maximal supply, volume and market capitalization data. As a result, we suggest that a data-driven extension of the categorization into a token profile would allow investors and regulators to gain a deeper understanding of token performance, maturity and usage.
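The K-Nearest-Neighbours classifier used for the economic label is simple enough to sketch in a few lines (a generic stand-in on toy feature vectors, not the thesis's pipeline):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points, using Euclidean distance on the feature vector. `train` is a
    list of (features, label) pairs."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

In practice the features (price, volume, market capitalization, moving average, RSI) would be scaled to comparable ranges before computing distances, since KNN is sensitive to units.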
108

Portfolio construction using robust weight functions

Mvubu, Thokozani January 2010 (has links)
The Markowitz portfolio selection model has formed the foundation from which all other portfolio selection models are formulated. The Sharpe Single Index and the Improved Sharpe Single Index models have been formulated in a bid to form better-performing models. In the optimization algorithms, these models tend not to select highly volatile shares and thus eliminate the possibility of making better returns in the event these shares perform very well. The Huber and Tukey Bisquare weights are considered in this project to enhance these models in capturing these outlying observations. The Huber weights in the Improved Sharpe (Troskie-Hossain) Single Index model are found to give a better and more realistic optimal portfolio compared to the Sharpe Single Index model.
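The two weight functions named above are standard in robust regression and can be stated directly (the tuning constants shown are the conventional defaults for 95% efficiency under normality, not values taken from the project):

```python
def huber_weight(r, c=1.345):
    """Huber weight: full weight inside the threshold, downweighted in
    proportion to 1/|r| outside, so outlying returns are tempered rather
    than discarded."""
    return 1.0 if abs(r) <= c else c / abs(r)

def tukey_bisquare_weight(r, c=4.685):
    """Tukey bisquare weight: decays smoothly to exactly zero beyond `c`,
    rejecting gross outliers entirely."""
    if abs(r) > c:
        return 0.0
    return (1.0 - (r / c) ** 2) ** 2
```

The contrast between the two drives the finding above: Huber keeps some influence for every observation, while the bisquare discards extreme ones outright.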
109

Volatility forecasting using Double-Markov switching GARCH models under skewed Student-t distribution

Mazviona, Batsirai Winmore January 2012 (has links)
Includes bibliographical references. / This thesis focuses on forecasting the volatility of daily returns using a double Markov switching GARCH model with a skewed Student-t error distribution. The model was applied to individual shares obtained from the Johannesburg Stock Exchange (JSE). The Bayesian approach, which uses Markov Chain Monte Carlo, was used to estimate the unknown parameters in the model. The double Markov switching GARCH model was compared to a GARCH(1,1) model. Value-at-risk thresholds and violation ratios were computed, leading to the ranking of the GARCH and double Markov switching GARCH models. The results showed that the double Markov switching GARCH model performs similarly to the GARCH model based on the ranking technique employed in this thesis.
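The GARCH(1,1) benchmark is the simplest member of this family; its one-step-ahead variance forecast is a single recursion (a generic sketch with illustrative parameter values, not the thesis's Bayesian estimation):

```python
def garch11_forecast(returns, omega, alpha, beta):
    """One-step-ahead GARCH(1,1) variance forecast,
    sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t],
    initialised at the sample second moment of the returns."""
    sigma2 = sum(r * r for r in returns) / len(returns)
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return sigma2
```

The switching model in the thesis replaces the fixed (omega, alpha, beta) with regime-dependent parameters driven by a hidden Markov chain.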
110

Pricing index-linked catastrophe bonds via Monte Carlo simulation

Van der Merwe, Justin January 2016 (has links)
The pricing framework used in this dissertation allows for the specification of catastrophe risk under the real-world measure. This gives the user a great deal of freedom in the assumptions made about the underlying catastrophe risk process (referred to in this dissertation as the aggregate loss process). Therefore, this dissertation aims to shed light on the effect of various assumptions and considerations on index-linked CAT bond prices based on the Property Claims Services (PCS) index. Also, given the lack of a closed-form solution to the pricing formulae used and the lack of a liquidly-traded secondary market, this dissertation compares two approximation methods to evaluate expressions involving the aggregate loss process: Monte Carlo simulation and a mixed-approximation method. The two price-approximation methods are largely consistent and seem to agree particularly in the upper quantiles of the distribution of the aggregate loss process. Another key consideration is that the third party estimating the catastrophe losses in North America, PCS, only records catastrophe losses above $25 million. This dissertation therefore also explores the issue of left-truncated data and its effect when estimating the parameters of the aggregate loss process. For this purpose, it introduces a non-parametric approach to compare, in sample, the results of ignoring the threshold and taking it into account. In both these exercises, it becomes apparent that very heavy-tailed distributions need to be used with caution. In the former case, the use of very heavy-tailed distributions places restrictions on the distributions that can be used for the mixed-approximation method. Finally, as a more realistic avenue, this dissertation proposes a simple stochastic intensity model to compare with the deterministic intensity model and finds that, by parsimony, the deterministic intensity seems to provide a reasonable model for the upper quantiles of the aggregate loss process.
The key results of this dissertation are that the pricing of CAT bonds depends on the quantiles of the aggregate loss process, as is evident both when comparing the approximation methods and the deterministic and stochastic intensity functions, and that left-truncation should be taken into account when valuing index-linked CAT bonds using data from PCS.
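A compound-Poisson aggregate loss process with a PCS-style reporting threshold can be sketched as follows (a generic toy with a deterministic intensity and a pluggable severity sampler; function names and the $25 million threshold handling are illustrative, not the dissertation's code):

```python
import math
import random

def poisson_draw(mean, rng):
    """Poisson variate by Knuth's multiplication method
    (adequate for moderate means)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def aggregate_loss(lam, T, severity, threshold=25.0, rng=None):
    """One draw of the aggregate loss at time T: a Poisson number of
    catastrophes under a deterministic intensity `lam`, each with a
    severity drawn from `severity(rng)`, keeping only losses above the
    reporting `threshold` (in $ millions) -- the source of the
    left-truncation discussed above."""
    rng = rng or random.Random()
    n = poisson_draw(lam * T, rng)
    return sum(x for x in (severity(rng) for _ in range(n)) if x > threshold)
```

Fitting severity parameters to such data while ignoring the threshold biases the fitted distribution toward the observable region, which is precisely the estimation issue the dissertation investigates.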
