61

Currency trios - using geometric concepts to visualise and interpret relationships between currencies

Davidson, Abby January 2016 (has links)
A currency trio is a set of three currencies and their respective exchange rates, which have a relationship fixed by a triangular arbitrage condition. This condition forms the basis for the derivation of a geometric interpretation of the relationships between the exchange rates. In the geometric framework, the three currencies in a currency trio are represented by a triangle, where each of the vertices represents a currency. The volatilities of the exchange rates are represented by the lengths of the sides joining the respective currencies, and the cosine of each angle represents the correlation between the two exchange rates depicted by the angle's adjacent sides. The geometric approach is particularly useful when dealing with implied data, as it allows the calculation of implied correlation using implied volatility. This is valuable because implied volatility is frequently quoted in the foreign exchange market, whereas implied correlation is not directly quoted and is more difficult to extract from market data. This dissertation aims to investigate the geometric framework thoroughly and use it to visualise and interpret the relationships between currencies in a currency trio. The analysis will initially look at currency trios with realised spot data before moving on to implied data. In the implied data context, the framework will be used to extract and evaluate implied correlation estimates using implied volatility data extracted from the foreign exchange market. The framework will be extended to investigate whether an illiquid option can be proxy-hedged using options on the two other currencies in a currency trio. Finally, the findings will be discussed and the feasibility of the applications of the framework will be considered.
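The implied-correlation extraction described above follows directly from the law of cosines on the volatility triangle. The sketch below is an illustrative Python calculation assuming both exchange rates share the same base currency, not the dissertation's code; the function name and example volatilities are hypothetical.

```python
import numpy as np

def implied_correlation(vol_ab: float, vol_ac: float, vol_bc: float) -> float:
    """Implied correlation between the A/B and A/C exchange rates.

    The triangular no-arbitrage condition links the three log exchange rates,
    so the cross-rate variance satisfies the law of cosines:
        vol_bc**2 = vol_ab**2 + vol_ac**2 - 2 * rho * vol_ab * vol_ac,
    where rho is the cosine of the triangle's angle at the shared vertex A.
    """
    rho = (vol_ab**2 + vol_ac**2 - vol_bc**2) / (2.0 * vol_ab * vol_ac)
    return float(np.clip(rho, -1.0, 1.0))  # guard against small numerical noise

# Hypothetical 1-year at-the-money implied volatilities for the two A-based
# rates and the cross rate; the sign convention depends on how rates are quoted.
print(implied_correlation(0.10, 0.12, 0.08))
```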
62

Fourier pricing of two-asset options: a comparison of methods

Roberts, Jessica Ellen January 2018 (has links)
Fourier methods form an integral part of the universe of option pricing due to their speed, accuracy and diversity of use. Two types of methods that are extensively used are fast Fourier transform (FFT) methods and the Fourier-cosine series expansion (COS) method. Since its introduction, the COS method has been seen to be more efficient, in terms of rate of convergence, than its FFT counterparts when pricing vanilla options; however, limited comparison has been performed for more exotic options and under varying model assumptions. This paper will expand on this research by considering the efficiency of the two methods when applied to spread and worst-of rainbow options under two different models, namely the Black-Scholes model and the Variance Gamma model. In order to conduct this comparison, this paper considers each option under each model and determines the number of terms needed for the price estimate to converge to a specified level of accuracy. Furthermore, it tests the robustness of the pricing methodologies to changes in certain discretionary parameters. It is found that although under the Black-Scholes model the COS method converges in fewer terms than the FFT method for both spread options (32 versus 128 terms) and the rainbow options (64 versus 512 terms), this is not the case under the more complex Variance Gamma model, where the terms to convergence of both methods are similar. Both methodologies are generally robust to changes in the discretionary variables; however, a notable issue appears in the application of the FFT methodology to worst-of rainbow options, where the choice of the truncated integration region strongly influences the ability of the method to price accurately. In sum, this paper finds that the speed advantage of the COS method over the FFT method diminishes with a more complex model, although the extent of this can only be determined by testing increasingly complex characteristic functions. Overall, the COS method can be seen to be preferable from a practical point of view due to its higher level of robustness.
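For readers unfamiliar with the COS method, the following minimal Python sketch prices a vanilla call under Black-Scholes with the Fang-Oosterlee COS expansion, the simplest setting the comparison starts from. It is not the paper's implementation; the truncation constant L and the number of terms N are illustrative choices.

```python
import numpy as np

def cos_call_price(S0, K, T, r, sigma, N=128, L=10.0):
    """European call under Black-Scholes via the COS method (illustrative sketch)."""
    # Truncation range [a, b] from the first two cumulants of ln(S_T / S_0)
    c1 = (r - 0.5 * sigma**2) * T
    c2 = sigma**2 * T
    a, b = c1 - L * np.sqrt(c2), c1 + L * np.sqrt(c2)

    k = np.arange(N)
    u = k * np.pi / (b - a)

    # Payoff cosine coefficients V_k of the call payoff on [0, b], with x = ln(S_T / K)
    def chi(c, d):
        return (1.0 / (1.0 + u**2)) * (
            np.cos(u * (d - a)) * np.exp(d) - np.cos(u * (c - a)) * np.exp(c)
            + u * np.sin(u * (d - a)) * np.exp(d) - u * np.sin(u * (c - a)) * np.exp(c)
        )

    def psi(c, d):
        tail = (np.sin(u[1:] * (d - a)) - np.sin(u[1:] * (c - a))) / u[1:]
        return np.concatenate(([d - c], tail))

    V = 2.0 / (b - a) * K * (chi(0.0, b) - psi(0.0, b))

    # Characteristic function of ln(S_T / S_0) under Black-Scholes
    cf = np.exp(1j * u * (r - 0.5 * sigma**2) * T - 0.5 * sigma**2 * u**2 * T)

    x = np.log(S0 / K)
    terms = np.real(cf * np.exp(1j * u * (x - a))) * V
    terms[0] *= 0.5                    # the k = 0 term carries weight one half
    return np.exp(-r * T) * terms.sum()

# Should reproduce the closed-form Black-Scholes price (about 10.45)
print(cos_call_price(S0=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2))
```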
63

Quantifying the impact of adding an unlisted credit asset to a portfolio of listed credit assets

Makhuvha, Vuyo January 2017 (has links)
Skilled construction workers play a vital role in the delivery of construction projects. However, there have been reports of a shortage of such workers within the Nigerian construction industry. The commitment of the few available ones to their organisations is therefore important, as it is bound to influence the service delivery of these organisations. It is on this basis that this study assessed the commitment of skilled construction workers in Abuja, Nigeria. The study adopted a survey design, and quantitative data were gathered from skilled construction workers in registered construction companies in the study area. Percentages, mean item scores and the Kruskal-Wallis H-test were used in analysing the data gathered. The study revealed that the type of employment of skilled workers (full-time, part-time or contract) has no significant relationship with their commitment type. The most common type of commitment exhibited is continuance commitment. The most significant factors influencing their commitment are getting feedback from supervisors, payment received being commensurate with work done, and availability of opportunities to grow. The practical implication of this result is that construction companies within the country need to improve their human resource management so as to attain better commitment and, at the same time, improve their productivity. It is believed that the findings of this study will assist construction organisations in planning appropriately and developing methods for enhancing the organisational commitment of their skilled workers, thereby increasing organisational performance and worker productivity.
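As a purely illustrative companion to the methodology, here is how a Kruskal-Wallis H-test comparing commitment scores across employment types could be run in Python; the scores below are invented and are not the study's data.

```python
from scipy.stats import kruskal

# Hypothetical commitment scores (e.g. Likert-scale means) for three employment types
full_time = [4.1, 3.8, 4.4, 3.9, 4.2]
part_time = [3.9, 4.0, 3.7, 4.3, 3.8]
contract  = [4.0, 3.6, 4.1, 3.9, 4.2]

h_stat, p_value = kruskal(full_time, part_time, contract)
print(f"H = {h_stat:.3f}, p = {p_value:.3f}")  # p > 0.05 suggests no significant difference
```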
64

Analysis of CDO tranche valuation and the 2008 credit crisis

Muzenda, Nevison January 2013 (has links)
The causes of the 2008 financial crisis were wide-ranging. Some financial commentators have suggested that there were significant inadequacies in the models used to price complex derivatives such as synthetic Collateralised Debt Obligations (CDOs). We discuss the technical properties of CDOs and the modelling approaches used by CDO traders and the watchdog credit rating agencies. We look at how the pricing models fared before and during the financial crisis. Comparing our model prices to market synthetic CDO prices, we investigate how well these pricing models captured the underlying financial risks of trading in CDOs.
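As background to the modelling approaches mentioned, the sketch below is a minimal Monte Carlo version of the standard one-factor Gaussian copula used to value synthetic CDO tranches. It is not the dissertation's implementation, and all parameters (default probability, correlation, recovery, attachment points) are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def expected_tranche_loss(attach, detach, n_names=125, pd=0.02, rho=0.3,
                          recovery=0.4, n_sims=100_000, seed=0):
    """Expected loss (as a fraction of tranche notional) under a one-factor Gaussian copula."""
    rng = np.random.default_rng(seed)
    threshold = norm.ppf(pd)                              # latent default threshold

    M = rng.standard_normal((n_sims, 1))                  # common market factor
    eps = rng.standard_normal((n_sims, n_names))          # idiosyncratic factors
    A = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * eps
    default_frac = (A < threshold).mean(axis=1)           # fraction of names defaulted

    portfolio_loss = (1.0 - recovery) * default_frac      # loss as fraction of portfolio notional
    tranche_loss = np.clip(portfolio_loss - attach, 0.0, detach - attach) / (detach - attach)
    return tranche_loss.mean()

# Equity (0-3%) versus senior (7-10%) tranche expected losses
print(expected_tranche_loss(0.00, 0.03), expected_tranche_loss(0.07, 0.10))
```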
65

Accelerated Adjoint Algorithmic Differentiation with Applications in Finance

De Beer, Jarred January 2017 (has links)
Adjoint differentiation's (AD) ability to calculate Greeks efficiently and to machine precision, while scaling in constant time with respect to the number of input variables, is attractive for calibration and hedging, where frequent calculations are required. Algorithmic adjoint differentiation tools automatically generate derivative code and provide interesting challenges in both computer science and mathematics. In this dissertation we focus on a manual implementation, with particular emphasis on parallel processing using Graphics Processing Units (GPUs) to accelerate run times. Adjoint differentiation is applied to a Call-on-Max rainbow option with 3 underlying assets in a Monte Carlo environment. Assets are driven by the Heston stochastic volatility model and implemented using the Milstein discretisation scheme with truncation. The price is calculated along with Deltas and Vegas for each asset, for a total of 6 sensitivities. The application achieves favourable levels of parallelism on all three dimensions exposed by the GPU: instruction-level parallelism (ILP), thread-level parallelism (TLP) and single instruction, multiple data (SIMD). We estimate that the forward pass of the Milstein discretisation has an ILP of 3.57, which falls within the typical range of 2-4. Monte Carlo simulations are embarrassingly parallel and are capable of achieving a high level of concurrency. However, in this context a single kernel running at low occupancy can perform better with a combination of shared memory, vectorised data structures and a high register count per thread. The run time on the Intel Xeon CPU with 501 760 paths and 360 time steps is 48.801 seconds. The GT950 Maxwell GPU completed the same calculation in 0.115 seconds, achieving a 422× speedup and a throughput of 13 million paths per second. The K40 is capable of achieving even better performance.
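To give a flavour of adjoint differentiation in a Monte Carlo setting, here is a minimal CPU-only Python sketch: a one-step GBM call with a manual forward sweep and a reverse (adjoint) sweep producing Delta and Vega. It is a toy stand-in, not the dissertation's application, which uses Heston dynamics, Milstein stepping, a three-asset Call-on-Max payoff and GPU kernels.

```python
import numpy as np

def call_price_and_greeks_aad(S0, K, T, r, sigma, n_paths=200_000, seed=1):
    """Pathwise (adjoint) Delta and Vega for a European call under one-step GBM."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)

    # --- forward sweep ------------------------------------------------------
    drift = (r - 0.5 * sigma**2) * T
    diffusion = sigma * np.sqrt(T) * Z
    ST = S0 * np.exp(drift + diffusion)
    payoff = np.maximum(ST - K, 0.0)
    price = np.exp(-r * T) * payoff.mean()

    # --- reverse (adjoint) sweep --------------------------------------------
    payoff_bar = np.exp(-r * T) / n_paths                     # d price / d payoff_i
    ST_bar = payoff_bar * (ST > K)                            # d payoff_i / d ST_i (indicator)
    S0_bar = ST_bar * ST / S0                                 # chain rule through d ST_i / d S0
    sigma_bar = ST_bar * ST * (np.sqrt(T) * Z - sigma * T)    # chain rule through d ST_i / d sigma

    return price, S0_bar.sum(), sigma_bar.sum()               # price, Delta, Vega

print(call_price_and_greeks_aad(100.0, 100.0, 1.0, 0.05, 0.2))
```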
66

Testing adaptive market efficiency under the assumption of stochastic volatility

Holder, Nicole January 2017 (has links)
This dissertation explores the adaptive market hypothesis (AMH), first proposed by Lo (2004), which incorporates the efficient market hypothesis (EMH) of Malkiel and Fama (1970) and its behavioural exceptions. The AMH differs from the EMH in that it assumes the efficiency level of a market can fluctuate over time, whereas the EMH does not. The original test of evolving efficiency (TEE) was developed by Emerson et al. (1997) and Zalewska-Mitura and Hall (1999) and has an underlying GARCH-M model. Later, the generalised test of evolving efficiency (GTEE) was developed by Kulikova and Taylor (in progress), which has an underlying stochastic GARCH-M model proposed by Hall (1991). In this dissertation, the stochastic volatility test of evolving efficiency (SV-TEE) is developed using an underlying Stochastic Volatility-in-Mean (SVM) model introduced by Koopman and Uspensky (2002). The QMLE technique introduced by Harvey (1989) and the classical and Extended Kalman Filter techniques are described so that the TEE, the GTEE and the SV-TEE can be calibrated together with the estimation of the hidden volatility process. The empirical study tests the adaptive efficiency of four markets: two developed (the London Stock Exchange and the New York Stock Exchange), one mature developing (the Johannesburg Stock Exchange) and one immature developing (the Nairobi Stock Exchange). The best-performing tests were selected for each market, and it was observed that there were constant and adaptive efficiencies in the developed and mature developing markets, and constant inefficiency in the immature developing market. The SV-TEE was not selected as the best-performing test for any of the markets, possibly because the time period considered for each market was too short.
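A stripped-down flavour of the TEE machinery is sketched below: a single time-varying AR(1) coefficient estimated with a classical Kalman filter, where the market looks weak-form efficient over periods in which the coefficient is indistinguishable from zero. The noise variances q and h are illustrative, and the GARCH-M/stochastic volatility layers the dissertation works with are omitted.

```python
import numpy as np

def tee_kalman(returns, q=1e-5, h=None):
    """Filtered time-varying AR(1) coefficient: r_t = beta_t * r_{t-1} + eps_t."""
    r = np.asarray(returns, dtype=float)
    if h is None:
        h = np.var(r)                  # crude observation-noise estimate

    beta, P = 0.0, 1.0                 # initial state and its variance
    betas = np.zeros(len(r))
    for t in range(1, len(r)):
        P = P + q                      # prediction step (beta follows a random walk)
        x = r[t - 1]                   # regressor in the observation equation
        K = P * x / (x * x * P + h)    # Kalman gain
        beta = beta + K * (r[t] - x * beta)
        P = (1.0 - K * x) * P
        betas[t] = beta
    return betas

# White-noise returns should give beta_t hovering near zero (efficient market)
rng = np.random.default_rng(0)
print(tee_kalman(0.01 * rng.standard_normal(1000))[-5:])
```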
67

Extracting risk aversion estimates from option prices/implied volatility

Pillay, Aveshen January 2010 (has links)
The risk-neutral density function is the distribution implied by the market prices of derivative securities, namely options. It encapsulates the assumption that arbitrage-free conditions persist in the market. Given the historical evolution of stock prices, an investor will form some belief about the future progression of the stock price.
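The standard route from option prices to the risk-neutral density is the Breeden-Litzenberger relation q(K) = exp(rT) * d^2C/dK^2. The Python sketch below applies it to a synthetic Black-Scholes strike grid purely for illustration; it is not the dissertation's procedure, and real market quotes would first require smoothing and interpolation of the implied volatility surface.

```python
import numpy as np
from scipy.stats import norm

def risk_neutral_density(strikes, call_prices, r, T):
    """Breeden-Litzenberger density via a central finite difference on an even strike grid."""
    K = np.asarray(strikes, dtype=float)
    C = np.asarray(call_prices, dtype=float)
    dK = K[1] - K[0]
    d2C = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dK**2
    return K[1:-1], np.exp(r * T) * d2C

# Synthetic Black-Scholes call prices; the recovered density should be lognormal
S0, r, T, sigma = 100.0, 0.05, 1.0, 0.2
K = np.linspace(60, 160, 201)
d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
C = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

grid, q = risk_neutral_density(K, C, r, T)
print(q.sum() * (grid[1] - grid[0]))   # approximately the risk-neutral mass in [60, 160]
```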
68

Reinsurance and dividend management

Marufu, Humphery January 2014 (has links)
In this dissertation we set out to find the jointly optimal policy comprising a dividend payout scheme for shareholders with a risk-averse utility function and the retention level of received premiums for an insurance company with the option of reinsurance. We formulate the problem as a stochastic control problem and solve the resulting second-order partial differential equation, known as the Hamilton-Jacobi-Bellman equation. We find that the optimal retention level is linear in the current reserve up to a point, beyond which it is optimal for the insurance company to retain all business. As for the optimal dividend payout scheme, we find that it is optimal for the company not to declare dividends, and we explore this result further.
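For orientation, a generic diffusion-approximation form of such a Hamilton-Jacobi-Bellman equation is shown below, with retention level $a$, dividend rate $c$, discount rate $\delta$ and utility $U$ used as placeholder notation; this is a textbook-style form under assumed dynamics, not necessarily the dissertation's exact specification.

```latex
\sup_{a \in [0,1],\; c \ge 0}
\Big\{ \, U(c) + \big(a\mu - c\big)\,V'(x) + \tfrac{1}{2}\,a^{2}\sigma^{2}\,V''(x) - \delta\,V(x) \Big\} = 0,
\qquad
dX_t = \big(a_t\mu - c_t\big)\,dt + a_t\sigma\,dW_t .
```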
69

The Credit Risk in Stock-Based Loans

Korula, Febin 06 February 2019 (has links)
Stock-based loans are an increasingly popular form of lending in which the loan is collateralised with stocks. Since these loans are often non-recourse, lenders are exposed to the risk that the collateral is worth less than the loan and the borrower defaults. This dissertation will consider the credit risk faced by lenders when issuing these loans. To achieve this, it will propose different models to quantify this risk using various credit measures. A sensitivity analysis with respect to key model parameters is then conducted, and some brief comments on capital requirements will also be made.
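One simple credit measure of the kind described can be sketched under a plain geometric Brownian motion assumption for the collateral; the parameters and the walk-away-at-maturity rule below are illustrative, not the dissertation's models.

```python
import numpy as np
from scipy.stats import norm

def shortfall_probability(S0, shares, loan, T, mu, sigma):
    """Probability that the stock collateral is worth less than the loan at maturity.

    The collateral follows geometric Brownian motion with (hypothetical) real-world
    drift mu, and the borrower is assumed to default exactly when shares * S_T < loan.
    """
    barrier = loan / shares
    d = (np.log(barrier / S0) - (mu - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    return norm.cdf(d)

# Example: 60% initial loan-to-value, 2-year term
print(shortfall_probability(S0=100.0, shares=1.0, loan=60.0, T=2.0, mu=0.08, sigma=0.35))
```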
70

Estimating credit default swap spreads from equity data

Kooverjee, Jateen January 2014 (has links)
Corporate bonds are an attractive form of investment as they provide higher returns than government bonds. This increase in returns is usually associated with an increase in risk; these risks include liquidity, market and credit risk. This dissertation will focus on modelling a corporate bond's credit risk by considering how to estimate the credit default swap (CDS) spread of a firm's bond. A structural credit model will be used to do this. In this dissertation, we implement an extension of Merton's model by Hull, Nelken and White (2004), which uses the implied volatilities of options on the company's stock to estimate model parameters. Such an approach provides insight into the relationship between credit markets and options markets.
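For context, the basic Merton model that Hull, Nelken and White extend can be sketched in a few lines. The calibration of asset value and asset volatility from equity option implied volatilities, which is the HNW contribution, is not reproduced here, and the inputs below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def merton_credit_spread(V0, D, T, r, sigma_v):
    """Credit spread implied by the basic Merton structural model.

    Equity is a call on firm assets V with strike equal to the debt face value D,
    so risky debt is worth B0 = V0 - E0 and the continuously compounded spread is
    s = -ln(B0 / (D * exp(-r * T))) / T.
    """
    d1 = (np.log(V0 / D) + (r + 0.5 * sigma_v**2) * T) / (sigma_v * np.sqrt(T))
    d2 = d1 - sigma_v * np.sqrt(T)
    equity = V0 * norm.cdf(d1) - D * np.exp(-r * T) * norm.cdf(d2)   # Black-Scholes call on assets
    debt = V0 - equity
    return -np.log(debt / (D * np.exp(-r * T))) / T

# Roughly 200 bps for these illustrative inputs
print(merton_credit_spread(V0=120.0, D=100.0, T=5.0, r=0.04, sigma_v=0.25))
```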
