  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

A stochastic model for daily climate

Brandão, Anabela de Gusmão January 1986 (has links)
Includes bibliography. / This thesis describes the results of a study to establish whether climate variables could be usefully modelled on a daily basis. Three stochastic models are considered for the description of daily climate sequences, which can then be used to generate artificial sequences. The climate variables under consideration are rainfall, maximum and minimum temperature, evaporation, sunshine duration, windrun and maximum and minimum humidity. A simple Markov chain-Weibull model is proposed to model rainfall. Three multivariate models (one proposed by Richardson (1981), two new) are suggested for modelling the remaining climate variables. The model parameters are allowed to vary seasonally, while the error term is assumed to follow an autoregressive process. The models were validated and their general performance was found to be satisfactory. Some weaknesses were identified and are discussed. The main conclusion of this study is that daily climate sequences can indeed be usefully described by means of stochastic models.
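The Markov chain-Weibull rainfall model mentioned above lends itself to a compact simulation: a two-state (dry/wet) first-order Markov chain decides occurrence, and a Weibull draw supplies the amount on wet days. The sketch below is a minimal illustration; the transition probabilities and Weibull parameters are hypothetical placeholders, not values estimated in the thesis, and the parameters are held fixed rather than varying seasonally.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (placeholders, not fitted values from the thesis).
p_wet_given_dry = 0.25   # P(wet today | dry yesterday)
p_wet_given_wet = 0.60   # P(wet today | wet yesterday)
weibull_shape, weibull_scale = 0.8, 6.0   # wet-day rainfall amount (mm)

def simulate_daily_rainfall(n_days):
    """Generate an artificial daily rainfall sequence (mm)."""
    rain = np.zeros(n_days)
    wet = False
    for t in range(n_days):
        p_wet = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p_wet
        if wet:
            rain[t] = weibull_scale * rng.weibull(weibull_shape)
    return rain

print(simulate_daily_rainfall(14))
```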
152

Structural time series modelling for 18 years of Kapenta fishing in Lake Kariba

Dalmeyer, Lara January 2012 (has links)
Includes abstract. Includes bibliographical references.
153

Investigating 'optimal' kriging variance estimation: analytic and bootstrap estimators

Ngwenya, Mzabalazo Z January 2011 (has links)
Kriging is a widely used group of techniques for predicting unobserved responses at specified locations using a set of observations obtained from known locations. Kriging predictors are best linear unbiased predictors (BLUPs) and the precision of predictions obtained from them is assessed by the mean squared prediction error (MSPE), commonly termed the kriging variance.
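For readers unfamiliar with the terms, the following is a minimal one-dimensional ordinary kriging sketch showing how the BLUP weights and the kriging variance (MSPE) come out of the same linear system; the exponential covariance function and the data are arbitrary illustrations, not the estimators studied in the thesis.

```python
import numpy as np

def exp_cov(h, sill=1.0, range_=10.0):
    """Exponential covariance function (an illustrative choice)."""
    return sill * np.exp(-h / range_)

def ordinary_kriging(x_obs, y_obs, x_new, cov=exp_cov):
    """Ordinary kriging prediction and kriging variance (MSPE) at x_new."""
    n = len(x_obs)
    # Kriging system: covariances between observations, plus a Lagrange
    # multiplier row/column enforcing that the weights sum to one.
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = cov(np.abs(x_obs[:, None] - x_obs[None, :]))
    K[:n, n] = 1.0
    K[n, :n] = 1.0
    rhs = np.append(cov(np.abs(x_obs - x_new)), 1.0)
    sol = np.linalg.solve(K, rhs)
    weights, mu = sol[:n], sol[n]
    prediction = weights @ y_obs                          # the BLUP
    kriging_variance = cov(0.0) - weights @ rhs[:n] - mu  # the MSPE
    return prediction, kriging_variance

x_obs = np.array([1.0, 4.0, 7.0, 12.0])
y_obs = np.array([2.1, 2.6, 3.0, 3.8])
print(ordinary_kriging(x_obs, y_obs, x_new=5.0))
```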
154

Statistical arbitrage in South African equity markets

Masindi, Khuthadzo January 2014 (has links)
The dissertation implements a model-driven statistical arbitrage strategy that uses the principal components from Principal Component Analysis as factors in a multi-factor stock model, to isolate the idiosyncratic component of returns, which is then modelled as an Ornstein-Uhlenbeck process. The idiosyncratic process (referred to as the residual process) is estimated in discrete time by an auto-regressive process with one lag (an AR(1) process). Trading signals are generated based on the level of the residual process. This strategy is then evaluated over historical data for the South African equity market from 2001 to 2013 through backtesting. In addition, the strategy is evaluated over data generated from Monte Carlo simulations as well as bootstrapped historical data. The results show that the strategy was able to significantly outperform cash for most of the periods under consideration. The performance of the strategy over data generated from Monte Carlo simulations demonstrated that the strategy is not suitable for markets that are asymptotically efficient.
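A rough sketch of the residual-process mechanics described above: fit an AR(1) to the cumulative idiosyncratic return (the discrete-time Ornstein-Uhlenbeck process), standardise its latest level, and map that score to a trade. The entry/exit thresholds and the synthetic data are placeholders, not the rules or results of the dissertation.

```python
import numpy as np

def ou_signal_from_residuals(resid_returns, entry=1.25, exit=0.5):
    """AR(1) fit to the cumulative residual return and a threshold signal
    on its standardised level (s-score). Thresholds are illustrative."""
    x = np.cumsum(resid_returns)                # integrated idiosyncratic return
    x_lag, x_cur = x[:-1], x[1:]
    b, a = np.polyfit(x_lag, x_cur, 1)          # x_t = a + b x_{t-1} + eps_t
    eps = x_cur - (a + b * x_lag)
    mean_eq = a / (1.0 - b)                     # long-run level of the OU process
    var_eq = eps.var(ddof=2) / (1.0 - b ** 2)   # stationary variance
    s = (x[-1] - mean_eq) / np.sqrt(var_eq)     # s-score of the latest level
    if s > entry:
        return "short"    # residual unusually high: bet on reversion down
    if s < -entry:
        return "long"
    if abs(s) < exit:
        return "close"
    return "hold"

rng = np.random.default_rng(1)
level = np.zeros(250)
for t in range(1, 250):
    level[t] = 0.9 * level[t - 1] + rng.normal(0.0, 0.02)   # synthetic OU-like level
print(ou_signal_from_residuals(np.diff(level, prepend=0.0)))
```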
155

Volatility transformation in a multi-curve setting applied to caps and swaptions

Maxwell, Daniel January 2015 (has links)
Includes bibliographical references / The effects of the 2007-08 financial crisis have resulted in a sharp change in the way interest rate markets are viewed as well as modelled. As a result of the crisis, the general market framework has transitioned from a single-curve framework to what is commonly known as the 'multiple-curve' framework. In addition to this, there is debate as to which curve to use for discounting. This dissertation will initially aim to give a succinct, yet thorough overview of the changes affecting interest rate modelling as a result of the financial crisis. In particular, pricing methods that are consistent with the multi-curve framework are presented. Adaptations of the popular Libor Market Model (LMM) and the Stochastic Alpha-Beta-Rho (SABR) model consistent with the new market framework are also presented. The second aim of the dissertation is to outline and implement methods of transforming volatilities within this new market framework. The market quotes available for caps/floors and swaptions often assume a particular payment tenor; for example, swaption volatilities are typically quoted assuming payment legs of six months. As such, if one wanted to price an identical swaption based on payment legs of three months, or even monthly payments, some form of transformation is needed. The methods presented and implemented are largely based on the work of Kienitz (2013). The methods described are implemented to transform six-month cap and swaption volatility surfaces to three-month surfaces.
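As background to the multi-curve framework (and only as background: this is not the Kienitz (2013) volatility transformation itself), the toy sketch below prices a floating leg with forwards read off a tenor-specific forwarding curve while cash flows are discounted off a separate OIS-style curve. The flat curve levels are hypothetical placeholders.

```python
import numpy as np

def discount_factor(t, rate):
    """Discount factor from a flat continuously compounded curve (toy setup)."""
    return np.exp(-rate * t)

ois_rate = 0.05        # discounting (OIS) curve level, placeholder
fwd_curve_rate = 0.06  # 6-month forwarding curve level, placeholder (tenor basis included)

def simple_forward(t_start, t_end, curve_rate):
    """Simply compounded forward rate implied by the forwarding curve."""
    p0, p1 = discount_factor(t_start, curve_rate), discount_factor(t_end, curve_rate)
    return (p0 / p1 - 1.0) / (t_end - t_start)

def floating_leg_pv(pay_times, curve_rate, disc_rate, notional=1.0):
    """Multi-curve PV of a floating leg: forwards from one curve,
    discount factors from another."""
    pv, prev = 0.0, 0.0
    for t in pay_times:
        tau = t - prev
        pv += notional * tau * simple_forward(prev, t, curve_rate) * discount_factor(t, disc_rate)
        prev = t
    return pv

semi_annual = np.arange(0.5, 5.0 + 1e-9, 0.5)
print(floating_leg_pv(semi_annual, fwd_curve_rate, ois_rate))
```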
156

The use of stochastic collocation for sampling from expensive distributions with applications in finance

Brand, Hilmarie January 2016 (has links)
The pricing of financial derivatives using numerical methods often requires sampling from expensive distributions. These are distributions with inverse cumulative distribution functions that are difficult to evaluate, thus requiring significant computation time. To mitigate this, Grzelak et al. (2015) introduced the stochastic collocation Monte Carlo sampler. This sampling method is based on a generalisation of the stochastic collocation method of Mathelin and Hussaini (2003), which was introduced in the context of solving stochastic partial differential equations (Babuška et al., 2007; Loeven et al., 2007). The stochastic collocation Monte Carlo sampling method entails sampling from a cheaper distribution and then transforming the samples to obtain realisations from the expensive distribution. The function that transforms the quantiles of the cheap distribution to the corresponding quantiles of the expensive distribution is approximated using an interpolating polynomial of a prespecified degree. The points at which the interpolating polynomial is constructed to exactly match the true quantile-to-quantile transformation function are known as collocation points. Any number of realisations from the expensive distribution may be read off using the interpolating polynomial, leading to a significant reduction in computation time when compared to methods like the inverse transform method. This dissertation provides an overview of the stochastic collocation method, using distributions and models frequently encountered in finance as examples. Where possible, goodness-of-fit tests are performed. The major contribution of the dissertation is the investigation of the roots of Chebyshev polynomials of the first kind as collocation points, as opposed to the Gaussian quadrature points used by Babuška et al. (2007), Loeven et al. (2007) and Grzelak et al. (2015). The roots of the Chebyshev polynomials are constrained to lie in a specified closed interval and hence are convenient to use when the statistic to be estimated does not depend on the entire distribution of interest, e.g. option prices or conditional expectations like expected shortfall.
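A minimal sketch of the stochastic collocation Monte Carlo idea, with Chebyshev roots as collocation points: the expensive distribution's inverse CDF is evaluated only at a handful of collocation points, an interpolating polynomial approximates the quantile-to-quantile map, and bulk samples of the cheap (normal) distribution are pushed through it. The noncentral chi-square target, the interval and the polynomial degree are illustrative choices, not those of the dissertation.

```python
import numpy as np
from scipy import stats
from scipy.interpolate import lagrange

rng = np.random.default_rng(0)

# "Expensive" target distribution (a stand-in for illustration): its ppf is
# called only at the few collocation points below.
target = stats.ncx2(df=4, nc=2.5)
cheap = stats.norm()          # cheap distribution, sampled in bulk

# Roots of the degree-8 Chebyshev polynomial of the first kind, mapped to a
# chosen closed interval of the cheap variable.
degree, a, b = 8, -3.5, 3.5
k = np.arange(1, degree + 1)
cheb_roots = np.cos((2 * k - 1) * np.pi / (2 * degree))   # roots in (-1, 1)
x_colloc = 0.5 * (a + b) + 0.5 * (b - a) * cheb_roots

# Match quantiles exactly at the collocation points: y_i = F_Y^{-1}(F_X(x_i)).
y_colloc = target.ppf(cheap.cdf(x_colloc))

# Interpolating polynomial approximating the quantile-to-quantile map.
g = lagrange(x_colloc, y_colloc)

# Bulk sampling: cheap samples are clipped to the interval and read off g.
x_samples = np.clip(cheap.rvs(100_000, random_state=rng), a, b)
y_samples = g(x_samples)

print("sample mean:", y_samples.mean(), "  true mean:", target.mean())
```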
157

Covariance matrix estimation methods for constrained portfolio optimization in a South African setting

Madume, Jaison Pezisai January 2010 (has links)
One of the major topics of concern in Modern Portfolio Theory is portfolio optimization, which is centred on the mean-variance framework. In order for this framework to be implemented, estimated parameters (the covariance matrix for the constrained portfolio) are required. The problem with these estimated parameters is that they have to be extracted from historical data based on certain assumptions. Because of the different estimation methods that can be used, the parameters thus obtained will suffer either from estimation error or specification error. In order to obtain results that are realistic in the optimization, one needs then to establish covariance matrix estimators that are as good as possible. This paper explores the various covariance matrix estimation methods in a South African setting, focusing on the constrained portfolio. The empirical results show that the Ledoit shrinkage to a constant correlation method, the Principal Component Analysis method and the Portfolio of estimators method all perform as well as the Sample covariance matrix in the Ex-ante period but improve on it slightly in the Ex-post period. However, the improvement is of a small magnitude; as a result, the sample covariance matrix can be used in the constrained portfolio optimization in a South African setting.
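To make the 'Ledoit shrinkage to a constant correlation' idea concrete, the sketch below builds the constant-correlation target and blends it with the sample covariance matrix. A fixed shrinkage intensity is used purely for illustration; Ledoit and Wolf derive a data-driven optimal intensity, which is not reproduced here.

```python
import numpy as np

def constant_correlation_target(returns):
    """Constant-correlation target: sample variances on the diagonal and
    off-diagonals rebuilt from the average pairwise correlation."""
    S = np.cov(returns, rowvar=False)
    std = np.sqrt(np.diag(S))
    corr = S / np.outer(std, std)
    n = corr.shape[0]
    avg_corr = (corr.sum() - n) / (n * (n - 1))   # mean off-diagonal correlation
    F = avg_corr * np.outer(std, std)
    np.fill_diagonal(F, np.diag(S))
    return S, F

def shrunk_covariance(returns, intensity=0.3):
    """Shrink the sample covariance matrix towards the constant-correlation
    target; the intensity here is a placeholder, not an optimal estimate."""
    S, F = constant_correlation_target(returns)
    return intensity * F + (1.0 - intensity) * S

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.02, size=(60, 10))    # 60 periods, 10 assets (synthetic)
Sigma = shrunk_covariance(returns)
print(Sigma.shape, np.allclose(Sigma, Sigma.T))
```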
158

The Ghana Stock Exchange: Concentration, Diversification, Liquidity

Kumi, Eric January 2010 (has links)
Analysts have found that concentration of portfolio weights affects portfolio risk. This is a particular feature of small markets, where portfolios tend to be concentrated in a few stocks, and the Ghana Stock Exchange (GSE) falls in that category. As a result, portfolios based on the Ghana All Share Index are highly concentrated. The risk in a portfolio is mainly attributed to the covariance and weighting structures. Little can be done about the covariance structure, but the weighting structure can be controlled since it depends mainly on investment choices. The weighting structure determines the degree of concentration of a portfolio. The term concentration refers to the extent to which portfolio weights skew away from an equally weighted distribution. As at September 2009, about five (5) of the thirty-five (35) stocks in the Ghana All Share Index accounted for about 82.25% of the index weight. Concentration can be measured using the Herfindahl-Hirschman index (HHI) or the Richard Roll concentration measure (RRC). Diversification is concerned with the generation of returns from different sources. The traditional method of measuring diversification has fallen short of what is usually expected, hence the introduction of a new measure, the portfolio diversification index (PDI). Liquidity measures the effect that the quantity of stocks traded has on the market price of stocks. Liquidity varies from time to time; hence its importance as a source of risk for investors. The primary objective of this project is to determine the significance of concentration in portfolio risk, particularly from the Ghanaian perspective. Furthermore, we will assess diversification using the new measure and finally end with a short review of liquidity in stock markets.
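Of the two concentration measures named above, the Herfindahl-Hirschman index is the simpler to state: it is the sum of squared portfolio weights, equal to 1/N for an equally weighted portfolio of N stocks and approaching 1 as weight piles into a single stock. The sketch below uses hypothetical weights that echo, but do not reproduce, the GSE composition described in the abstract.

```python
import numpy as np

def herfindahl_hirschman_index(weights):
    """HHI concentration measure: sum of squared (normalised) portfolio weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                  # normalise in case weights do not sum to one
    return float(np.sum(w ** 2))

# Hypothetical illustration: five stocks carrying ~82% of a 35-stock index.
concentrated = np.concatenate([np.full(5, 0.8225 / 5), np.full(30, 0.1775 / 30)])
equal = np.full(35, 1.0 / 35)

print(herfindahl_hirschman_index(concentrated))   # well above the 1/35 benchmark
print(herfindahl_hirschman_index(equal))          # 1/35 ≈ 0.0286
```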
159

Recovery theorem: expounded and applied

Backwell, Alex January 2014 (has links)
Includes bibliographical references. / This dissertation is concerned with Ross' (2011) Recovery Theorem. It is generally held that a forward-looking probability distribution is unobtainable from derivative prices, because the market's risk-preferences are conceptually inextricable from the implied real-world distribution. Ross' result recovers this distribution without making the strong preference assumptions assumed necessary under the conventional paradigm. This dissertation aims to give the reader a thorough understanding of Ross Recovery, both from a theoretical and practical point of view. This starts with a formal delineation of the model and proof of the central result, motivated by the informal nature of Ross' working paper. This dissertation relaxes one of Ross' assumptions and arrives at the equivalent conclusion. This is followed by a critique of the model and assumptions. An a priori discussion only goes so far, but potentially problematic assumptions are identified, chief amongst which being the time-additive preferences of a representative agent. Attention is then turned to practical application of the theorem. The author identifies a number of obstacles to applying the result (some of which are somewhat atypical and have not been directly addressed in the literature) and suggests potential solutions. A salient obstacle is calibrating a state price matrix. This leads to an implementation of Ross Recovery on the FTSE/JSE Top40. The suggested approach is found to be workable, though certainly not the final word on the matter. A testing framework for the model is discussed and the dissertation is concluded with a consideration of the findings and the theorem's applicability.
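For intuition on what calibrating a state price matrix buys, the following is a toy numerical sketch of the recovery step itself: given an irreducible non-negative state price matrix, its Perron-Frobenius eigenvector pins down the pricing kernel, and the natural (real-world) transition matrix follows by a diagonal rescaling. The 3-state matrix is hypothetical and is not calibrated to the FTSE/JSE Top40 data used in the dissertation.

```python
import numpy as np

def ross_recovery(state_prices):
    """Recover the discount factor and the real-world transition matrix from a
    state price matrix via its Perron-Frobenius eigenpair."""
    P = np.asarray(state_prices, dtype=float)
    eigvals, eigvecs = np.linalg.eig(P)
    k = np.argmax(eigvals.real)        # Perron-Frobenius eigenvalue (real, positive)
    delta = eigvals[k].real            # recovered one-period discount factor
    z = np.abs(eigvecs[:, k].real)     # positive Perron-Frobenius eigenvector
    # f_ij = (1/delta) * p_ij * z_j / z_i  =>  rows of F sum to one.
    F = (P * z[None, :] / z[:, None]) / delta
    return delta, F

P = np.array([[0.45, 0.35, 0.15],
              [0.30, 0.40, 0.25],
              [0.15, 0.35, 0.45]])
delta, F = ross_recovery(P)
print("discount factor:", round(delta, 4))
print("row sums of recovered probabilities:", F.sum(axis=1))
```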
160

Pairs trading: a copula approach

Augustine, Cecilia January 2014 (has links)
Includes bibliographical references. / Pairs trading is an arbitrage strategy that involves identifying a pair of stocks known to move together historically and trading on them when relative mispricing occurs. The strategy involves shorting the overvalued stock and simultaneously going long on the undervalued stock, and closing the positions once the prices have returned to fair values. The cointegration method and the distance method are the most common techniques used in the pairs trading strategy. However, under these methods, the measure of divergence between the stocks, or the spread, is assumed to be symmetrically distributed about a zero mean. In addition, the spread is assumed to be a stationary time series (cointegration method) or mean-reverting (distance method). These assumptions are the main drawbacks of these methods and may lead to missed and/or inaccurate trading signals. The purpose of this dissertation is to explore an alternative approach to pairs trading by use of copulas. This dissertation aims to investigate whether copulas can improve the profitability of pairs trading. To achieve this aim, results of pairs trading by use of copulas are compared against those of the cointegration and distance methods.
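As a rough illustration of the copula route (not the specific copula families or trading rules examined in the dissertation), the sketch below fits a Gaussian copula to two return series via their normal scores and uses the conditional probability of one stock's return given the other's as the mispricing signal; the entry threshold is an arbitrary placeholder.

```python
import numpy as np
from scipy import stats

def empirical_uniforms(x):
    """Probability-integral transform via the empirical CDF (scaled ranks)."""
    return stats.rankdata(x) / (len(x) + 1.0)

def gaussian_copula_signal(returns_a, returns_b, entry=0.95):
    """Gaussian copula fitted on normal scores; returns a trade suggestion from
    the conditional probability P(U_a <= u_a | U_b = u_b) at the latest point.
    Values near 1 suggest A is rich relative to B, values near 0 suggest cheap."""
    z_a = stats.norm.ppf(empirical_uniforms(returns_a))
    z_b = stats.norm.ppf(empirical_uniforms(returns_b))
    rho = np.corrcoef(z_a, z_b)[0, 1]            # copula parameter
    h = stats.norm.cdf((z_a[-1] - rho * z_b[-1]) / np.sqrt(1.0 - rho ** 2))
    if h > entry:
        return "short A / long B"
    if h < 1.0 - entry:
        return "long A / short B"
    return "no trade"

rng = np.random.default_rng(0)
common = rng.normal(0.0, 0.01, 500)
a = common + rng.normal(0.0, 0.005, 500)     # synthetic co-moving return series
b = common + rng.normal(0.0, 0.005, 500)
print(gaussian_copula_signal(a, b))
```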
