21

'n Ondersoek na die eindige steekproefgedrag van inferensiemetodes in ekstreemwaarde-teorie

Van Deventer, Dewald 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2005. / Extremes are unusual or rare events. However, when such events, for example earthquakes, tidal waves and market crashes, do take place, they typically cause enormous losses, both in terms of human lives and monetary value. For this reason, it is of critical importance to accurately model extremal events. Extreme value theory entails the development of statistical models and techniques in order to describe and model such rare observations. In this document we discuss aspects of extreme value theory. This theory consists of two approaches: the classical maxima method, based on the properties of the maximum of a sample, and the more popular threshold theory, based upon the properties of exceedances of a specified threshold value. This document provides the practitioner with the theoretical and practical tools for both these approaches, enabling him/her to perform extreme value analyses with confidence. Extreme value theory, for both approaches, is based upon asymptotic arguments. For finite samples, the limiting result for the sample maximum holds only approximately. Similarly, for finite choices of the threshold, the limiting distribution for exceedances of that threshold holds only approximately. In this document we investigate the quality of extreme value based inferences with regard to the unknown underlying distribution when the sample size or threshold is finite. Estimation of extreme tail quantiles of the underlying distribution, as well as the calculation of confidence intervals, are typically the most important objectives of an extreme value analysis. For that reason, we evaluate the accuracy of extreme value based inferences in terms of these estimates. This investigation was carried out using a simulation study, performed with the software package S-Plus.
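The peaks-over-threshold approach described in this abstract can be illustrated with a short sketch. This is not the thesis code (which used S-Plus); it is a minimal Python example, with simulated heavy-tailed data standing in for real observations, that fits a generalised Pareto distribution to exceedances of a finite threshold and plugs the fitted parameters into the standard tail-quantile estimator.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
sample = rng.standard_t(df=4, size=5000)           # heavy-tailed toy data (assumption)

u = np.quantile(sample, 0.95)                      # finite threshold choice
exceedances = sample[sample > u] - u
xi, _, sigma = genpareto.fit(exceedances, floc=0)  # GPD shape (xi) and scale (sigma)

def tail_quantile(p, u, xi, sigma, n, n_u):
    """Peaks-over-threshold estimator of the p-th quantile of the underlying distribution."""
    return u + (sigma / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1.0)

q999 = tail_quantile(0.999, u, xi, sigma, len(sample), len(exceedances))
print(f"threshold = {u:.3f}, xi = {xi:.3f}, sigma = {sigma:.3f}, 99.9% quantile = {q999:.3f}")
```

In a finite-sample study of the kind the thesis describes, this fit-and-estimate step would be repeated over many simulated samples and threshold choices to assess how far the estimated quantiles and confidence intervals drift from their asymptotic targets.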
22

Confidence intervals for estimators of welfare indices under complex sampling

Kirchoff, Retha 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2010. / ENGLISH ABSTRACT: The aim of this study is to obtain estimates and confidence intervals for welfare indices under complex sampling. It begins by looking at sampling in general with specific focus on complex sampling and weighting. For the estimation of the welfare indices, two resampling techniques, viz. jackknife and bootstrap, are discussed. They are used for the estimation of bias and standard error under simple random sampling and complex sampling. Three confidence intervals are discussed, viz. standard (asymptotic), percentile and bootstrap-t. An overview of welfare indices and their estimation is given. The indices are categorized into measures of poverty and measures of inequality. Two Laeken indices, viz. at-risk-of-poverty and quintile share ratio, are included in the discussion. The study considers two poverty lines, namely an absolute poverty line based on percy (ratio of total household income to household size) and a relative poverty line based on equivalized income (ratio of total household income to equivalized household size). The data set used as surrogate population for the study is the Income and Expenditure survey 2005/2006 conducted by Statistics South Africa, and details of it are provided and discussed. An analysis of simulation data from the surrogate population was carried out using the techniques mentioned above and the results were graphed, tabulated and discussed. Two issues were considered, namely whether the design of the survey should be considered and whether resampling techniques provide reliable results, especially for confidence intervals. The results were a mixed bag. Overall, however, it was found that weighting showed promise in many cases, especially in the improvement of the coverage probabilities of the confidence intervals. It was also found that the bootstrap resampling technique was reliable (by looking at standard errors). Further research options are mentioned as possible solutions to the mixed results. / AFRIKAANSE OPSOMMING: Die doel van die studie is die verkryging van beramings en vertrouensintervalle vir welvaartsmaatstawwe onder komplekse steekproefneming. 'n Algemene bespreking van steekproefneming word gedoen waar daar spesifiek op komplekse steekproefneming en weging gefokus word. Twee hersteekproefnemingstegnieke, nl. uitsnit (jackknife)- en skoenlushersteekproefneming, word bespreek as metodes vir die beraming van die maatstawwe. Hierdie maatstawwe word gebruik vir sydigheidsberaming asook die beraming van standaardfoute in eenvoudige ewekansige steekproefneming asook komplekse steekproefneming. Drie vertrouensintervalle word bespreek, nl. die standaard (asimptotiese), die persentiel en die bootstrap-t vertrouensintervalle. Daar is ook 'n oorsigtelike bespreking oor welvaartsmaatstawwe en die beraming daarvan. Hierdie welvaartsmaatstawwe vorm twee kategorieë, nl. maatstawwe van armoede en maatstawwe van ongelykheid. Ook ingesluit by hierdie bespreking is die at-risk-of-poverty en quintile share ratio wat deel vorm van die Laekenindekse. Twee armoedemaatlyne, 'n absolute en 'n relatiewe maatlyn, word in hierdie studie gebruik. Die absolute armoedemaatlyn word gebaseer op percy, die verhouding van die totale huishoudingsinkomste tot die grootte van die huishouding, terwyl die relatiewe armoedemaatlyn gebaseer word op equivalized income, die verhouding van die totale huishoudingsinkomste tot die equivalized grootte van die huishouding.
Die datastel wat as surrogaat populasie gedien het in hierdie studie is die Inkomste en Uitgawe opname van 2005/2006 wat deur Statistiek Suid-Afrika uitgevoer is. Inligting met betrekking tot hierdie opname word ook gegee. Gesimuleerde data vanuit die surrogaat populasie is geanaliseer deur middel van die hersteekproefnemingstegnieke wat genoem is. Die resultate van die simulasie is deur middel van grafieke en tabelle aangedui en bespreek. Vanuit die simulasie het twee vrae opgeduik, nl. of die ontwerp van 'n steekproef, dus weging, in ag geneem behoort te word en of die hersteekproefnemingstegnieke betroubare resultate lewer, veral in die geval van die vertrouensintervalle. Die resultate wat verkry is, het baie gevarieer. Daar is egter bepaal dat weging in die algemeen belowende resultate opgelewer het vir baie van die gevalle, maar nie vir almal nie. Dit het veral die dekkingswaarskynlikhede van die vertrouensintervalle verbeter. Daar is ook bepaal, deur na die standaardfoute van die skoenlusberamers te kyk, dat die skoenlustegniek betroubare resultate gelewer het. Verdere navorsingsmoontlikhede is genoem as potensiële verbeteringe op die gemengde resultate wat verkry is.
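As a rough illustration of the resampling ideas discussed above, the sketch below computes a weighted at-risk-of-poverty rate and a bootstrap percentile confidence interval for it. The data, weights and the naive household-level bootstrap are assumptions for the example only; a design-based bootstrap of the kind needed for complex samples would resample primary sampling units within strata.

```python
import numpy as np

rng = np.random.default_rng(1)
income = rng.lognormal(mean=8.0, sigma=0.9, size=2000)   # toy household incomes (assumption)
weights = rng.uniform(0.5, 3.0, size=2000)               # toy survey weights (assumption)

def arpr(income, weights):
    """Weighted at-risk-of-poverty rate: share of weight below 60% of the weighted median."""
    order = np.argsort(income)
    cum_w = np.cumsum(weights[order])
    median = income[order][np.searchsorted(cum_w, 0.5 * cum_w[-1])]
    poverty_line = 0.6 * median
    return np.sum(weights[income < poverty_line]) / np.sum(weights)

point = arpr(income, weights)
boot = np.empty(1000)
for b in range(1000):                                    # naive bootstrap over households;
    idx = rng.integers(0, len(income), len(income))      # a design-based version would
    boot[b] = arpr(income[idx], weights[idx])            # resample PSUs within strata
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"ARPR = {point:.3f}, 95% percentile CI = ({lo:.3f}, {hi:.3f})")
```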
23

Calculation aspects of the European Rebalanced Basket Option using Monte Carlo methods

Van der Merwe, Carel Johannes 12 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2010. / ENGLISH ABSTRACT: Life insurance and pension funds offer a wide range of products that are invested in a mix of assets. These portfolios (Π), underlying the products, are rebalanced back to predetermined fixed proportions on a regular basis. This is done by selling the better performing assets and buying the worse performing assets. Life insurance or pension fund contracts can offer the client a minimum payout guarantee on the contract by charging them an extra premium (α). This problem can be changed to that of the pricing of a put option with underlying Π. It forms a liability for the insurance firm, and therefore needs to be managed in terms of risks as well. This can be done by studying the option's sensitivities. In this thesis the premium and sensitivities of this put option are calculated, using different Monte Carlo methods, in order to find the most efficient method. Using general Monte Carlo methods, a simplistic pricing method is found, which is refined by applying mathematical techniques so that the computational time is reduced significantly. After considering Antithetic Variables, Control Variates and Latin Hypercube Sampling as variance reduction techniques, option prices as Control Variates prove to reduce the error of the refined method most efficiently. This is improved by considering different Quasi-Monte Carlo techniques, namely Halton, Faure, normal Sobol' and other randomised Sobol' sequences. Owen and Faure-Tezuke type randomised Sobol' sequences improved the convergence of the estimator most efficiently. Furthermore, the best methods between Pathwise Derivative Estimates and Finite Difference Approximations for estimating sensitivities of this option are found. Therefore, by using the refined pricing method with option prices as Control Variates together with Owen and Faure-Tezuke type randomised Sobol' sequences as a Quasi-Monte Carlo method, more efficient methods to price this option (compared to simplistic Monte Carlo methods) are obtained. In addition, more efficient sensitivity estimators are obtained to help manage risks. / AFRIKAANSE OPSOMMING: Lewensversekering- en pensioenfondse bied die mark 'n wye reeks produkte wat belê word in 'n mengsel van bates. Hierdie portefeuljes (Π), onderliggend aan die produkte, word op 'n gereelde basis terug herbalanseer volgens voorafbepaalde vaste proporsies. Dit word gedoen deur bates wat beter opbrengste gehad het te verkoop, en bates met swakker opbrengste aan te koop. Lewensversekering- of pensioenfondskontrakte kan 'n kliënt 'n verdere minimum uitbetaling aan die einde van die kontrak waarborg deur 'n ekstra premie (α) op die kontrak te vra. Die probleem kan verander word na die prysing van 'n verkoopopsie met onderliggende bate Π. Hierdie vorm deel van die versekeringsmaatskappy se laste en moet dus ook bestuur word in terme van sy risiko's. Dit kan gedoen word deur die opsie se sensitiwiteite te bestudeer. In hierdie tesis word die premie en sensitiwiteite van die verkoopopsie met behulp van verskillende Monte Carlo metodes bereken, om sodoende die effektiefste metode te vind. Deur die gebruik van algemene Monte Carlo metodes word 'n simplistiese prysingsmetode, wat verfyn is met behulp van wiskundige tegnieke wat die berekeningstyd wesenlik verminder, gevind.
Nadat Antitetiese Veranderlikes, Kontrole Variate en Latynse Hiperkubus Steekproefneming as variansiereduksietegnieke oorweeg is, word gevind dat die verfynde metode se fout die effektiefste verminder met behulp van opsiepryse as Kontrole Variate. Dit word verbeter deur verskillende Quasi-Monte Carlo tegnieke, naamlik Halton, Faure, normale Sobol’ en ander verewekansigde Sobol’ reekse, te vergelyk. Die Owen en Faure-Tezuke tipe verewekansigde Sobol’ reeks verbeter die konvergensie van die beramer die effektiefste. Verder is die beste metode tussen Baanafhanklike Afgeleide Beramers en Eindige Differensie Benaderings om die sensitiwiteit vir die opsie te bepaal, ook gevind. Deur dus die verfynde prysingsmetode met opsiepryse as Kontrole Variate, saam met Owen en Faure-Tezuke tipe verewekansigde Sobol’ reekse as ’n Quasi-Monte Carlo metode te gebruik, word meer effektiewe metodes om die opsie te prys, gevind (in vergelyking met simplistiese Monte Carlo metodes). Verder is meer effektiewe sensitiwiteitsberamers as voorheen gevind wat gebruik kan word om risiko’s te help bestuur.
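A hedged Monte Carlo sketch of the kind of pricing problem this record describes: a European put on a two-asset portfolio rebalanced back to fixed weights at discrete dates, with the analytic Black-Scholes price of the continuously rebalanced (lognormal) portfolio used as a control variate. The two-asset setting, the monthly rebalancing and all parameter values are assumptions for illustration, not the thesis setup.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
S0, K, r, T = 100.0, 100.0, 0.05, 1.0
sig = np.array([0.20, 0.30])                   # assumed asset volatilities
w = np.array([0.5, 0.5])                       # fixed target weights
rho = 0.4
steps, n_paths = 12, 50_000                    # monthly rebalancing (assumption)
dt = T / steps
corr = np.array([[1.0, rho], [rho, 1.0]])
L = np.linalg.cholesky(corr)

def bs_put(S0, K, r, sigma, T):
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return K * np.exp(-r * T) * norm.cdf(-d2) - S0 * norm.cdf(-d1)

# volatility of the continuously rebalanced fixed-weight portfolio (itself lognormal)
sig_p = np.sqrt(w @ (np.outer(sig, sig) * corr) @ w)

V_disc = np.full(n_paths, S0)                  # discretely rebalanced portfolio (target)
logV_cont = np.full(n_paths, np.log(S0))       # continuously rebalanced portfolio (control)
for _ in range(steps):
    Z = rng.standard_normal((n_paths, 2)) @ L.T
    growth = np.exp((r - 0.5 * sig**2) * dt + sig * np.sqrt(dt) * Z)
    V_disc *= growth @ w                       # rebalance: weighted gross asset returns
    logV_cont += (r - 0.5 * sig_p**2) * dt + np.sqrt(dt) * (Z @ (w * sig))

disc = np.exp(-r * T)
Y = disc * np.maximum(K - V_disc, 0.0)                 # payoff being priced
X = disc * np.maximum(K - np.exp(logV_cont), 0.0)      # control payoff with known mean
EX = bs_put(S0, K, r, sig_p, T)
cov = np.cov(Y, X)
beta = cov[0, 1] / cov[1, 1]
price_cv = Y.mean() - beta * (X.mean() - EX)
print(f"plain MC: {Y.mean():.4f}   with control variate: {price_cv:.4f}")
```

Replacing the pseudo-random draws with randomised Sobol' points would follow the same structure; only the generation of Z changes.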
24

Non-parametric regression modelling of in situ fCO2 in the Southern Ocean

Pretorius, Wesley Byron 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: The Southern Ocean is a complex system, where the relationship between CO2 concentrations and its drivers varies intra- and inter-annually. Due to the lack of readily available in situ data in the Southern Ocean, a model approach was required which could predict the CO2 concentration proxy variable, fCO2. This must be done using predictor variables available via remote measurements to ensure the usefulness of the model in the future. These predictor variables were sea surface temperature, log transformed chlorophyll-a concentration, mixed layer depth and, at a later stage, altimetry. Initial exploratory analysis indicated that a non-parametric approach to the model should be taken. A parametric multiple linear regression model was developed to use as a comparison to previous studies in the North Atlantic Ocean as well as to compare with the results of the non-parametric approach. A non-parametric kernel regression model was then used to predict fCO2 and finally a combination of the parametric and non-parametric regression models was developed, referred to as the mixed regression model. The results indicated, as expected from exploratory analyses, that the non-parametric approach produced more accurate estimates based on an independent test data set. These more accurate estimates, however, were coupled with zero estimates, caused by the curse of dimensionality. It was also found that the inclusion of salinity (not available remotely) improved the model and therefore altimetry was chosen to attempt to capture this effect in the model. The mixed model displayed reduced errors as well as removing the zero estimates and hence reducing the variance of the error rates. The results indicated that the mixed model is the best approach to use to predict fCO2 in the Southern Ocean and that altimetry's inclusion did improve the prediction accuracy. / AFRIKAANSE OPSOMMING: Die Suidelike Oseaan is 'n komplekse sisteem waar die verhouding tussen CO2 konsentrasies en die drywers daarvoor intra- en interjaarliks varieer. 'n Tekort aan maklik verkrygbare in situ data van die Suidelike Oseaan het daartoe gelei dat 'n model benadering nodig was wat die CO2 konsentrasie plaasvervangerveranderlike, fCO2, kon voorspel. Dié moet gedoen word deur gebruik te maak van voorspellende veranderlikes, beskikbaar deur middel van afgeleë metings, om die bruikbaarheid van die model in die toekoms te verseker. Hierdie voorspellende veranderlikes het ingesluit see-oppervlaktetemperatuur, log getransformeerde chlorofil-a konsentrasie, gemengde laag diepte en, op 'n latere stadium, hoogtemeting. 'n Aanvanklike, ondersoekende analise het aangedui dat 'n nie-parametriese benadering tot die data geneem moet word. 'n Parametriese meerfoudige lineêre regressie model is ontwikkel om met die vorige studies in die Noord-Atlantiese Oseaan asook met die resultate van die nie-parametriese benadering te vergelyk. 'n Nie-parametriese kern regressie model is toe ingespan om die fCO2 te voorspel en uiteindelik is 'n kombinasie van die parametriese en nie-parametriese regressie modelle ontwikkel vir dieselfde doel, wat na verwys word as die gemengde regressie model. Die resultate het aangetoon, soos verwag uit die ondersoekende analise, dat die nie-parametriese benadering meer akkurate beramings lewer, gebaseer op 'n onafhanklike toets datastel. Dié meer akkurate beramings het egter met "nul"-beramings gepaardgegaan wat veroorsaak word deur die vloek van dimensionaliteit.
Daar is ook gevind dat die insluiting van soutgehalte (nie via satelliet beskikbaar nie) die model verbeter en juis daarom is hoogtemeting gekies om te poog om hierdie effek in die model vas te vang. Die gemengde model het kleiner foute getoon asook die "nul"-beramings verwyder en sodoende die variasie van die foutkoerse verminder. Die resultate het dus aangetoon dat die gemengde model die beste benadering is om te gebruik om die fCO2 in die Suidelike Oseaan te beraam en dat die insluiting van hoogtemeting die akkuraatheid van hierdie beraming verbeter.
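The non-parametric estimator this record refers to can be sketched as a Nadaraya-Watson kernel regression with a Gaussian product kernel. The toy predictors stand in for sea surface temperature, log chlorophyll-a and mixed layer depth; the bandwidth and data are assumptions, and the NaN branch mirrors the "zero estimates" problem the abstract mentions when no training point lies near a query point.

```python
import numpy as np

rng = np.random.default_rng(3)
# toy stand-ins for (SST, log chlorophyll-a, mixed layer depth) and fCO2 (assumptions)
X = rng.normal(size=(500, 3))
y = 380 + 5 * X[:, 0] - 3 * X[:, 1] + 2 * X[:, 2] ** 2 + rng.normal(scale=2, size=500)

def nw_predict(X_train, y_train, X_new, bandwidth):
    """Nadaraya-Watson estimate with a Gaussian product kernel."""
    preds = np.empty(len(X_new))
    for i, x in enumerate(X_new):
        u = (X_train - x) / bandwidth
        k = np.exp(-0.5 * np.sum(u ** 2, axis=1))        # product Gaussian kernel weights
        preds[i] = np.sum(k * y_train) / np.sum(k) if np.sum(k) > 0 else np.nan
    return preds

X_test = rng.normal(size=(100, 3))
y_hat = nw_predict(X, y, X_test, bandwidth=0.5)
print("first five predictions:", np.round(y_hat[:5], 2))
```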
25

A brief introduction to basic multivariate economic statistical process control

Mudavanhu, Precious 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: Statistical process control (SPC) plays a very important role in monitoring and improving industrial processes to ensure that products produced or shipped to the customer meet the required specifications. The main tool that is used in SPC is the statistical control chart. The traditional way of statistical control chart design assumed that a process is described by a single quality characteristic. However, according to Montgomery and Klatt (1972), industrial processes and products can have more than one quality characteristic and their joint effect describes product quality. Process monitoring in which several related variables are of interest is referred to as multivariate statistical process control (MSPC). The most vital and commonly used tool in MSPC is the statistical control chart, as in the case of SPC. The design of a control chart requires the user to select three parameters, which are: the sample size, n, the sampling interval, h, and the control limits, k. Several authors have developed control charts based on more than one quality characteristic, among them Hotelling (1947), who pioneered the use of multivariate process control techniques through the development of a T²-control chart, which is well known as Hotelling's T²-control chart. Since the introduction of the control chart technique, the most common and widely used method of control chart design has been the statistical design. However, according to Montgomery (2005), the design of control charts also has economic implications. There are costs that are incurred during the design of a control chart, and these are: costs of sampling and testing, costs associated with investigating an out-of-control signal and possible correction of any assignable cause found, costs associated with the production of nonconforming products, etc. This paper gives an overview of the different methods or techniques that have been employed to develop the different economic statistical models for MSPC. The first multivariate economic model presented in this paper is the economic design of Hotelling's T²-control chart to maintain current control of a process, developed by Montgomery and Klatt (1972). This is followed by the work done by Kapur and Chao (1996), in which the concept of creating a specification region for the multiple quality characteristics, together with the use of a multivariate quality loss function, is implemented to minimize total loss to both the producer and the customer. Another approach, by Chou et al. (2002), is also presented, in which a procedure is developed that simultaneously monitors the process mean and covariance matrix through the use of a quality loss function. The procedure is based on the test statistic 2 ln L, and the cost model is based on the ideas of Montgomery and Klatt (1972) as well as Kapur and Chao (1996). One example of the use of the variable sample size technique in the economic and economic statistical design of the control chart will also be presented. Specifically, an economic and economic statistical design of the T²-control chart with two adaptive sample sizes (Faraz et al., 2010) will be presented. Faraz et al. (2010) developed a cost model of a variable sample size T²-control chart for the economic and economic statistical design using Lorenzen and Vance's (1986) model.
There are several other approaches to the multivariate economic statistical process control (MESPC) problem, but in this project the focus is on cases based on the phase II stage of the process, where the mean vector and the covariance matrix have been fairly well established and can be taken as known, but both are subject to assignable causes. This latter aspect is often ignored by researchers. Nevertheless, the article by Faraz et al. (2010) is included to give more insight into how more sophisticated approaches may fit in with MESPC, even if only the mean vector may be subject to assignable causes. Keywords: control chart; statistical process control; multivariate statistical process control; multivariate economic statistical process control; multivariate control chart; loss function. / AFRIKAANSE OPSOMMING: Statistiese proses kontrole (SPK) speel 'n baie belangrike rol in die monitering en verbetering van industriële prosesse om te verseker dat produkte wat vervaardig word, of na kliënte versend word, wel aan die vereiste voorwaardes voldoen. Die vernaamste tegniek wat in SPK gebruik word, is die statistiese kontrolekaart. Die tradisionele wyse waarop statistiese kontrolekaarte ontwerp is, aanvaar dat 'n proses deur slegs 'n enkele kwaliteitsveranderlike beskryf word. Montgomery and Klatt (1972) beweer egter dat industriële prosesse en produkte meer as een kwaliteitseienskap kan hê en dat hulle gesamentlik die kwaliteit van 'n produk kan beskryf. Proses monitering waarin verskeie verwante veranderlikes van belang mag wees, staan as meerveranderlike statistiese proses kontrole (MSPK) bekend. Die mees belangrike en algemene tegniek wat in MSPK gebruik word, is ewe eens die statistiese kontrolekaart, soos dit die geval is by SPK. Die ontwerp van 'n kontrolekaart vereis van die gebruiker om drie parameters te kies, wat soos volg is: steekproefgrootte, n, tussensteekproefinterval, h, en kontrolegrense, k. Verskeie skrywers het kontrolekaarte ontwikkel wat op meer as een kwaliteitseienskap gebaseer is, waaronder Hotelling wat die gebruik van meerveranderlike proses kontrole tegnieke ingelei het met die ontwikkeling van die T²-kontrolekaart wat algemeen bekend is as Hotelling se T²-kontrolekaart (Hotelling, 1947). Sedert die ingebruikneming van die kontrolekaart tegniek is die statistiese ontwerp daarvan die mees algemene benadering en is dit ook in daardie formaat gebruik. Nietemin, volgens Montgomery and Klatt (1972) en Montgomery (2005), het die ontwerp van die kontrolekaart ook ekonomiese implikasies. Daar is kostes betrokke by die ontwerp van die kontrolekaart en daar is ook die kostes t.o.v. steekproefneming en toetsing, kostes geassosieer met die ondersoek van 'n buite-kontrole-sein, en moontlike herstel indien enige moontlike korreksie van so 'n buite-kontrole-sein gevind word, kostes geassosieer met die produksie van niekonforme produkte, ens. In die eenveranderlike geval is die hantering van die ekonomiese eienskappe al in diepte ondersoek. Hierdie werkstuk gee 'n oorsig oor sommige van die verskillende metodes of tegnieke wat al daargestel is t.o.v. verskillende ekonomiese statistiese modelle vir MSPK. In die besonder word aandag gegee aan die gevalle waar die vektor van gemiddeldes sowel as die kovariansiematriks onderhewig is aan potensiële verskuiwings, in teenstelling met 'n neiging om slegs na die vektor van gemiddeldes in isolasie te kyk synde onderhewig aan moontlike verskuiwings te wees.
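A compact sketch of the phase II setting described above: with the in-control mean vector and covariance matrix taken as known, the Hotelling T² statistic for each subgroup mean is compared against a chi-squared control limit. The parameter values and subgroup sizes are assumptions for illustration; an economic or economic statistical design would instead choose n, h and k by minimising a cost model of the Lorenzen and Vance type rather than fixing them up front.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(4)
p, n, alpha = 2, 5, 0.005                      # two characteristics, subgroups of 5 (assumed)
mu0 = np.array([10.0, 20.0])                   # in-control mean vector (taken as known)
Sigma0 = np.array([[1.0, 0.6],
                   [0.6, 2.0]])                # in-control covariance matrix (taken as known)
Sigma0_inv = np.linalg.inv(Sigma0)
ucl = chi2.ppf(1 - alpha, df=p)                # control limit when parameters are known

for t in range(10):                            # ten subgroups of simulated in-control data
    sample = rng.multivariate_normal(mu0, Sigma0, size=n)
    xbar = sample.mean(axis=0)
    t2 = n * (xbar - mu0) @ Sigma0_inv @ (xbar - mu0)
    status = "OUT of control" if t2 > ucl else "in control"
    print(f"subgroup {t + 1}: T2 = {t2:6.3f}  ({status}, UCL = {ucl:.3f})")
```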
26

The effect of liquidity on stock returns on the JSE

Reisinger, Astrid Kim 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: This thesis examines the effect of liquidity on excess stock returns on the Johannesburg Stock Exchange (JSE) over the period 2003 to 2011. It builds on the findings of previous studies that found size, value and momentum effects to be significant in explaining market anomalies by adding a further explanatory factor, namely liquidity. A standard CAPM, as well as a momentum-augmented Fama-French (1993: 3) model, is employed to perform regression analyses to examine the effect of the four variables on excess stock returns. Results suggested that the log of the stock's market value best captured the size effect, the earnings yield best captured the value effect and the previous three months' returns best captured the momentum effect. Five liquidity proxies are used: the bid-ask spread first proposed by Amihud (1986: 223), turnover, the price impact measure of Amihud (2002: 31) and two zero return measures proposed by Lesmond et al. (1999: 1113). Despite prior studies having found liquidity to be an influential factor, this thesis found the opposite to be true. This finding remains robust, irrespective of the type of liquidity measure used. While size, value and momentum are found to be significant to a certain extent in explaining excess stock returns over the period, liquidity is not found to be significant. This is a surprising result, given that the JSE is seen as an emerging market, which is generally regarded as illiquid. It is exacerbated by the fact that the JSE is a highly concentrated and therefore skewed market that is dominated by only a handful of shares. Hence liquidity is expected to be of utmost importance. The result that liquidity is, however, not a priced factor on this market is therefore an important finding that requires further analysis to determine why this is the case. In addition, significant non-zero intercepts remained, indicating continued missing risk factors. / AFRIKAANSE OPSOMMING: In hierdie tesis word die effek van likiditeit op oormaat aandeel-opbrengste op die Johannesburg Effektebeurs (JEB) ondersoek gedurende die periode 2003 tot 2011. Dit bou voort op die bevindinge van vorige studies wat toon dat grootte, waarde en momentum beduidend is in die verklaring van mark onreëlmatighede deur 'n addisionele verklarende faktor, likiditeit, toe te voeg. 'n Standaard kapitaalbateprysingsmodel (KBPM) sowel as 'n momentum-aangepaste Fama-French (1993: 3) model word gebruik om deur middel van regressie analise die effek van die vier veranderlikes op oormaat aandeel-opbrengste te ondersoek. Die resultate toon dat die grootte effek die beste verteenwoordig word deur die logaritme van die aandeel se mark kapitalisasie, die verdienste-opbrengs verteenwoordig die waarde effek en die vorige drie-maande opbrengskoerse verteenwoordig die momentum effek die beste. Vyf likiditeitsveranderlikes is gebruik: bod-en-aanbod spreiding voorgestel deur Amihud (1986: 223), omset, die prys-impak maatstaf van Amihud (2002: 31) en twee nul-opbrengskoers maatstawwe voorgestel deur Lesmond et al. (1999: 1113). Afgesien van die feit dat vorige studies die effek van likiditeit beduidend vind, word die teenoorgestelde in hierdie tesis gevind. Hierdie bevinding bly robuus, ongeag van die likiditeitsveranderlike wat gebruik word.
Terwyl bevind is dat grootte, waarde en momentum tot 'n sekere mate beduidend is in die verklaring van oormaat aandeel-opbrengste tydens die periode, is geen aanduiding gevind dat likiditeit 'n addisionele beduidende verklarende faktor is nie. Hierdie bevinding is onverwags, aangesien die JEB beskou word as 'n ontluikende mark, wat normaalweg illikied is. Hierdie feit word vererger deurdat die JEB hoogs gekonsentreerd is en dus 'n skewe mark is wat oorheers word deur slegs 'n handvol aandele. Dus word verwag dat likiditeit 'n baie belangrike faktor behoort te wees. Die bevinding dat likiditeit nie 'n prysingsfaktor op hierdie mark is nie, is dus 'n belangrike bevinding en vereis verdere analise om vas te stel waarom dit die geval is. Addisioneel word beduidende nie-nul afsnitte verkry, wat aandui dat daar steeds risikofaktore is wat nog nie geïdentifiseer is nie.
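A toy version of the kind of regression this thesis runs is sketched below: excess returns regressed on size, value, momentum and a liquidity proxy. The simulated data (in which liquidity deliberately carries no premium, echoing the finding above) and the factor construction are assumptions; the thesis works with JSE returns and the specific proxies listed in the abstract.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
size = rng.normal(size=n)          # log market value (toy)
value = rng.normal(size=n)         # earnings yield (toy)
momentum = rng.normal(size=n)      # prior three-month return (toy)
liquidity = rng.normal(size=n)     # e.g. bid-ask spread or turnover proxy (toy)
# simulated excess returns in which liquidity carries no premium
excess_ret = 0.02 * size - 0.01 * value + 0.03 * momentum + rng.normal(scale=0.10, size=n)

X = sm.add_constant(np.column_stack([size, value, momentum, liquidity]))
fit = sm.OLS(excess_ret, X).fit()
print(fit.summary(xname=["const", "size", "value", "momentum", "liquidity"]))
```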
27

Portfolio Opportunity Distributions (PODs) for the South African market : based on regulation requirements

Nortje, Hester Maria 04 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2014. / ENGLISH ABSTRACT: In this study Portfolio Opportunity Distributions (PODs) is applied as an alternative performance evaluation method. Traditionally, Broad-Market Indices or peer group comparisons are used to perform performance evaluation. These methods, however, have various biases and other problems related to their use. These biases and problems include composition bias, classification bias, concentration, etc. R.J. Surz (1994) introduced PODs in order to eliminate some of these problems. Each fund has its own opportunity set based on its style mandate and constraints. The style mandate of the fund is determined by calculating the fund's exposure to the nine Surz Style Indices through the use of Returns-Based Style Analysis (RBSA). The indices are created based on the style proposed by R.J. Surz (1994). Some adjustments were made to incorporate the unique nature of the South African equity market. The combination of the fund's exposures to the indices best explains the return that the fund generated. In this paper the fund's constraints are based on the regulation requirements imposed on the funds in South Africa by the Collective Investment Schemes Control Act No. 45 of 2002 (CISCA). Thousands of random portfolios are then generated based on the fund's opportunity set. The return and risk of the simulated portfolios represent the possible investment outcomes that the manager could have achieved given its opportunity set. Together the return and risk of the simulated portfolios represent a range of possible outcomes against which the performance of the fund is compared. It is also possible to determine the skill of the manager, since it can be concluded that a manager who consistently outperforms most of the simulated portfolios shows skill in selecting shares to be included in the portfolio and assigning the correct weights to these shares. The South African Rand depreciated quite a bit during the period under evaluation and therefore funds invested large portions of their assets in foreign investments. These investments mostly yielded very high or very low returns compared to the returns available in the domestic equity market, which impacted the application of PODs. Although the PODs methodology shows great potential, it is impossible to conclude with certainty whether it is superior to the traditional methods based on the current data. / AFRIKAANSE OPSOMMING: In hierdie studie word Portefeulje Geleentheids Verdelings ("PODs") bekendgestel as 'n alternatiewe manier om die opbrengste van bestuurders te evalueer. Gewoonlik word indekse en die vergelyking van die fonds met soortgelyke fondse gebruik om fondse te evalueer. Die metodes het egter verskeie probleme wat met die gebruik daarvan verband hou. Die probleme sluit onder andere in: die samestelling en klassifikasie van soortgelyke fondse, die konsentrasie in die mark, ens. R.J. Surz (1994) het dus Portefeulje Geleentheids Verdelings ("PODs") bekendgestel in 'n poging om sommige van die probleme te elimineer. Elke fonds het sy eie unieke geleentheids versameling wat gebaseer is op die fonds se styl en enige beperkings wat op die fonds van toepassing is. Die fonds se styl word bepaal deur die fonds se blootstelling aan die nege Surz Styl Indekse te meet met behulp van opbrengs-gebaseerde styl analise ("RBSA"). Die indekse is geskep gebaseer op die metode wat deur R.J. Surz (1994) voorgestel is.
Daar is egter aanpassings gemaak om die unieke aard van die Suid-Afrikaanse aandelemark in ag te neem. Die kombinasie van die fonds se blootstelling aan die indekse verduidelik waar die fonds se opbrengs vandaan kom. In die navorsingstuk is die beperkings wat van toepassing is op die fonds afkomstig uit die regulasievereistes wat deur die "Collective Investment Schemes Control Act No. 45 of 2002 (CISCA)" in Suid-Afrika op fondse van toepassing is. Duisende ewekansige portefeuljes word dan gegenereer, gebaseer op die fonds se unieke groep aandele waarin die fonds kan belê. Die opbrengs en risiko van die gesimuleerde portefeuljes verteenwoordig al die moontlike beleggingsuitkomste wat die fondsbestuurder kon gegenereer het gegewe die fonds se unieke groep aandele waarin dit kon belê. Die opbrengs en risiko van al die gesimuleerde portefeuljes skep saam 'n verdeling van moontlike beleggingsuitkomste waarteen die opbrengs en risiko van die fonds vergelyk word. Hierdie proses maak dit moontlik om die fondsbestuurder se vermoë om beter as die meeste van die gesimuleerde portefeuljes te presteer, te bepaal. Die aanname kan gemaak word dat 'n bestuurder wat konsekwent oor tyd beter as die meeste van die gesimuleerde portefeuljes presteer, oor die vermoë beskik om die regte aandele te kies om in die portefeulje in te sluit en ook die regte gewigte aan die aandele toe te ken. Die Suid-Afrikaanse Rand het heelwat gedepresieer tydens die evaluasieperiode en daarom het fondse groot porsies van hul beleggings oorsee belê. Die beleggings het dus óf heelwat groter óf heelwat kleiner opbrengste gehad in vergelyking met die opbrengste beskikbaar in die plaaslike aandelemark en dit het die toepassing van PODs beïnvloed. PODs toon baie potensiaal, maar dit is egter onmoontlik om met die huidige datastel vas te stel of dit 'n beter metode is.
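A much-simplified sketch of the PODs idea described in this record: draw many random portfolios from an opportunity set, apply a per-share weight cap standing in for the CISCA-style constraints, and read off where the fund's realised return falls in the simulated distribution. The 15% cap, the Dirichlet sampler and the toy returns are assumptions, not the construction used in the study.

```python
import numpy as np

rng = np.random.default_rng(6)
n_assets, n_sims, cap = 40, 10_000, 0.15                 # hypothetical 15% per-share cap
asset_returns = rng.normal(0.12, 0.25, size=n_assets)    # toy annual returns (assumption)

def random_weights():
    """Draw long-only weights summing to one, rejecting draws that breach the cap."""
    w = rng.dirichlet(np.ones(n_assets))
    while w.max() > cap:
        w = rng.dirichlet(np.ones(n_assets))
    return w

sim_returns = np.array([random_weights() @ asset_returns for _ in range(n_sims)])
fund_return = 0.15                                       # hypothetical realised fund return
percentile = (sim_returns < fund_return).mean() * 100
print(f"the fund outperformed {percentile:.1f}% of its simulated opportunity set")
```

Repeating this comparison over successive periods is what allows a consistently high percentile rank to be read as evidence of manager skill rather than luck.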
28

Optimal asset allocation for South African pension funds under the revised Regulation 28

Koegelenberg, Frederik Johannes 03 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: On 1 July 2011 the revised version of Regulation 28, which governs the South African pension fund industry with regard to investments, took effect. The new version allows pension funds to invest up to 25 percent of their total investment in foreign assets, compared to 20 percent in the previous version. The aim of this study is to determine whether it would be optimal for a South African pension fund to invest the full 25 percent of its portfolio in foreign assets. Seven different optimization models are evaluated in this study to determine the optimal asset mix. The optimization models were selected through an extensive literature study in order to address key optimization issues, e.g. which risk measure to use, whether parametric or non-parametric optimization should be used, and whether the Mean-Variance model for optimization defined by Markowitz, which has been the benchmark with regard to asset allocation, is the best model to determine long term asset allocation strategies. The results obtained from the different models were used to recommend the optimal long term asset allocation for a South African pension fund and were also compared to determine which optimization model proved to be the most efficient. The study found that when using only the past ten years of data to construct the portfolios, it would have been optimal to invest in only South African asset classes, with statistically significant return differences in some cases. Using the past 20 years of data to construct the optimal portfolios provided mixed results, while the 30-year period was more in favour of an international portfolio with the full 25% invested in foreign asset classes. A comparison of the different models provided a clear winner with regard to the probability of outperformance: the Historical Resampled Mean Variance optimization provided the highest probability of outperforming the benchmark. From the study it also became evident that a 20-year data period is the optimal period when considering the historical data that should be used to construct the optimal portfolio. / AFRIKAANSE OPSOMMING: Op 1 Julie 2011 het die hersiene Regulasie 28, wat die investering van Suid-Afrikaanse pensioenfondse reguleer, in werking getree. Hierdie hersiene weergawe stel pensioenfondse in staat om 25% van hulle fondse in buitelandse bateklasse te belê in plaas van 20%, soos in die vorige weergawe. Hierdie studie stel vas of dit werklik voordelig sal wees vir 'n SA pensioenfonds om die volle 25% in buitelandse bateklasse te belê. Sewe verskillende optimeringsmodelle is gebruik om die optimale portefeulje te probeer skep. Die optimeringsmodelle is gekies na 'n uitgebreide literatuurstudie sodat van die sleutelkwessies met betrekking tot optimering aangespreek kon word. Die kwessies waarna verwys word sluit in watter risikomaat in die optimeringsproses gebruik behoort te word, of 'n parametriese of nie-parametriese model gebruik moet word, en of die "Mean-Variance" model wat deur Markowitz in 1952 gedefinieer is en al vir baie jare as maatstaf vir portefeulje-optimering dien, nog steeds die beste model is om te gebruik. Die uiteindelike resultate, verkry van die verskillende optimeringsmodelle, is gevolglik gebruik om die optimale langtermyn bate-allokasie vir 'n Suid-Afrikaanse pensioenfonds op te stel. Die verskillende optimeringsmodelle is ook met mekaar vergelyk om te bepaal of daar 'n model is wat beter is as die res.
Vanuit die resultate was dit duidelik dat 'n portefeulje wat slegs uit Suid-Afrikaanse bates bestaan beter sal presteer as slegs die laaste 10 jaar se data gebruik word om die portefeulje op te stel. Hierdie resultate is ook in die meeste van die gevalle bevestig deur middel van hipotesetoetse. Die gebruik van die afgelope 20 jaar se data om die portefeuljes op te stel, het gemengde resultate gelewer, terwyl die afgelope 30 jaar se data in die meeste van die gevalle 'n internasionaal gediversifiseerde portefeulje as die beter portefeulje uitgewys het. In 'n vergelyking van die verskillende optimeringsmodelle is die "Historical Resampled Mean Variance" model duidelik as die beter model uitgewys. Hierdie model het die hoogste waarskynlikheid behaal om die vasgestelde maatstafportefeuljes uit te presteer. Die resultate het ook gedui op die 20-jaar periode as die beste dataperiode om te gebruik as die optimale portefeulje opgestel word.
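One of the questions posed in this record can be sketched as a constrained mean-variance problem: maximise expected return less a risk penalty, subject to the weights summing to one and the combined foreign weight staying at or below the revised Regulation 28 cap of 25%. The four asset classes, expected returns, covariance matrix and risk-aversion level are illustrative assumptions, not the inputs used in the study.

```python
import numpy as np
from scipy.optimize import minimize

# assumed asset classes: SA equity, SA bonds, foreign equity, foreign bonds
mu = np.array([0.11, 0.08, 0.12, 0.06])                   # assumed expected returns
Sigma = np.array([[0.040, 0.006, 0.018, 0.004],
                  [0.006, 0.010, 0.004, 0.003],
                  [0.018, 0.004, 0.050, 0.008],
                  [0.004, 0.003, 0.008, 0.012]])           # assumed covariance matrix
foreign = np.array([0.0, 0.0, 1.0, 1.0])                   # indicator of foreign classes
risk_aversion = 3.0

def neg_utility(w):
    # negative mean-variance utility, minimised by scipy
    return -(w @ mu - 0.5 * risk_aversion * w @ Sigma @ w)

constraints = [
    {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},      # fully invested
    {"type": "ineq", "fun": lambda w: 0.25 - w @ foreign}, # foreign weight at most 25%
]
res = minimize(neg_utility, x0=np.full(4, 0.25),
               bounds=[(0.0, 1.0)] * 4, constraints=constraints)
print("weights:", np.round(res.x, 3), " foreign share:", round(float(res.x @ foreign), 3))
```

A resampled variant of the kind the study favours would re-estimate mu and Sigma on many bootstrapped histories, solve this problem for each, and average the resulting weights.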
29

South African security market imperfections

Jooste, Dirk 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2006. / In recent times many theories have surfaced posing challenging threats to the Efficient Market Hypothesis. We are entering an exciting era of financial economics fueled by the urge to have a better understanding of the intricate workings of financial markets. Many studies are emerging that investigate the relationship between stock market predictability and efficiency. This paper studies the existence of calendar-based patterns in equity returns, price momentum and earnings momentum in the South African securities market. These phenomena are commonly referred to in the literature as security market imperfections, financial market puzzles and market anomalies. We provide evidence that suggests that they do exist in the South African context, which is consistent with findings in various international markets. A vast number of papers on the subject exist in the international arena. However, very few empirical studies on the South African market can be found in the public domain. We aim to contribute to the literature by investigating the South African case.
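A small example of how a calendar-based pattern of the kind mentioned in this record might be tested: regress daily returns on day-of-week dummies and inspect the coefficients. The simulated returns (with an injected Monday effect) are assumptions; the thesis applies such tests to South African equity data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
dates = pd.bdate_range("2000-01-03", periods=2500)
ret = rng.normal(0.0005, 0.01, size=len(dates))
df = pd.DataFrame({"ret": ret, "dow": dates.day_name()}, index=dates)
df.loc[df.index.dayofweek == 0, "ret"] -= 0.0008   # inject a toy "Monday effect"

# dummy-variable regression of returns on day of the week, Wednesday as the base case
fit = smf.ols("ret ~ C(dow, Treatment(reference='Wednesday'))", data=df).fit()
print(fit.params.round(5))
```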
30

Aspects of some exotic options

Theron, Nadia 12 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2007. / The use of options on various stock markets over the world has introduced a unique opportunity for investors to hedge, speculate, create synthetic financial instruments and reduce funding and other costs in their trading strategies. The power of options lies in their versatility. They enable an investor to adapt or adjust her position according to any situation that arises. Another benefit of using options is that they provide leverage. Since options cost less than stock, they provide a high-leverage approach to trading that can significantly limit the overall risk of a trade, or provide additional income. This versatility and leverage, however, come at a price. Options are complex securities and can be extremely risky. In this document several aspects of trading and valuing some exotic options are investigated. The aim is to give insight into their uses and the risks involved in their trading. Two volatility-dependent derivatives, namely compound and chooser options; two path-dependent derivatives, namely barrier and Asian options; and lastly binary options, are discussed in detail. The purpose of this study is to provide a reference that contains both the mathematical derivations and the detail involved in valuing these exotic options, as well as an overview of their applicability and use, for students and other interested parties.
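For two of the exotic payoffs listed in this record, a brief Monte Carlo sketch is given below: an arithmetic-average Asian call and a cash-or-nothing binary call priced on the same simulated geometric Brownian motion paths. All parameters are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.20, 1.0    # assumed market/contract parameters
steps, n_paths = 252, 20_000
dt = T / steps

Z = rng.standard_normal((n_paths, steps))
log_paths = np.log(S0) + np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z, axis=1)
S = np.exp(log_paths)                                 # GBM paths under the risk-neutral measure

disc = np.exp(-r * T)
asian_call = disc * np.maximum(S.mean(axis=1) - K, 0.0)   # arithmetic-average Asian call
binary_call = disc * (S[:, -1] > K).astype(float)         # cash-or-nothing call paying 1
se = asian_call.std(ddof=1) / np.sqrt(n_paths)
print(f"Asian call  ~ {asian_call.mean():.4f} (std. err. {se:.4f})")
print(f"Binary call ~ {binary_call.mean():.4f}")
```

Barrier options fit the same framework by checking each path against the barrier level before applying the payoff, and compound or chooser options require valuing the inner option at the intermediate decision date.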
