61

Forecasting the monthly electricity consumption of municipalities in KwaZulu-Natal.

Walton, Alison Norma. January 1997 (has links)
Eskom is the major electricity supplier in South Africa, and medium-term forecasting within the company is a critical activity to ensure that enough electricity is generated to support the country's growth, that the networks can supply the electricity, and that the revenue derived from electricity consumption is managed efficiently. This study investigates the most suitable forecasting technique for predicting monthly electricity consumption one year ahead for four major municipalities within KwaZulu-Natal. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 1997.
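The one-year-ahead monthly forecasting task described above can be illustrated with a minimal benchmark model of the kind such a study would compare candidates against: a seasonal-naive forecast scaled by average year-over-year growth. The consumption figures below are invented for illustration and are not the study's data.

```python
# Illustrative only: seasonal-naive forecast with a year-over-year
# growth adjustment. Each forecast repeats the same month one year
# earlier, scaled by the mean YoY growth observed in the last year.

def seasonal_naive_forecast(monthly, horizon=12):
    """Forecast `horizon` months ahead from a monthly series."""
    if len(monthly) < 24:
        raise ValueError("need at least two full years of history")
    last_year = monthly[-12:]
    prev_year = monthly[-24:-12]
    growth = sum(l / p for l, p in zip(last_year, prev_year)) / 12
    return [last_year[m % 12] * growth for m in range(horizon)]

# Hypothetical consumption (GWh) for 24 months, mild growth + seasonality:
history = [100, 95, 90, 85, 88, 92, 96, 94, 91, 93, 97, 102,
           104, 99, 94, 89, 92, 96, 100, 98, 95, 97, 101, 106]
forecast = seasonal_naive_forecast(history)
print([round(f, 1) for f in forecast])
```

A benchmark like this gives a floor that any more sophisticated technique (regression, exponential smoothing, ARIMA) must beat to justify its complexity.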
62

The statistical analyses of a complex survey of banana pests and diseases in Uganda.

Ngoya, Japheth N. January 1999 (has links)
No abstract available. / Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 1999.
63

Statistical analysis of the incidence and mortality of African horse sickness in South Africa.

Burne, Rebecca. January 2011 (has links)
No abstract available. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2011.
64

Multilevel modelling of HIV in Swaziland using frequentist and Bayesian approaches.

Vilakati, Sifiso E. January 2012 (has links)
Multilevel models account for different levels of aggregation that may be present in the data. Researchers are sometimes faced with the task of analysing data collected at different levels, such that attributes of individual cases are provided as well as attributes of groupings of these individual cases. Data with a multilevel structure are common in the social sciences and in other fields such as epidemiology. Ignoring hierarchies in data (where they exist) can have damaging consequences for subsequent statistical inference. This study applied multilevel models from frequentist and Bayesian perspectives to the Swaziland Demographic and Health Survey (SDHS) data. The first model fitted to the data was a Bayesian generalised linear mixed model (GLMM), estimated with two techniques: integrated nested Laplace approximation (INLA) and Markov chain Monte Carlo (MCMC) methods. The study aimed to identify determinants of HIV in Swaziland as well as to compare the different statistical models. The outcome variable of interest, HIV status, is binary, and the logit link was used in all the models fitted. The results of the analysis showed that the INLA estimation approach is superior to the MCMC approach in Bayesian GLMMs in terms of computational speed: INLA produced results within seconds, compared with the many minutes taken by the MCMC methods. There were minimal differences between the Bayesian multilevel model and the frequentist multilevel model. A notable difference between the Bayesian GLMMs and the multilevel models is that of differing estimates for cluster effects: in the Bayesian GLMM, the estimates of the cluster effects are larger than those from the multilevel models. The inclusion of cluster-level variables in the multilevel models reduced the unexplained group-level variation.
In an attempt to identify key drivers of HIV in Swaziland, this study found that age, age at first sex, marital status and the number of sexual partners in the last 12 months are associated with HIV serostatus. Weak between-cluster variation was found for both men and women. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2012.
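The multilevel structure the abstract describes can be sketched with a toy simulation (not the SDHS data): a random-intercept logistic model, which is the structure the study's multilevel models assume, with the between-cluster variance crudely recovered from the empirical cluster-level log-odds. All parameter values here are made up.

```python
# Toy sketch: simulate clustered binary outcomes from a random-intercept
# logistic model and estimate between-cluster variation from empirical
# cluster log-odds. Note: the sample variance of the cluster log-odds
# slightly overstates the true random-effect variance, because it also
# contains binomial sampling noise.
import math
import random

random.seed(1)

N_CLUSTERS, N_PER = 200, 100
SIGMA_U = 0.8          # true SD of cluster random intercepts (assumed)
BETA0 = -1.0           # fixed intercept, i.e. overall log-odds (assumed)

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

cluster_logodds = []
for _ in range(N_CLUSTERS):
    u = random.gauss(0.0, SIGMA_U)          # cluster random effect
    p = logistic(BETA0 + u)
    positives = sum(random.random() < p for _ in range(N_PER))
    # empirical log-odds with a small continuity correction
    phat = (positives + 0.5) / (N_PER + 1.0)
    cluster_logodds.append(math.log(phat / (1.0 - phat)))

mean_lo = sum(cluster_logodds) / N_CLUSTERS
var_lo = sum((x - mean_lo) ** 2 for x in cluster_logodds) / (N_CLUSTERS - 1)
print(round(mean_lo, 2), round(var_lo, 2))
```

A proper analysis would fit the GLMM by INLA or MCMC as in the study; this sketch only shows why ignoring the cluster level discards real variation in the data.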
65

Optimal asset allocation for South African pension funds under the revised Regulation 28

Koegelenberg, Frederik Johannes 03 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: On 1 July 2011 the revised version of Regulation 28, which governs the investments of the South African pension fund industry, took effect. The new version allows a pension fund to invest up to 25 percent of its total investment in foreign assets, compared with 20 percent under the previous version. The aim of this study is to determine whether it would be optimal for a South African pension fund to invest the full 25 percent of its portfolio in foreign assets. Seven different optimization models are evaluated to determine the optimal asset mix. The models were selected through an extensive literature study in order to address key optimization issues, e.g. which risk measure to use, whether parametric or non-parametric optimization should be used, and whether the Mean-Variance model defined by Markowitz, long the benchmark for asset allocation, is the best model for determining long-term asset allocation strategies. The results obtained from the different models were used to recommend the optimal long-term asset allocation for a South African pension fund, and were also compared to determine which optimization model proved the most efficient. The study found that when using only the past ten years of data to construct the portfolios, it would have been optimal to invest in South African asset classes only, with statistically significant differences in returns in some cases. Using the past 20 years of data to construct the optimal portfolios provided mixed results, while the 30-year period favoured an international portfolio with the full 25 percent invested in foreign asset classes. A comparison of the different models provided a clear winner with regard to the probability of outperformance: the Historical Resampled Mean-Variance optimization gave the highest probability of outperforming the benchmark.
From the study it also became evident that a 20-year data period is the optimal historical window to use when constructing the optimal portfolio. / AFRIKAANSE OPSOMMING (English translation): On 1 July 2011 the revised Regulation 28, which regulates the investments of South African pension funds, came into effect. This revised version allows pension funds to invest 25% of their funds in foreign asset classes instead of 20%, as under the previous version. This study establishes whether it would really be advantageous for a South African pension fund to invest the full 25% in foreign asset classes. Seven different optimization models were used in attempting to construct the optimal portfolio. The models were chosen after an extensive literature study so that key optimization issues could be addressed: which risk measure should be used in the optimization process, whether a parametric or non-parametric model should be used, and whether the Mean-Variance model, defined by Markowitz in 1952 and for many years the benchmark for portfolio optimization, is still the best model to use. The results from the various optimization models were then used to compile the optimal long-term asset allocation for a South African pension fund. The models were also compared with one another to determine whether any one is better than the rest. The results showed clearly that a portfolio consisting only of South African assets performs better when only the last 10 years of data are used to construct the portfolio; in most cases these results were confirmed by hypothesis tests. Using the past 20 years of data to construct the portfolios delivered mixed results, while the past 30 years of data in most cases identified an internationally diversified portfolio as the better one. In a comparison of the different optimization models, the Historical Resampled Mean-Variance model was clearly shown to be the better model, achieving the highest probability of outperforming the set benchmark portfolios. The results also pointed to the 20-year period as the best data period to use when constructing the optimal portfolio.
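The constrained allocation problem the abstract studies can be sketched in a stylised two-asset form: choose the foreign-asset weight in [0, 0.25] (the revised Regulation 28 cap) maximising a mean-variance objective. The return, volatility and correlation numbers below are invented for illustration and are not taken from the study.

```python
# Stylised two-asset mean-variance optimization under the Regulation 28
# foreign-asset cap, solved by grid search over the feasible weights.

MU_DOM, MU_FOR = 0.11, 0.09      # hypothetical expected annual returns
SD_DOM, SD_FOR = 0.18, 0.15      # hypothetical annual volatilities
RHO = 0.45                       # hypothetical correlation
RISK_AVERSION = 3.0
CAP = 0.25                       # Regulation 28 foreign limit

def utility(w_for):
    """Mean-variance utility mu - (lambda/2) * variance for a weight
    w_for in foreign assets and 1 - w_for in domestic assets."""
    w_dom = 1.0 - w_for
    mu = w_dom * MU_DOM + w_for * MU_FOR
    var = ((w_dom * SD_DOM) ** 2 + (w_for * SD_FOR) ** 2
           + 2 * w_dom * w_for * RHO * SD_DOM * SD_FOR)
    return mu - 0.5 * RISK_AVERSION * var

# Grid search over the feasible foreign weights [0, CAP].
grid = [i / 1000 for i in range(int(CAP * 1000) + 1)]
best = max(grid, key=utility)
print(round(best, 3))
```

With these made-up inputs the unconstrained optimum lies above 25%, so the cap binds and the grid search returns the full 0.25; the study's question is precisely whether real data supports such a corner solution.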
66

South African security market imperfections

Jooste, Dirk 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2006. / In recent times many theories have surfaced that pose serious challenges to the Efficient Market Hypothesis. We are entering an exciting era of financial economics, fuelled by the urge to understand better the intricate workings of financial markets. Many emerging studies investigate the relationship between stock market predictability and efficiency. This paper studies the existence of calendar-based patterns in equity returns, price momentum and earnings momentum in the South African securities market. These phenomena are commonly referred to in the literature as security market imperfections, financial market puzzles and market anomalies. We provide evidence suggesting that they do exist in the South African context, consistent with findings in various international markets. A vast number of papers on the subject exist in the international arena, yet very few empirical studies of the South African market can be found in the public domain. We aim to contribute to the literature by investigating the South African case.
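One of the anomalies the abstract mentions, price momentum, is typically tested by sorting shares on trailing return and comparing the subsequent performance of winners and losers. A minimal sketch of the sorting step, with hypothetical tickers and prices:

```python
# Minimal price-momentum sort: rank shares on trailing 12-month return
# and split into winner/loser halves. Tickers and prices are made up.

prices_then = {"AAA": 100.0, "BBB": 50.0, "CCC": 80.0, "DDD": 20.0}
prices_now = {"AAA": 130.0, "BBB": 48.0, "CCC": 96.0, "DDD": 21.0}

# Trailing simple returns over the formation period.
returns = {t: prices_now[t] / prices_then[t] - 1.0 for t in prices_then}
ranked = sorted(returns, key=returns.get, reverse=True)

half = len(ranked) // 2
winners, losers = ranked[:half], ranked[half:]
print(winners, losers)
```

A momentum study would then track the winner-minus-loser spread over a holding period; under market efficiency that spread should not be systematically positive.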
67

Aspects of some exotic options

Theron, Nadia 12 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2007. / The use of options on various stock markets over the world has introduced a unique opportunity for investors to hedge, speculate, create synthetic financial instruments and reduce funding and other costs in their trading strategies. The power of options lies in their versatility: they enable an investor to adapt or adjust her position according to any situation that arises. Another benefit of using options is that they provide leverage. Since options cost less than stock, they provide a high-leverage approach to trading that can significantly limit the overall risk of a trade, or provide additional income. This versatility and leverage, however, come at a price: options are complex securities and can be extremely risky. In this document several aspects of trading and valuing some exotic options are investigated. The aim is to give insight into their uses and the risks involved in their trading. Two volatility-dependent derivatives, namely compound and chooser options; two path-dependent derivatives, namely barrier and Asian options; and lastly binary options are discussed in detail. The purpose of this study is to provide a reference that contains both the mathematical derivations and the detail of valuing these exotic options, as well as an overview of their applicability and use, for students and other interested parties.
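Of the exotics listed, binary options are the simplest to value: under standard Black-Scholes assumptions a cash-or-nothing call has the closed form price = cash * exp(-rT) * N(d2). A sketch of that formula, using the standard-normal CDF built from `math.erf`; the input values are arbitrary examples.

```python
# Cash-or-nothing binary call under Black-Scholes assumptions:
#   price = cash * exp(-r*T) * N(d2),
#   d2 = (ln(S/K) + (r - sigma^2/2) * T) / (sigma * sqrt(T)).
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cash_or_nothing_call(S, K, r, sigma, T, cash=1.0):
    d2 = (math.log(S / K) + (r - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return cash * math.exp(-r * T) * norm_cdf(d2)

price = cash_or_nothing_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(round(price, 4))
```

The price is just the discounted risk-neutral probability of finishing in the money, which is why binaries are a natural building block for the barrier and other exotic payoffs the thesis derives.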
68

The implementation of noise addition partial least squares

Moller, Jurgen Johann 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2009. / When determining the chemical composition of a specimen, traditional laboratory techniques are often both expensive and time consuming. It is therefore preferable to employ more cost-effective spectroscopic techniques such as near infrared (NIR). Traditionally, the calibration problem has been solved by means of multiple linear regression to specify the model between X and Y. Traditional regression techniques, however, quickly fail when applied to spectroscopic data, as the number of wavelengths can easily be several hundred, often exceeding the number of chemical samples. This scenario, together with the high level of collinearity between wavelengths, necessarily leads to singularity problems when calculating the regression coefficients. Ways of dealing with the collinearity problem include principal component regression (PCR), ridge regression (RR) and PLS regression. Both PCR and RR require a significant amount of computation when the number of variables is large. PLS overcomes the collinearity problem in a similar way to PCR, by modelling both the chemical and spectral data as functions of common latent variables. The quality of the employed reference method greatly impacts the coefficients of the regression model and, therefore, the quality of its predictions. With both X and Y subject to random error, the quality of the predictions of Y will be reduced as the level of noise increases. Previously conducted research focussed mainly on the effects of noise in X. This paper focuses on a method proposed by Dardenne and Fernández Pierna, called Noise Addition Partial Least Squares (NAPLS), that attempts to deal with the problem of poor reference values. Some aspects of the theory behind PCR, PLS and model selection are discussed. This is then followed by a discussion of the NAPLS algorithm.
Both PLS and NAPLS are implemented on various datasets that arise in practice, in order to determine cases where NAPLS will be beneficial over conventional PLS. For each dataset, specific attention is given to the analysis of outliers, influential values and the linearity between X and Y, using graphical techniques. Lastly, the performance of the NAPLS algorithm is evaluated for various
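The calibration step underlying both PLS and NAPLS can be sketched in pure Python for a single latent variable. This is a simplified stand-in: the `noise_sd` option merely perturbs the reference values to mimic the noise-addition idea, and the actual NAPLS algorithm of Dardenne and Fernández Pierna differs in detail. The "spectra" below are tiny synthetic examples.

```python
# One-component PLS1 calibration sketch. The weight vector is
# proportional to X'y (after centring), scores are t = Xw, and y is
# regressed on the scores. Optional Gaussian noise on y is a simplified
# stand-in for the noise-addition idea, not the full NAPLS algorithm.
import math
import random

def pls1_one_component(X, y, noise_sd=0.0, seed=0):
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    y = [v + rng.gauss(0.0, noise_sd) for v in y]   # perturb references
    # centre X and y
    xbar = [sum(row[j] for row in X) / n for j in range(p)]
    ybar = sum(y) / n
    Xc = [[row[j] - xbar[j] for j in range(p)] for row in X]
    yc = [v - ybar for v in y]
    # weight vector w proportional to Xc' y, normalised
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
    # scores, then regression of y on the scores
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * vi for ti, vi in zip(t, yc)) / sum(ti * ti for ti in t)
    return w, b, xbar, ybar

def predict(x, w, b, xbar, ybar):
    t = sum((xj - mj) * wj for xj, mj, wj in zip(x, xbar, w))
    return ybar + b * t

# Tiny synthetic "spectra": y depends mainly on the first two wavelengths.
X = [[1.0, 2.0, 0.5], [2.0, 3.0, 0.4], [3.0, 5.0, 0.6], [4.0, 6.0, 0.5]]
y = [1.0, 2.1, 3.9, 5.0]
w, b, xbar, ybar = pls1_one_component(X, y)
print(round(predict([2.5, 4.0, 0.5], w, b, xbar, ybar), 2))
```

Re-running the calibration with `noise_sd > 0` and comparing prediction error against the noiseless fit reproduces, in miniature, the comparison the thesis performs between conventional PLS and NAPLS.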
69

Modelling market risk with SAS Risk Dimensions : a step by step implementation

Du Toit, Carl 03 1900 (has links)
Thesis (MComm (Statistics and Actuarial Science))--University of Stellenbosch, 2005. / Financial institutions invest in financial securities like equities, options and government bonds. Two measures, namely return and risk, are associated with each investment position. Return is a measure of the profit or loss of the investment, whilst risk is defined as the uncertainty about return. A financial institution that holds a portfolio of securities is exposed to different types of risk. The most well-known types are market, credit, liquidity, operational and legal risk. An institution needs to quantify, for each type of risk, the extent of its exposure. Currently, standard risk measures that aim to quantify risk exist only for market and credit risk. Extensive calculations are usually required to obtain values for risk measures. The investment positions that form the portfolio, as well as the market information used in the risk measure calculations, change during each trading day. Hence, the financial institution needs a business tool that can calculate various standard risk measures for dynamic market and position data at the end of each trading day. SAS Risk Dimensions is a software package that provides a solution to this calculation problem. A risk management system is created with this package and is used to calculate all the relevant risk measures on a daily basis. The purpose of this document is to explain and illustrate all the steps that should be followed to create a suitable risk management system with SAS Risk Dimensions.
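SAS Risk Dimensions itself is proprietary, but one standard market-risk measure such a system computes, historical-simulation Value-at-Risk, is easy to illustrate. The daily P&L history below is made up; this is a minimal sketch, not the package's method.

```python
# One-day historical-simulation VaR: the k-th worst loss, where k is
# roughly the (1 - level) fraction of the historical days.

def historical_var(pnl, level=0.95):
    """Loss threshold exceeded on only about (1 - level) of past days."""
    losses = sorted((-x for x in pnl), reverse=True)   # worst loss first
    k = max(1, round((1.0 - level) * len(losses)))     # empirical tail count
    return losses[k - 1]

# Hypothetical daily P&L (millions) over 20 trading days:
pnl = [0.4, -1.2, 0.9, -0.3, 2.1, -2.8, 0.1, 1.4, -0.6, 0.7,
       -1.9, 0.2, 3.0, -0.8, 1.1, -3.5, 0.5, -0.1, 1.8, -2.2]
print(historical_var(pnl))
```

In a production system the P&L history is replaced by full revaluation of the current positions under each historical market scenario, which is where the "extensive calculations" the abstract mentions come in.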
70

Non-parametric volatility measurements and volatility forecasting models

Du Toit, Cornel 03 1900 (has links)
Assignment (MComm)--Stellenbosch University, 2005. / ENGLISH ABSTRACT: Volatility was originally seen as constant and deterministic, but it was later realised that return series are non-stationary. Owing to this non-stationary nature of returns, there were no reliable ex-post volatility measurements. Subsequently, researchers focussed on ex-ante volatility models. Only then was it realised that before good volatility models can be created, reliable ex-post volatility measurements need to be defined. In this study we examine non-parametric ex-post volatility measurements in order to obtain approximations of the variances of non-stationary return series. A detailed mathematical derivation and discussion of the already developed volatility measurements, in particular the realised volatility and DST measurements, is given. In theory, the higher the sampling frequency of returns, the more accurate the measurements. The volatility measurements referred to above, however, all have shortcomings: realised volatility fails if the sampling frequency becomes too high, owing to microstructure effects, while the DST measurement cannot handle changing instantaneous volatility. In this study we introduce a new volatility measurement, termed microstructure realised volatility, that overcomes these shortcomings. This measurement, like realised volatility, is based on quadratic variation theory, but the underlying return model is more realistic. / AFRIKAANSE OPSOMMING (English translation): Volatility was originally regarded as constant and deterministic; it was only later realised that returns are non-stationary. Reliable volatility measurements were not available owing to the non-stationary nature of returns. Researchers therefore focused on forecasting volatility models. Only at this stage did researchers realise that defining reliable volatility measurements is a prerequisite for creating good forecasting models.
Non-parametric volatility measurements are examined in this study in order to approximate the variances of the non-stationary return series. A detailed mathematical derivation and discussion of existing volatility measurements, specifically realised volatility and DST measurements, is given. In theory, returns observed more frequently lead to better accuracy. The above volatility measurements have shortcomings, however, since realised volatility fails when the sampling frequency becomes too high, owing to microstructure effects. On the other hand, the DST measurement cannot handle changing instantaneous volatility. In this study we introduce a new volatility measurement, namely microstructure realised volatility, which does not have these shortcomings. As with realised volatility, this measurement is based on quadratic variation theory, but the underlying return model is more realistic.
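The realised-volatility measurement this literature builds on is simply the square root of the sum of squared intraday log returns. A minimal sketch on a simulated price path (the volatility parameter and the path are invented, purely to show the computation):

```python
# Realised volatility from high-frequency returns: sqrt of the sum of
# squared intraday log returns. Under the simple model below (i.i.d.
# Gaussian returns, no microstructure noise) it approximates the true
# daily volatility sigma * sqrt(390); microstructure effects at very
# high frequencies are exactly what breaks this in practice.
import math
import random

random.seed(7)

# One trading day of 390 one-minute log returns, per-minute vol sigma.
sigma = 0.0005
returns = [random.gauss(0.0, sigma) for _ in range(390)]

realised_vol = math.sqrt(sum(r * r for r in returns))
true_daily_vol = sigma * math.sqrt(390)
print(round(realised_vol, 5), round(true_daily_vol, 5))
```

In theory, sampling ever more finely sharpens the estimate, which is the frequency-accuracy trade-off the abstract describes before microstructure effects intervene.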
