51 |
Statistical-grey consistent grey differential equation modelling. Cui, Yan Hong, January 2007
Includes abstract.
Includes bibliographical references (p. 152-156).
|
52 |
Interval-valued Uncertainty Capability Indices with South African Industrial Applications. Gyekye, Kwame Boakye, January 2014
Includes bibliographical references. / Since the advent of statistical quality control and process capability analysis, their study and application have gained tremendous attention in both academia and industry. This attention is due to their ability to describe the capability of a complex process adequately and simply (i.e. with a unitless index) and, in some instances, to compare different manufacturing processes. However, the application of statistical quality control has come under intense criticism, notably in one car manufacturing company where the actual number of non-conforming units considerably exceeded expectation even though probabilistic control measures were in place. This failure led to a large recall of vehicles and dented the image of the company. One reason for this unfortunate instance is that classical quality control measures ignore human judgement; since process engineering relies on considerable expert intuition in decision making, this element cannot be overlooked. Hence the research study applies the uncertainty theory proposed by Baoding Liu (2007) to incorporate human judgement into process capability analysis. The major findings of the thesis are that process capability indices defined in an uncertainty environment are interval-valued, together with their relevant characteristics. The study further developed "sampling" uncertainty distributions and thus the impact of sampling on the newly defined uncertain process capability indices under Liu's uncertain normal distribution assumptions. In order to reach the main purpose of the thesis, a thorough literature review of probabilistic process capability indices is also provided.
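As background to the interval-valued construction, the classical (crisp) capability indices the thesis generalises can be sketched in a few lines. The sample data, specification limits and sigma interval below are hypothetical, and the interval version shown is only an illustration of how an uncertain sigma makes the index interval-valued, not Liu's formal definition:

```python
import statistics

def capability_indices(samples, lsl, usl):
    """Classical (crisp) process capability indices Cp and Cpk."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)               # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                  # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # penalises an off-centre mean
    return cp, cpk

def interval_cp(lsl, usl, sigma_low, sigma_high):
    """If sigma is only known to lie in [sigma_low, sigma_high],
    Cp becomes interval-valued (illustrative, not Liu's definition)."""
    return (usl - lsl) / (6 * sigma_high), (usl - lsl) / (6 * sigma_low)
```

For a centred process, Cp and Cpk coincide; an interval for sigma then carries straight through to an interval for the index, which is the flavour of result the thesis develops rigorously.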
|
53 |
Comparison of ridge and other shrinkage estimation techniques. Vumbukani, Bokang C, January 2006
Includes bibliographical references. / Shrinkage estimation is an increasingly popular class of biased parameter estimation techniques, vital when the columns of the matrix of independent variables X exhibit dependencies or near dependencies. These dependencies often lead to serious problems in least squares estimation: inflated variances and mean squared errors of estimates, unstable coefficients, imprecision and improper estimation. Shrinkage methods accept a little bias and in return give the biased estimators smaller mean squared errors and variances than those of the unbiased estimators. However, shrinkage methods depend on a shrinkage factor whose estimation involves unknown parameter values, usually computed from the OLS solution. We argue that the instability of OLS estimates may have an adverse effect on the performance of shrinkage estimators. Hence a new method for estimating the shrinkage factors is proposed and applied to ridge and generalized ridge regression. We propose that the new shrinkage factors be based on the principal components instead of the unstable OLS estimates.
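The ridge estimator that any such shrinkage factor plugs into is the familiar (X'X + kI)^(-1) X'y. A minimal sketch for two near-collinear predictors (hypothetical data, explicit 2x2 inverse) shows how a positive k shrinks the coefficient vector relative to OLS:

```python
def ridge_two_predictors(x1, x2, y, k):
    """Ridge estimate (X'X + kI)^(-1) X'y for two predictors,
    via an explicit 2x2 inverse; k = 0 recovers ordinary least squares."""
    a = sum(v * v for v in x1)                      # X'X entries
    b = sum(u * v for u, v in zip(x1, x2))
    c = sum(v * v for v in x2)
    g1 = sum(u * v for u, v in zip(x1, y))          # X'y entries
    g2 = sum(u * v for u, v in zip(x2, y))
    det = (a + k) * (c + k) - b * b
    beta1 = ((c + k) * g1 - b * g2) / det
    beta2 = ((a + k) * g2 - b * g1) / det
    return beta1, beta2
```

With x1 and x2 nearly collinear, the OLS system (k = 0) is close to singular and its solution is unstable; even a small k > 0 pulls the coefficient vector towards zero and stabilises it, at the cost of a little bias.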
|
54 |
Geographically weighted regression and an extension. Miller, Karen M, January 2008
Includes abstract.
Includes bibliographical references (leaves [72]-75).
|
55 |
Modelling relationships between clinical markers of the Human Immunodeficiency Virus disease in a South African population. Gumedze, Freedom N, January 1999
Bibliography: pages 91-99. / This study investigated relationships between the CD4 count and other clinical markers of HIV disease, the total lymphocyte count and viral load, in a South African population. The CD4 count has been an important clinical marker of disease progression in HIV-infected individuals and has been the focus of many studies in developed countries. Most of the studies reported in the literature used data from well-defined cohorts of HIV patients; similar studies in Africa do not appear to have been done. This study used clinical records of HIV-infected individuals attending the Somerset Hospital HIV Clinic over the period 1984-97 to study the relationship between the CD4 count and the total lymphocyte count. From a practical perspective this relationship is important in South Africa for two reasons. Firstly, the majority of the HIV-infected population is poor and cannot afford the higher cost of measuring the CD4 count rather than the total lymphocyte count. Secondly, in many small clinics and hospitals in South Africa the equipment for measuring the CD4 count is generally not available, whereas the equipment for measuring the total lymphocyte count is widely available.
|
56 |
Forecasting stock price movements using neural networks. Rank, Christian, January 2006
Includes bibliographical references (p. 99-101). / The prediction of security prices has proved to be one of the most important but most difficult tasks in financial operations. Linear approaches fail to model the non-linear behaviour of markets, and earlier non-linear approaches turned out to impose too many constraints. Neural networks seem to be a suitable method to overcome these problems, since they provide algorithms that process large sets of data from a non-linear context and yield robust results. The first problem addressed by this research paper is the applicability of neural networks to markets as a tool for pattern recognition. It will be shown that markets possess the necessary requirements for the use of neural networks, i.e. markets show patterns which are exploitable.
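As a toy illustration of the kind of pattern recognition described above (not the network architecture or data used in the thesis), a single logistic neuron can learn a momentum rule from lagged returns; both the returns and the labelling rule here are hypothetical:

```python
import math

def train_neuron(samples, labels, lr=0.1, epochs=1000):
    """A single logistic neuron trained by stochastic gradient descent
    on log-loss: the smallest possible 'neural network'."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1 / (1 + math.exp(-z))              # sigmoid activation
            g = p - t                               # d(log-loss)/dz
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict_up(w, b, x):
    """Predict an up move when the neuron's pre-activation is positive."""
    return b + sum(wi * xi for wi, xi in zip(w, x)) > 0

# Hypothetical lagged returns (in percent), labelled by a momentum rule:
# an up day (label 1) follows positive lags, a down day follows negative ones.
samples = [[1.0, 2.0], [-1.0, -2.0], [3.0, 1.0],
           [-2.0, -1.0], [0.5, 1.5], [-0.5, -1.5]]
labels = [1, 0, 1, 0, 1, 0]
```

A real forecasting network would use many more inputs, hidden layers and out-of-sample validation; the point of the sketch is only that a network recovers an exploitable pattern when one exists in the data.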
|
57 |
Quality control charts under random fuzzy measurements. Thoutou, Sayi Mbani, January 2007
Includes bibliographical references. / We consider statistical process control charts as tools for monitoring changes and for identifying process variations and their causes in industrial (manufacturing) processes; they help manufacturers to take appropriate action, rectify problems and improve manufacturing processes so as to produce good quality products. As an essential tool, process control charts have always attracted research attention, and the sample sizes required to establish them are often under discussion, depending on the field of study. Of late, the fuzziness and randomness brought into modern manufacturing processes by shortening product life cycles and diversification (in product design, raw material supply, etc.) have compelled researchers to revisit quality control methodologies in the search for high customer satisfaction and better market share (Guo et al., 2006). We focus on small sample sizes and on the economic design of quality control charts, based on credibility measure theory under random fuzzy measurements and small-sample asymptotic distribution theory. Economic process data from the study of Duncan (1956) are used as an illustrative example of these new developments.
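For orientation, the classical (crisp, non-fuzzy) Shewhart X-bar chart that the economic and random-fuzzy designs build on can be sketched as follows; the subgroup data are hypothetical:

```python
import statistics

def xbar_chart_limits(subgroups):
    """Three-sigma limits for a Shewhart X-bar chart from rational subgroups
    of equal size (the classical, crisp baseline)."""
    n = len(subgroups[0])
    means = [statistics.mean(g) for g in subgroups]
    grand_mean = statistics.mean(means)
    # pool the within-subgroup variances to estimate the process sigma
    sigma = statistics.mean([statistics.variance(g) for g in subgroups]) ** 0.5
    half_width = 3 * sigma / n ** 0.5
    return grand_mean - half_width, grand_mean, grand_mean + half_width
```

A subgroup mean outside (LCL, UCL) signals a possible assignable cause. Economic design in the Duncan tradition then chooses the subgroup size, sampling interval and limit width to minimise expected cost, rather than fixing three-sigma limits.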
|
58 |
Multivariate Extreme Value Theory with an application to climate data in the Western Cape Province. Bhagwandin, Lipika, January 2017
An understanding of past and current weather conditions can aid in identifying trends and changes that have occurred in weather patterns. This is particularly important as certain weather conditions can have both a positive and a negative impact on various activities in any region. With an ever-changing climate, an upward trend in extreme weather conditions has become markedly noticeable. The aim of this study is to evaluate the efficacy of univariate and multivariate extreme value theory models on climate data in the Western Cape province of South Africa. Data collected since 1965 from five weather stations, viz. Cape Town International Airport, George Airport, Langebaanweg, Plettenberg Bay and Vredendal, were modelled and analysed. In the multivariate analysis, multiple variables are modelled at a single location. Block maxima, threshold excess and point process approaches are used on the weather data, specifically on rainfall, wind speed and temperature maxima. For the block maxima approach, the data are grouped in n-length blocks and the maxima of the blocks form the dataset to be modelled. The threshold excess and point process approaches use a suitably chosen threshold whereby observations above the threshold are considered extreme and therefore form the dataset used in the models. Under the threshold excess approach, only observations that exceed the threshold in all components can be modelled, whereas exceedances in one and in all components simultaneously can be handled by the point process approach. While the probability of experiencing high levels of rainfall, wind speed and temperature individually and jointly is low, a few conclusions were drawn from comparing the performance of the models. It was found that models under the block maxima approach did not perform well in modelling the weather variables at the five stations in both the univariate and multivariate cases, as many useful observations are discarded.
The threshold excess and point process approaches performed better in modelling the weather extremes. Similar results are achieved by these two approaches in the univariate analysis, and there is no outright distinction that favours one approach over the other. In the multivariate case, which is restricted to two variables, the point process approach was able to provide estimates with increased accuracy, as in many cases there are more extremes in one component individually than in both components. Specifically, the negative logistic and negative bilogistic models suitably capture the dependence structure between maximum wind speed versus maximum rainfall and maximum wind speed versus maximum temperature at the five weather stations. The results from the point process models showed very weak dependence between wind speed and rainfall maxima as well as between wind speed and temperature maxima, which may warrant the inclusion of additional variables into the analysis, and even a spatial component, which is not included in this study.
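As an illustration of the block maxima approach discussed above, a minimal sketch follows. The maxima are hypothetical, and for simplicity it uses a moment-based Gumbel fit (the GEV with shape parameter zero) rather than the full GEV likelihood analysis of the thesis:

```python
import math
import statistics

def block_maxima(series, block_len):
    """Split a series into consecutive blocks and keep each block's maximum."""
    return [max(series[i:i + block_len])
            for i in range(0, len(series) - block_len + 1, block_len)]

def gumbel_fit(maxima):
    """Method-of-moments fit of the Gumbel distribution (GEV with shape 0)."""
    beta = statistics.stdev(maxima) * math.sqrt(6) / math.pi    # scale
    mu = statistics.mean(maxima) - 0.5772156649 * beta          # location
    return mu, beta

def return_level(mu, beta, T):
    """Level exceeded on average once every T blocks."""
    return mu - beta * math.log(-math.log(1 - 1 / T))
```

The discarding of data the abstract criticises is visible in `block_maxima`: every observation except the largest in each block is thrown away, which is why the threshold excess and point process approaches, which keep all sufficiently large observations, tend to do better.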
|
59 |
The South African state old age pension : a reconsideration of the effects of the state old age pension on the living arrangements of the elderly in South Africa. Bustin, Lara, January 2006
Includes bibliographical references.
|
60 |
Robust beta estimation and applications. Khan, Mohamed Rezah, January 2003
Bibliography: leaves 92-94. / Modern portfolio theory was developed by Harry Markowitz more than forty years ago and is now considered an indispensable tool in portfolio construction. Sharpe introduced the index models as a simplification of the original Markowitz formulation, since they require fewer parameters to be estimated. One of the premises underlying the Sharpe index model was that the returns of shares and market proxies follow a normal distribution, under which model parameters are best estimated by ordinary least squares (OLS) regression. More recent empirical evidence has, however, cast doubt on the assumption of normality and suggested that market returns tend to be non-normal. If the assumption of normality no longer holds, then OLS or maximum likelihood estimates may no longer produce the best estimates of the parameters and hence may compromise optimal portfolio construction. In addition to the parameter estimation problems, at the time of the initial formulation of the portfolio models managers were not allowed to participate in short sales (selling a share one does not own). This practice is now quite common in most developed markets, and any portfolio formulation model needs to be generalised to allow for it. Empirical studies have shown that share returns and the returns of market proxies do not follow a normal distribution but rather a skewed distribution with long tails. It is the aim of this thesis to explore robust regression procedures, which should be an improvement over OLS when the data do not come from a normal distribution. The robust regression procedures are then used to estimate the parameters of the Sharpe index models, and we examine whether or not they aid the portfolio construction process. The robust procedures are applied to the classical portfolio formulations as well as the generalised models.
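One common robust alternative to OLS for the Sharpe index (market model) regression is iteratively reweighted least squares with Huber weights. The sketch below, on hypothetical return data with one gross outlier, is one such procedure, not necessarily the estimator chosen in the thesis:

```python
import statistics

def huber_beta(market, asset, c=1.345, iters=50):
    """Robust market-model fit r_asset = alpha + beta * r_market via
    iteratively reweighted least squares with Huber weights.
    iters=1 reduces to ordinary least squares (all weights equal)."""
    w = [1.0] * len(market)
    alpha = beta = 0.0
    for _ in range(iters):
        sw = sum(w)
        mx = sum(wi * x for wi, x in zip(w, market)) / sw
        my = sum(wi * y for wi, y in zip(w, asset)) / sw
        sxx = sum(wi * (x - mx) ** 2 for wi, x in zip(w, market))
        sxy = sum(wi * (x - mx) * (y - my)
                  for wi, x, y in zip(w, market, asset))
        beta = sxy / sxx
        alpha = my - beta * mx
        resid = [y - alpha - beta * x for x, y in zip(market, asset)]
        s = 1.4826 * statistics.median(abs(r) for r in resid)  # MAD scale
        if s < 1e-12:
            break
        # Huber weights: full weight inside c*s, downweighted beyond it
        w = [1.0 if abs(r) <= c * s else c * s / abs(r) for r in resid]
    return alpha, beta
```

A single extreme return can drag the OLS beta far from the value that fits the bulk of the data; the Huber weights progressively downweight that observation, so the robust beta stays close to the slope of the remaining points.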
|