201

Bayesian model assessment for stochastic epidemic models

Alharthi, Muteb January 2016
A crucial practical advantage of infectious disease modelling as a public health tool lies in its application to evaluating disease-control policies. However, such evaluation is of limited use unless a sufficiently accurate epidemic model is applied. If the model provides an adequate fit, it is possible to interpret parameter estimates, compare disease epidemics and implement control procedures. Methods to assess and compare stochastic epidemic models in a Bayesian framework are not well established, particularly in epidemic settings with missing data. In this thesis, we develop novel methods for both model adequacy and model choice for stochastic epidemic models. We work with continuous-time epidemic models and assume that only case detection times of infected individuals are available, corresponding to removal times. Throughout, we illustrate our methods using both simulated outbreak data and real disease data. Data-augmented Markov chain Monte Carlo (MCMC) algorithms are employed to make inference for unobserved infection times and model parameters. Under a Bayesian framework, we first conduct a systematic investigation of three different but natural methods of model adequacy for SIR (Susceptible-Infective-Removed) epidemic models. We proceed to develop a new two-stage method for assessing the adequacy of epidemic models. In this two-stage method, two predictive distributions are examined, namely the predictive distribution of the final size of the epidemic and the predictive distribution of the removal times. The idea is based on looking explicitly at the discrepancy between the observed and predicted removal times using the posterior predictive model-checking approach, in which the notions of Bayesian residuals and the posterior predictive p-value are utilized. This approach differs, most importantly, from classical likelihood-based approaches by taking into account uncertainty in both model stochasticity and model parameters. The two-stage method explores how SIR models with different infection mechanisms, infectious periods and population structures can be assessed and distinguished given only a set of removal times. In the last part of this thesis, we consider Bayesian model choice methods for epidemic models. We derive explicit forms for Bayes factors in two different epidemic settings, given complete epidemic data. Additionally, in the setting where the available data are partially observed, we extend the existing power posterior method for estimating Bayes factors to models incorporating missing data and successfully apply our missing-data extension of the power posterior method to various epidemic settings. We further consider the performance of the deviance information criterion (DIC) method to select between epidemic models.
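As a rough illustration of the data structure involved (a sketch under assumed parameter values, not code from the thesis), the following simulates a Markovian SIR epidemic via Gillespie's algorithm, keeps only the removal times, and computes a crude predictive p-value for the final size with the parameters held fixed rather than drawn from an MCMC posterior:

```python
import numpy as np

def simulate_sir_removals(n, beta, gamma, seed=None):
    """Gillespie simulation of a Markovian SIR epidemic in a closed
    population of size n with one initial infective; only the removal
    times (the observed data in this setting) are returned."""
    rng = np.random.default_rng(seed)
    s, i, t = n - 1, 1, 0.0
    removals = []
    while i > 0:
        inf_rate = beta * s * i / n        # rate of new infections
        rem_rate = gamma * i               # rate of removals
        total = inf_rate + rem_rate
        t += rng.exponential(1.0 / total)  # time to next event
        if rng.random() < inf_rate / total:
            s, i = s - 1, i + 1            # infection (unobserved)
        else:
            i -= 1                         # removal (observed)
            removals.append(t)
    return np.array(removals)

# Crude final-size check with (beta, gamma) held fixed; a genuine posterior
# predictive p-value would instead draw them from the MCMC output.
obs = simulate_sir_removals(200, beta=1.5, gamma=1.0, seed=1)
pred = [len(simulate_sir_removals(200, 1.5, 1.0, seed=k)) for k in range(2, 502)]
print(len(obs), np.mean([sz >= len(obs) for sz in pred]))
```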
202

Likelihood ratios in asymptotic statistical theory

Leroux, Brian Gilbert January 1985
This thesis deals with two topics in asymptotic statistics. A concept of asymptotic optimality for sequential tests of statistical hypotheses is introduced. Sequential Probability Ratio Tests are shown to have asymptotic optimality properties corresponding to their usual optimality properties. Secondly, the asymptotic power of Pearson's chi-square test for goodness of fit is derived in a new way. The main tool for evaluating asymptotic performance of tests is the likelihood ratio of two hypotheses. In situations examined here the likelihood ratio based on a sample of size n has a limiting distribution as n → ∞, and the limit is also a likelihood ratio. Limiting values of various performance criteria of statistical tests can therefore be calculated using the limiting likelihood ratio. / Science, Faculty of / Statistics, Department of / Graduate
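For readers unfamiliar with Sequential Probability Ratio Tests, here is a minimal sketch (illustrative only, not from the thesis) of Wald's SPRT for a normal mean with known variance, accumulating the log-likelihood ratio until it crosses one of two thresholds:

```python
import numpy as np
from scipy import stats

def sprt_normal_mean(xs, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: mu = mu0 vs H1: mu = mu1, with normal
    observations of known sigma. Returns (decision, samples used)."""
    upper = np.log((1 - beta) / alpha)   # accept H1 when llr >= upper
    lower = np.log(beta / (1 - alpha))   # accept H0 when llr <= lower
    llr = 0.0
    for n, x in enumerate(xs, start=1):
        llr += stats.norm.logpdf(x, mu1, sigma) - stats.norm.logpdf(x, mu0, sigma)
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "no decision", len(xs)

rng = np.random.default_rng(0)
print(sprt_normal_mean(rng.normal(0.5, 1.0, size=1000), mu0=0.0, mu1=0.5, sigma=1.0))
```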
203

A multivariate analysis of shares listed on the Johannesburg Stock Exchange

Visser, Francesca January 1983
This thesis examines the usefulness of multivariate statistical techniques in portfolio theory by applying two different multivariate techniques to two separate classificatory problems concerning shares listed on the Johannesburg Stock Exchange. In Chapter 1 the two techniques and two classificatory problems are introduced and their context within the general structure of portfolio theory is explained. Chapter 2 gives a theoretical overview of the first technique used, namely Factor Analysis. Chapters 3 and 4 discuss the application of factor analytic techniques to shares listed on the Johannesburg Stock Exchange. Chapter 5 gives a theoretical overview of Multiple Discriminant Analysis, the second multivariate technique used. Chapter 6 presents a survey of previous applications of Multiple Discriminant Analysis in the field of Finance, while Chapters 7 and 8 discuss the application of this technique to shares listed on the Johannesburg Stock Exchange. Finally, Chapter 9 gives a brief summary of the main conclusions of this thesis.
204

Multiscale image representation in deep learning

Stander, Jean-Pierre January 2020
Deep learning is a very popular field of research which can take a variety of data types as input [1, 16, 30]. It is a subfield of machine learning consisting mostly of neural networks. A challenge very commonly met in the training of neural networks, especially when working with images, is the vast amount of data required. Because of this, various data augmentation techniques have been proposed to create more data at low cost while keeping the labelling of the data accurate [65]. When a model is trained on images, these augmentations include rotating, flipping and cropping the images [21]. An added advantage of data augmentation is that it makes the model more robust to rotation and transformation of an object in an image [65]. In this mini-dissertation we investigate the use of the Discrete Pulse Transform [54, 2] decomposition algorithm and its Discrete Pulse Vectors (DPVs) [17] as data augmentation for image classification in deep learning. The DPVs are used to extract features from the image. A convolutional neural network is trained on the original and augmented images and compared with a convolutional neural network trained only on the unaugmented images. The purpose of the models implemented is to correctly classify an image as either a cat or a dog. The training and testing accuracies of the two approaches are similar, but the loss of the model using the proposed data augmentation is improved. When making use of the probabilities predicted by the model and determining a custom cut-off to classify an image into one of the two classes, the model trained using the proposed augmentation outperforms the model trained without it. / Mini Dissertation (MSc (Advanced Data Analytics))--University of Pretoria, 2020. / The financial assistance of the National Research Foundation (NRF) towards this research is hereby acknowledged. Opinions expressed and conclusions arrived at are those of the author and are not necessarily to be attributed to the NRF. / Statistics / MSc (Advanced Data Analytics) / Unrestricted
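A minimal sketch of the conventional augmentations named above (rotation, flipping, cropping) using torchvision; the DPT/DPV-based augmentation investigated in the dissertation is not reproduced here and would enter the pipeline as an additional transform:

```python
from torchvision import transforms

# Standard geometric augmentations; each epoch sees a different random
# variant of every training image. Parameter values are illustrative.
augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomResizedCrop(size=224, scale=(0.8, 1.0)),
    transforms.ToTensor(),
])
# Usage: augmented = augment(pil_image)  # pil_image: a PIL.Image (cat or dog)
```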
205

Bayesian inference of lower percentiles within strength modeling

Van Zyl, Christine Elizabeth January 2021
The study and modeling of strength within materials science has long been of interest in engineering and the built environment, with the Weibull distribution frequently being the model of choice in this area. Oftentimes there is a high cost involved in obtaining enough samples to perform suitable inference, and a Bayesian approach has been shown to provide suitable parameter and confidence-interval estimation from smaller samples. This study considers alternative candidates from a general Weibull family for the data likelihood, and noninformative prior choices are derived for the parameters of the members considered. In addition, some previously unconsidered priors are introduced for consideration with the standard Weibull model. An introductory simulation study is presented and the effect of the alternative prior choices for the standard two-parameter Weibull model is investigated. Real data analysis rounds off the contributions of this study. / Mini Dissertation (MSc (Advanced Data Analytics))--University of Pretoria, 2021. / DSTNRF-SAMRC South African Statistical Association / Statistics / MSc (Advanced Data Analytics) / Restricted
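A minimal sketch of this kind of inference (an illustration, not the study's code): a random-walk Metropolis sampler for the two-parameter Weibull under the common noninformative prior π(k, λ) ∝ 1/(kλ), returning posterior draws of a lower percentile:

```python
import numpy as np

def weibull_loglik(k, lam, x):
    """Log-likelihood of a two-parameter Weibull(shape k, scale lam)."""
    return np.sum(np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam) ** k)

def posterior_percentile(x, p=0.10, n_iter=20000, seed=None):
    """Random-walk Metropolis on (log k, log lam). With the prior
    pi(k, lam) proportional to 1/(k*lam), the Jacobian of the log
    transform cancels the prior, so the log target is the likelihood.
    Returns posterior draws of the p-th percentile lam*(-log(1-p))**(1/k)."""
    rng = np.random.default_rng(seed)
    lk, ll = 0.0, np.log(np.mean(x))      # start at shape 1, scale = sample mean
    cur = weibull_loglik(np.exp(lk), np.exp(ll), x)
    draws = []
    for _ in range(n_iter):
        pk, pl = lk + 0.1 * rng.normal(), ll + 0.1 * rng.normal()
        prop = weibull_loglik(np.exp(pk), np.exp(pl), x)
        if np.log(rng.random()) < prop - cur:   # Metropolis accept/reject
            lk, ll, cur = pk, pl, prop
        draws.append(np.exp(ll) * (-np.log1p(-p)) ** (1.0 / np.exp(lk)))
    return np.array(draws[n_iter // 2:])        # drop burn-in half

x = 10.0 * np.random.default_rng(1).weibull(2.0, size=30)  # small "strength" sample
q10 = posterior_percentile(x)
print(np.percentile(q10, [2.5, 50.0, 97.5]))  # credible interval, 10th percentile
```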
206

On weighted Poisson distributions and processes, with associated inference and applications

Mijburgh, Philip Albert January 2020
In this thesis, weighted Poisson distributions and processes are investigated, as alternatives to Poisson distributions and processes, for the modelling of discrete data. In order to determine whether the use of a weighted Poisson distribution can be theoretically justified over the Poisson, goodness-of-fit tests for Poissonity are examined. In addition to providing an overarching review of the current Poisson goodness-of-fit tests, this research examines how these tests perform when the alternative distribution is indeed realised from a weighted Poisson distribution. Similarly, a series of tests are discussed which can be used to determine whether a sample path is realised from a homogeneous Poisson process. While weighted Poisson distributions and processes have received some attention in the literature, the list of potential weight functions with which they can be augmented is limited. In this thesis 26 new weight functions are presented and their statistical properties are derived in closed form, both in terms of distributions and processes. These new weights allow what were already very flexible models to be applied to a range of new practical situations. In the application sections of the thesis, the new weighted Poisson models are applied to many different discrete datasets. The datasets originate from a wide range of industries and situations. It is shown that the new weight functions lead to weighted Poisson distributions and processes that perform favourably in comparison to the majority of current modelling methodologies. It is demonstrated that the weighted Poisson distribution can not only model data from Poisson, binomial and negative binomial distributions, but also some more complex distributions like the generalised Poisson and COM-Poisson. / Thesis (PhD (Mathematical Statistics))--University of Pretoria, 2020 / UP Postgraduate Research Support Bursary / UP Postgraduate Study Abroad Bursary / STATOMET Bursary. / SASA/NRF Academic Statistics Bursary / Statistics / PhD (Mathematical Statistics) / Unrestricted
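The weighting construction itself is simple: a weighted Poisson places mass proportional to w(k) times the Poisson pmf. A short sketch follows, with an illustrative exponential-tilt weight rather than one of the thesis's 26 weight functions:

```python
import numpy as np
from scipy import stats

def weighted_poisson_pmf(k_max, lam, weight):
    """pmf of a weighted Poisson, p_w(k) proportional to weight(k) times
    the Poisson(lam) pmf, normalised numerically over 0..k_max (which
    must be chosen large enough to capture essentially all the mass)."""
    k = np.arange(k_max + 1)
    unnorm = weight(k) * stats.poisson.pmf(k, lam)
    return k, unnorm / unnorm.sum()

# Illustrative weight w(k) = exp(theta*k); theta = 0 recovers the
# ordinary Poisson, theta > 0 tilts mass toward larger counts.
k, pmf = weighted_poisson_pmf(60, lam=5.0, weight=lambda k: np.exp(0.15 * k))
print(pmf @ k)   # mean of the tilted distribution, pulled above 5
```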
207

A simulation study of the effect of target motion on sighting estimates of minke whale population density

Basson, Marinelle January 1983
Line transect methods used to estimate population density assume stationarity of targets. Violation of this assumption leads to overestimation of the true density. A simulation study based on a hazard-rate model is used to assess the resulting bias. The model is calibrated to generate sighting data resembling real data from minke whale sighting surveys. The procedure currently used to calculate a corrected negative exponential density estimate from sighting data is duplicated using simulated data. The resulting estimates are compared to the true population density determined by the simulation. Results reveal that, in the case considered, the method of calculating the g(0) factor (which corrects for the fact that not all animals on the trackline are sighted) leads to a greater degree of overestimation than the effect of target motion at 3 knots. Shortcomings of the model are pointed out and possible improvements suggested. It is also suggested that further research be focused initially on the calculation of the g(0) correction factor rather than on effects of target motion.
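A stripped-down sketch of this kind of simulation (illustrative values only; the thesis's calibrated model, motion process and negative exponential estimator are not reproduced): stationary targets are detected with a hazard-rate function and density is estimated via the effective strip half-width, giving the unbiased baseline that target motion would then distort:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, b, w = 0.3, 2.5, 1.0        # hazard-rate scale, shape; truncation half-width
L, D_true = 500.0, 40.0            # trackline length; true density

def g(x):
    """Hazard-rate detection function: probability of sighting an animal
    at perpendicular distance x (g -> 1 as x -> 0)."""
    return 1.0 - np.exp(-(x / sigma) ** (-b))

# Stationary animals placed uniformly in the searched strip, thinned by g.
n = rng.poisson(D_true * 2 * w * L)
x = rng.uniform(1e-9, w, size=n)           # perpendicular distances
seen = x[rng.random(n) < g(x)]

# Density estimate via the effective strip half-width mu = integral of g.
xs = np.linspace(1e-6, w, 10000)
mu = w * g(xs).mean()                      # numerical integral of g over [0, w]
D_hat = len(seen) / (2 * mu * L)
print(D_true, round(D_hat, 2))             # close, since these targets are static
```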
208

The address sort and other computer sorting techniques

Underhill, Leslie G January 1971
Originally this project was to have been a feasibility study of the use of computers in the library. It soon became clear that the logical place in the library at which to start making use of the computer was the catalogue. Once the catalogue was in machine-readable form it would be possible to work backwards to the book ordering and acquisitions system and forwards to the circulation and book issue system. One of the big advantages in using the computer to produce the catalogue would be the elimination of the "skilled drudgery" of filing. Thus vast quantities of data would need to be sorted. And thus the scope of this project was narrowed down from a general feasibility study, firstly to a study of a particular section of the library and secondly to one particularly important aspect of that section - that of sorting with the aid of the computer. I have examined many, but by no means all, computer sorting techniques, programmed them in FORTRAN as efficiently as I was able, and compared their performances on the IBM 1130 computer of the University of Cape Town. I have confined myself to internal sorts, i.e. sorts that take place in core. This thesis stops short of applying the best of these techniques to the library. I intend, however, to do so, and to work back to the original scope of my thesis.
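For context, a Python sketch of the address-calculation idea (the thesis's implementations were in FORTRAN; this is an illustration, not a transcription): each key is mapped to an estimated final position in an oversized table, and collisions are resolved by local shifting:

```python
def address_sort(keys, spare=2.0):
    """Address-calculation sort: estimate each key's final position
    ("address") in an oversized table from its value, resolving
    collisions by shifting right. Assumes keys are roughly uniformly
    spread between their extremes; spare > 1 trades memory for speed."""
    if not keys:
        return []
    lo, hi = min(keys), max(keys)
    m = int(spare * len(keys)) + 1
    table = [None] * (m + len(keys))       # headroom for worst-case shifts
    scale = (m - 1) / (hi - lo) if hi > lo else 0.0
    for k in keys:
        i = int((k - lo) * scale)          # estimated address
        while table[i] is not None and table[i] <= k:
            i += 1                         # walk past keys that belong before k
        while table[i] is not None:        # shift larger keys one slot right
            table[i], k = k, table[i]
            i += 1
        table[i] = k
    return [v for v in table if v is not None]

print(address_sort([5, 3, 9, 1, 7, 3]))    # [1, 3, 3, 5, 7, 9]
```

The method approaches linear time when keys are spread evenly, since most keys land at or near their final position; heavily skewed keys degrade it toward insertion-sort behaviour.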
209

An examination of heuristic algorithms for the travelling salesman problem

Höck, Barbar Katja January 1988
The role of heuristics in combinatorial optimization is discussed. Published heuristics for the Travelling Salesman Problem (TSP) were reviewed and morphological boxes were used to develop new heuristics for the TSP. New and published heuristics were programmed for symmetric TSPs where the triangle inequality holds, and were tested on a microcomputer. The best of the quickest heuristics was the furthest insertion heuristic, finding tours 3 to 9% above the best known solutions (2 minutes for 100 nodes). Better results were found by longer-running heuristics, e.g. the cheapest angle heuristic (CCAO), 0 to 6% above best (80 minutes for 100 nodes). The savings heuristic found the best results overall, but took more than 2 hours to complete. Of the new heuristics, the MST path algorithm at times improved on the results of the furthest insertion heuristic while taking the same time as the CCAO. The study indicated that there is little likelihood of improving on present methods unless a fundamentally new approach is discovered. Finally, a case study using TSP heuristics to aid the planning of grid surveys was described.
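A sketch of the furthest insertion heuristic mentioned above (illustrative Python, not the study's code): repeatedly pick the city furthest from the current tour and insert it at the cheapest position:

```python
import numpy as np

def furthest_insertion(d):
    """Furthest-insertion heuristic for a symmetric TSP with distance
    matrix d (triangle inequality assumed). Returns a tour as a list
    of node indices."""
    n = len(d)
    start = int(np.argmax(d.max(axis=1)))   # begin with an extreme pair
    far = int(np.argmax(d[start]))
    tour = [start, far]
    remaining = set(range(n)) - set(tour)
    while remaining:
        # Pick the remaining city furthest from the current tour ...
        k = max(remaining, key=lambda c: min(d[c][t] for t in tour))
        # ... and insert it where it increases tour length the least.
        best = min(range(len(tour)),
                   key=lambda i: d[tour[i]][k] + d[k][tour[(i + 1) % len(tour)]]
                                 - d[tour[i]][tour[(i + 1) % len(tour)]])
        tour.insert(best + 1, k)
        remaining.remove(k)
    return tour

rng = np.random.default_rng(0)
pts = rng.random((30, 2))
d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)   # Euclidean distances
tour = furthest_insertion(d)
print(sum(d[tour[i]][tour[(i + 1) % 30]] for i in range(30)))  # tour length
```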
210

A time series approach to the monetary sector of the South African economy

Dietzsch, Carl Heinrich January 1978
Bibliography: p. 111-114. / This thesis provides an investigation of the applicability of time series analysis to the process of economic model building. Chapter 1 explains the position of the Box-Jenkins approach to time series analysis in relation to other techniques of analysis. In Chapters 2 and 3 the theory of model building is discussed. In Chapter 4 an econometric model is analysed in detail from a time series approach.
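A minimal sketch of the Box-Jenkins cycle on an illustrative simulated series (not the South African monetary data): identify the order of differencing, estimate an ARIMA model, and forecast:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# An illustrative monthly series: a random walk with drift, i.e. one
# differencing step makes it stationary (the "I" in ARIMA).
rng = np.random.default_rng(0)
y = np.cumsum(0.5 + rng.normal(0.0, 1.0, size=240))

# Box-Jenkins: identify (difference once), estimate, check, forecast.
fit = ARIMA(y, order=(1, 1, 1)).fit()
print(fit.summary())
print(fit.forecast(steps=12))   # 12-step-ahead forecasts
```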
