  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
401

Impact of exogenous reinfection on TB infection in a genetically susceptible population.

Mwangi, Wangari Isaac. 17 December 2013 (has links)
In this study we investigated the impact of exogenous reinfection on genetically resistant and genetically sensitive subpopulations. We qualitatively analysed the dynamics of TB under two modes of transmission, homogeneous and heterogeneous. Analytically, we computed the fundamental threshold used to measure disease persistence, the basic reproduction number R₀, and found that the exogenous reinfection parameters do not appear in it. Hence the basic reproduction number derived in the presence of exogenous reinfection does not adequately predict the course of a TB epidemic. We obtained an exogenous reinfection threshold which indicated that exogenous reinfection complicates TB dynamics. Both analytical and simulation results showed that when exogenous reinfection exceeds this threshold, TB dynamics are governed by a backward bifurcation, implying that TB may continue to invade the population even when the basic reproduction number is less than one. We computed the critical value Rᴄ of the basic reproduction number and found that TB can only be eradicated if the basic reproduction number is reduced below Rᴄ. Furthermore, we incorporated TB therapy for individuals with clinically active TB into the heterogeneous model and performed sensitivity and uncertainty analysis using Latin Hypercube Sampling. The results showed that the transmission rates, reactivation rates and the proportion that is genetically resistant greatly influenced the outcome variables of our TB model. / Thesis (M.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2013.
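The abstract's central point, that the exogenous reinfection parameter drops out of R₀, can be illustrated with a minimal ODE sketch. The compartmental structure and all parameter values below are hypothetical simplifications, not the thesis model:

```python
from scipy.integrate import solve_ivp

# Hypothetical parameter values, for illustration only:
# contact rate, reinfection factor, death rate, progression, TB death, recovery, recruitment
beta, p, mu, k, d, r, Lam = 8.0, 0.4, 0.02, 0.05, 0.1, 0.5, 20.0

def tb_rhs(t, y):
    S, E, I, R = y
    N = S + E + I + R
    infection = beta * S * I / N        # primary infection of susceptibles
    reinfection = p * beta * E * I / N  # exogenous reinfection of latents
    return [Lam - infection - mu * S,
            infection - reinfection - (mu + k) * E,
            k * E + reinfection - (mu + d + r) * I,
            r * I - mu * R]

# Next-generation-matrix R0 for this sketch: note the reinfection
# parameter p does not appear, matching the abstract's observation
R0 = beta * k / ((mu + k) * (mu + d + r))

sol = solve_ivp(tb_rhs, (0, 400), [980.0, 10.0, 10.0, 0.0])
print(R0, sol.y[2, -1])
```

The linearisation at the disease-free equilibrium has E = 0, so the quadratic reinfection term contributes nothing to R₀; it only shapes the dynamics away from that equilibrium, which is where the backward bifurcation arises.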
402

Mathematical analysis of tuberculosis vaccine models and control strategies.

Sithole, Hloniphile. 20 October 2014 (has links)
The epidemiological study of tuberculosis (TB) has been ongoing for several decades, but the most effective control strategy is yet to be completely understood. The basic reproduction number, R₀, has been found to be a plausible indicator of the TB control rate: it is the average number of secondary TB cases produced by a typical infective individual in a completely susceptible population during its entire infectious period. In this study we develop two SEIR models for TB transmission, one involving treatment of active TB only, the second incorporating both active TB treatment and post-exposure prophylaxis (PEP) treatment for latent TB. Using the next generation matrix method we obtain R₀. We determine the disease free equilibrium (DFE) point and the endemic equilibrium (EE) point. Global stability conditions for the DFE are determined using the Castillo-Chavez theorem. Through analysis of the reproduction number we find that for R₀ < 1 the infection will die out, while R₀ > 1 implies that the disease will spread within the population. Through stability analysis we show that the model exhibits backward bifurcation, a phenomenon allowing multiple stable states for fixed model parameter values. The MATLAB ode45 solver was used to simulate the model numerically. Latin Hypercube Sampling showed that the model is sensitive to the treatment and disease transmission parameters, suggesting that to control the disease, more emphasis should be placed on treatment and on reducing TB transmission. For the second model, which incorporated treatment with post-exposure prophylaxis for latently infected individuals, simulations showed that treatment of latently infected individuals may reduce R₀, and that it may be better to introduce a hybrid of active treatment and post-exposure treatment of the latent class.
The force of infection was found to be reduced when this hybrid control strategy is present. Contour plots and PRCC values highlighted the important parameters that influence the size of the infective class. The implication of these findings is that TB control measures should emphasise treatment. Our simplified models assume homogeneous mixing and have not been validated against empirical data. / M.Sc. University of KwaZulu-Natal, Pietermaritzburg, 2014.
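The Latin Hypercube Sampling and PRCC workflow mentioned above can be sketched as follows. A toy outcome function R₀ = β/γ stands in for a full model run, and the parameter ranges are illustrative, not taken from the thesis:

```python
import numpy as np
from scipy.stats import qmc, rankdata

# Toy outcome standing in for a full model simulation
def outcome(beta, gamma):
    return beta / gamma

# Latin Hypercube Sample over assumed (illustrative) parameter ranges
sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=200)
scaled = qmc.scale(unit, [0.1, 0.05], [1.0, 0.5])
beta, gamma = scaled[:, 0], scaled[:, 1]
y = outcome(beta, gamma)

def prcc(x, y, others):
    """Partial rank correlation of x with y, controlling for 'others'."""
    def resid(v):
        r = rankdata(v).astype(float)
        Z = np.column_stack([np.ones_like(r)] + [rankdata(o) for o in others])
        coef, *_ = np.linalg.lstsq(Z, r, rcond=None)
        return r - Z @ coef
    rx, ry = resid(x), resid(y)
    return float(np.corrcoef(rx, ry)[0, 1])

print(prcc(beta, y, [gamma]), prcc(gamma, y, [beta]))
```

A PRCC near +1 or −1 flags a parameter whose variation strongly drives the outcome after rank-transforming away the influence of the others, which is how the "important parameters" in the abstract are identified.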
403

Comparing feature selection algorithms using microarray data

Law, Timothy Tao Hin 22 December 2008 (has links)
In this thesis, three feature selection methods, LASSO, SLR, and SMLR, were tested and compared using microarray fold-change data. Two real datasets were first used to investigate and compare the ability of the algorithms to select feature genes in data with two conditions. SMLR was found to be quite sensitive to its parameter, and was more accurate than SLR and LASSO in selecting differentially expressed genes. In addition, the model coefficients generated by SMLR were closely related to the magnitude of the fold changes, and SMLR also successfully selected differentially expressed genes in data with more than two conditions. The results from simulation experiments agreed with those from the real dataset experiments. Additionally, different proportions of differentially expressed genes in the data did not affect the performance of LASSO and SLR, but the number of genes selected by SMLR increased with the proportion of regulated genes. The number of genes selected by SMLR, both correctly and incorrectly, also increased with the number of replicates used to build the model. Furthermore, SMLR performed the best in identifying future treatment samples.
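As a rough illustration of LASSO-style gene selection in the "few samples, many genes" setting, here is a minimal coordinate-descent LASSO on synthetic data. This is not the thesis code; the penalty value, data dimensions, and effect sizes are arbitrary:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Minimal coordinate-descent LASSO with soft-thresholding (illustrative)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding feature j, then soft-threshold
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(42)
n, p = 40, 100                       # few arrays, many genes
X = rng.normal(size=(n, p))
true = np.zeros(p)
true[:5] = 3.0                       # five differentially expressed genes
y = X @ true + rng.normal(scale=0.5, size=n)

selected = np.flatnonzero(lasso_cd(X, y, lam=10.0))
print(selected)
```

The L1 penalty drives most coefficients exactly to zero, so the nonzero set is the "selected genes"; tuning `lam` trades false positives against missed genes, which is the sensitivity the abstract discusses for SMLR's parameter.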
404

The circular chromatic number of hypergraphs

Shepherd, Laura Margret Diane January 2005 (has links)
A generalization of the circular chromatic number to hypergraphs is developed. Circular colourings of graphs and hypergraphs are first discussed and it is shown that the circular chromatic number of a graph is the same regardless of whether the hypergraph or graph definition is used. After presenting a few basic results, some examples of circular chromatic numbers of various families of hypergraphs are given. Subsequently, the concepts of the star chromatic number and the arc chromatic number are introduced. Specifically, both numbers are shown to be equivalent to the circular chromatic number. Finally the relationship between the imbalance of a hypergraph and the circular chromatic number is explored and a classical result of Minty is deduced.
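For graphs, the circular chromatic number can be computed by brute force on small instances via (k, d)-colourings. The sketch below assumes the standard definition, that every edge uv satisfies d ≤ |c(u) − c(v)| ≤ k − d, and recovers the known value 5/2 for the odd 5-cycle:

```python
from fractions import Fraction
from itertools import product

def has_kd_coloring(edges, n, k, d):
    """Brute-force search for a (k, d)-colouring: every edge uv must
    satisfy d <= |c(u) - c(v)| <= k - d."""
    for c in product(range(k), repeat=n):
        if all(d <= abs(c[u] - c[v]) <= k - d for u, v in edges):
            return True
    return False

def circular_chromatic(edges, n, max_k=6):
    """Minimum k/d over all (k, d)-colourings with k <= max_k."""
    best = None
    for k in range(1, max_k + 1):
        for d in range(1, k + 1):
            ratio = Fraction(k, d)
            if (best is None or ratio < best) and has_kd_coloring(edges, n, k, d):
                best = ratio
    return best

C5 = [(i, (i + 1) % 5) for i in range(5)]
print(circular_chromatic(C5, 5))  # 5/2: the odd 5-cycle
```

Capping k at `max_k` is only valid when the true optimum has a small denominator; for C5 the colouring (0, 2, 4, 1, 3) witnesses the (5, 2)-colouring found here.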
405

Disjoint union-free 3-uniform hypergraphs

Howard, Leah January 2006 (has links)
A k-uniform hypergraph H = (X, B) of order n is a family B of k-subsets of an n-set X. A k-uniform hypergraph H = (X, B) is disjoint union-free (DUF) if all disjoint pairs of elements of B have distinct unions; that is, if for every A, B, C, D ∈ B, A ∩ B = C ∩ D = ∅ and A ∪ B = C ∪ D implies {A, B} = {C, D}. DUF families of maximum size have been studied by Erdős and Füredi, and in the case k = 3 this maximum size has been conjectured to equal (n choose 2). In this thesis, we study DUF 3-uniform hypergraphs with the main goals of presenting evidence to support this conjecture and studying the structures that have conjectured maximum size. If each pair of distinct elements of X is covered exactly λ times in B then we call H = (X, B) an (n, k, λ)-design. Using a blend of graph- and design-theoretic techniques, we study the DUF (n, 3, 3)-designs that are the conjectured unique structures having maximum size. Central results of this thesis include substantially improving lower bounds on the maximum size for a large class of n, giving conditions on pair coverage in a DUF 3-uniform hypergraph that force an (n, 3, 3)-design, and providing constructions for DUF 3-uniform hypergraphs from families of DUF hypergraphs with smaller orders. Let H = (X, B) be a DUF k-uniform hypergraph with the property that H ∪ {E} is not DUF for any k-subset E of X not already in H. Then H is maximally DUF. We introduce the problem of finding the minimum size of maximally DUF families and provide bounds on this quantity for k = 3.
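The DUF property itself is easy to verify mechanically. A minimal checker, with a small non-DUF family constructed for contrast, might look like:

```python
from itertools import combinations

def is_duf(blocks):
    """Return True if the family is disjoint union-free: two disjoint
    pairs of blocks with equal unions must be the same pair."""
    blocks = [frozenset(b) for b in blocks]
    unions = {}
    for A, B in combinations(blocks, 2):
        if A & B:
            continue              # only disjoint pairs matter
        u = A | B
        if u in unions and unions[u] != {A, B}:
            return False
        unions[u] = {A, B}
    return True

# Fano plane triples: any two lines meet, so the family is trivially DUF
fano = [{0,1,2}, {0,3,4}, {0,5,6}, {1,3,5}, {1,4,6}, {2,3,6}, {2,4,5}]
# Two distinct disjoint pairs with the same union {1,...,6}: not DUF
bad = [{1,2,3}, {4,5,6}, {1,2,4}, {3,5,6}]
print(is_duf(fano), is_duf(bad))  # True False
```

Indexing the unions in a dictionary keeps the check near-quadratic in the number of blocks, which is enough for experimenting with small candidate families.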
406

Robust designs for the one-way random effects model using Q-estimators

Yang, Xiaolong 04 December 2009 (has links)
Robust statistics is an extension of classical parametric statistics which provides a safeguard against gross errors in experiments. Specifically, the robustness properties of Uhlig's Q-estimators are examined and compared with those of Rocke's M-estimators. In particular, the finite-sample implosion and explosion breakdown points are investigated and used in constructing robust designs for the one-way random effects model. Optimal robust designs based on Uhlig's Q-estimation are similar to the ones based on Rocke's M-estimation. Ultimately, robust estimation procedures would provide steady and reliable estimates of model parameters in the event of outliers.
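As an illustration of a high-breakdown scale estimator of the same flavour (this is the Rousseeuw-Croux Qn, used here as a stand-in, not Uhlig's Q-estimator itself), the following sketch shows such an estimator resisting gross outliers where the classical standard deviation explodes:

```python
import numpy as np
from itertools import combinations

def qn_scale(x):
    """Rousseeuw-Croux Qn: an order statistic of pairwise distances with a
    roughly 50% breakdown point (illustrative stand-in for Q-estimation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diffs = sorted(abs(a - b) for a, b in combinations(x, 2))
    h = n // 2 + 1
    kth = h * (h - 1) // 2            # k-th smallest pairwise |xi - xj|
    return 2.2219 * diffs[kth - 1]    # 2.2219: consistency factor at the normal

rng = np.random.default_rng(3)
clean = rng.normal(scale=1.0, size=200)
contaminated = np.concatenate([clean, np.full(40, 50.0)])  # ~17% gross errors
print(qn_scale(clean), qn_scale(contaminated), np.std(contaminated))
```

Because Qn is a low order statistic of pairwise distances, a minority of wild values cannot "explode" it, which is exactly the breakdown behaviour the thesis studies for design construction.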
407

Implementation of Bayesian methods in the pharmaceutical industry

Grieve, Andrew P. January 1992 (has links)
This thesis is concerned primarily with the practical implementation of Bayesian methodology within the context of the pharmaceutical industry. The implementation includes the development, where appropriate, of analytic approximations to the posterior distributions of interest, and of graphical methods for mapping prior assumptions to posterior inference. Two areas of pharmaceutical research, critical in the sense of the controversy they have aroused, have been investigated. First, Bayesian methods were developed for the analysis of two-treatment crossover designs, which fell into disfavour in the late 1970s and early 1980s because of the US Food and Drug Administration's published view that the two-treatment two-period design was not the design of first choice if unequivocal evidence of a treatment effect was required. Each type of design considered, and for which methods are developed, is illustrated with examples from clinical trials already reported in the medical literature. Second, a Bayesian method is developed to classify test compounds into one of several toxicity classes on the basis of an LD50 estimate. The method is generalised to deal with a non-standard LD50 problem related to the prediction of results from a future LD50 experiment. Both applications arose out of practical consultancy within a statistics group in the chemical/pharmaceutical industry. As part of the methods required for these analyses, the zeros and weights associated with some non-standard orthogonal polynomials are derived, from which a new asymptotic expansion of the Behrens-Fisher density is developed. Further applications of the polynomials orthogonal to t-kernels are developed, including problems associated with prediction in clinical trials.
A FORTRAN program which has been implemented at laboratory level within the pharmaceutical toxicology department at CIBA-GEIGY in Switzerland is provided, as are SAS programs for a variety of the analyses developed for the two-treatment crossover designs and for determining the zeros and weights of a number of different classes of orthogonal polynomials.
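A toy version of the LD50 classification idea can be sketched as a grid-approximation posterior for log LD50 under a logistic dose-response. The known slope, the dose-mortality data, and the class cut-points below are all assumptions for illustration, not the thesis's method or data:

```python
import numpy as np

# Hypothetical dose-mortality data: dose (mg/kg), animals tested, deaths
doses = np.array([10.0, 50.0, 100.0, 200.0, 500.0])
n_animals = np.array([5, 5, 5, 5, 5])
deaths = np.array([0, 1, 2, 4, 5])

b = 2.0  # assumed known slope of the dose-response (a simplification)
grid = np.linspace(np.log(5.0), np.log(1000.0), 400)  # grid over log LD50

def log_lik(m):
    p = 1.0 / (1.0 + np.exp(-b * (np.log(doses) - m)))  # P(death) per dose
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return np.sum(deaths * np.log(p) + (n_animals - deaths) * np.log(1 - p))

log_post = np.array([log_lik(m) for m in grid])  # flat prior on log LD50
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Illustrative class cut-points on the LD50 scale (mg/kg)
classes = {"very toxic": (0.0, 25.0), "toxic": (25.0, 200.0),
           "harmful": (200.0, 2000.0)}
ld50 = np.exp(grid)
probs = {name: float(post[(ld50 > lo) & (ld50 <= hi)].sum())
         for name, (lo, hi) in classes.items()}
print(probs)
```

Classifying by posterior class probabilities, rather than by a point estimate of LD50 alone, carries the estimate's uncertainty through to the toxicity label, which is the Bayesian advantage the abstract alludes to.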
408

Modelling and monitoring of medical time series

Gordon, Kerry January 1986 (has links)
In this thesis we examine several extensions to the dynamic linear model framework outlined by Harrison and Stevens (1976), in order to adapt these models for use in the on-line analysis of medical time series arising from routine clinical settings. The situation with which we are most concerned is monitoring individual patients in order to detect abrupt changes in a patient's condition as soon as possible. A detailed background to the study and application of dynamic linear models is given, and other techniques for time series monitoring are discussed where appropriate. We present a selection of specific models that may prove to be of practical use in the modelling and monitoring of medical time series, and we illustrate how these models may be used to distinguish between a variety of alternative changepoint types. The sensitivity of these models to the specification of prior information is examined in detail. The medical background to the time series examined requires the development of models and techniques for analysing time series that are in general unequally spaced. We test the performance of the resulting models and techniques using simulated data. We then build a framework for bivariate time series modelling, again allowing for the possibility of unequally spaced data; in particular, we suggest mechanisms whereby causality and feedback may be introduced into such models. Finally, we report on several applications of this methodology to actual medical time series arising in various contexts, including kidney and bone-marrow transplantation and foetal heart monitoring.
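The monitoring idea for unequally spaced series can be sketched with a local-level dynamic linear model whose state variance grows with the gap between observations. The parameter values and the flagging threshold below are illustrative, not taken from the thesis:

```python
import numpy as np

def monitor(times, obs, q=0.05, r=1.0, threshold=3.0):
    """One-step monitoring with a local-level DLM: the prior variance is
    inflated by the elapsed time, handling unequally spaced observations."""
    m, C = obs[0], r                  # initial level estimate and uncertainty
    flags = []
    for i in range(1, len(obs)):
        dt = times[i] - times[i - 1]
        R = C + q * dt                # prior variance grown over the gap
        Q = R + r                     # one-step forecast variance
        e = obs[i] - m                # forecast error
        if abs(e) / np.sqrt(Q) > threshold:
            flags.append(float(times[i]))   # standardized error too large
        K = R / Q                     # Kalman gain
        m = m + K * e
        C = (1 - K) * R
    return flags

t = np.array([0.0, 1.0, 2.0, 5.0, 6.0, 7.0, 8.0, 9.0])         # unequal gaps
y = np.array([10.0, 10.2, 9.9, 10.1, 10.0, 16.0, 16.2, 15.9])  # jump at t = 7
print(monitor(t, y))  # flags the abrupt level change
```

Flagging on the standardized one-step forecast error is the simplest of the changepoint detectors the DLM framework supports; the multi-state Harrison-Stevens scheme replaces the hard threshold with posterior probabilities over changepoint types.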
409

The viability of Weibull analysis of small samples in process manufacturing

Abughazaleh, Tareq Ali Ibrahim January 2002 (has links)
This research deals with Statistical Quality Control (SQC) methods used in quality testing. It investigates the problems encountered with statistical process control (SPC) tools when small sample sizes are used. Small-sample testing is a growing area of concern, especially for expensive (or large) products produced in small batches (low-volume production). A critical review of current technologies and methods in SPC showed that small-sample testing does not conform well to conventional SPC techniques: the confidence limits for averages and standard deviations are too wide, so results obtained from such sample sizes are insecure and lack accuracy. This research demonstrates these problems in manufacturing through examples, showing the difficulties faced with conventional SPC tools (control charts). The Weibull distribution has consistently given clear and acceptable predictions of failure and life behaviour for small-sample batches, and using it makes the required accuracy attainable with small samples. Whereas conventional control charts generate inaccurate confidence limits with small samples, Weibull theory suggests that accurate confidence limits can be achieved even with small samples. This research highlights these two aspects and explains their features in depth. The Weibull distribution is modified to overcome the problems encountered when small sample sizes are used, and this work shows its viability as a quality tool for constructing new control charts which provide accurate results and detect nonconformance and variability with small sample sizes.
The new proposed Weibull-deduction control charts are therefore shown to be a successful replacement for conventional control charts, compensating for the errors in quality testing that arise when small samples are used.
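A sketch of the Weibull-based approach: fit a two-parameter Weibull to a small sample and read control-style limits off its percentiles rather than normal theory. The sample size, true parameters, and percentile choices are illustrative assumptions:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
# A small batch of 8 lifetimes from an assumed Weibull process (illustrative)
sample = weibull_min.rvs(2.0, scale=100.0, size=8, random_state=rng)

# Two-parameter Weibull fit: location pinned at zero
shape, loc, scale = weibull_min.fit(sample, floc=0)

# Control-style limits from Weibull percentiles rather than normal theory
lcl = weibull_min.ppf(0.001, shape, scale=scale)
ucl = weibull_min.ppf(0.999, shape, scale=scale)
print(shape, scale, lcl, ucl)
```

Because the limits come from the fitted skewed distribution itself, the lower limit stays non-negative and the chart does not assume the symmetric sampling behaviour that conventional small-sample control limits rely on.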
410

Empirical Bayes methods for DNA microarray data

Lönnstedt, Ingrid January 2005 (has links)
Dissertation (summary), Uppsala University, 2005. / Accompanied by 4 papers.
