31

Fitting some Families of Contagious Distributions to Biological and Accident Data

Lee, Yung-sung 01 May 1971 (has links)
Four families of contagious distributions--generalized Poisson distributions, generalized binomial distributions, generalized Pascal distributions, and generalized log-zero distributions--are investigated in this thesis. The family of generalized Poisson distributions contains five distributions: the Neyman Type A, the "Short," the Poisson binomial, the Poisson Pascal, and the negative binomial. The family of generalized binomial distributions contains eight distributions: the binomial Poisson, the binomial binomial, the binomial Pascal, the binomial log-zero, the Poisson with zeros, the binomial with zeros, the Pascal with zeros, and the log-zero with zeros. The family of generalized Pascal distributions contains four distributions: the Pascal Poisson, the Pascal binomial, the Pascal Pascal, and the Pascal log-zero. The family of generalized log-zero distributions contains four distributions: the log-zero Poisson, the log-zero binomial, the log-zero Pascal, and the log-zero log-zero. For each family of contagious distributions, the common probability generating function based on a biological model is derived by application of Feller's compound distribution theorem and Gurland's generalized distribution terminology. The common recurrence relation and the common factorial moments or cumulants are derived from the common probability generating function by using the successive differentiation method. Then for each distribution within this family, the particular probability generating function, recurrence relation, and factorial moments or cumulants are easily obtained from common ones. The equations of factorial moments or cumulants are solved. The maximum likelihood equations are derived for some distributions which have been shown to provide a good or excellent moment fitting. 
These equations are solved by an iteration procedure. Except for the Neyman Type A distribution and the "Short" distribution, for which the maximum likelihood equations are derived from the probability generating functions and solved by the method of scoring, the maximum likelihood equations are derived from the probability functions and solved by the Newton-Raphson method. Forty sets of biological and accident data classified into five types have been collected from various sources. A Fortran program has been written for fitting each distribution, and a numerical example is given to illustrate the fitting procedure. In comparing the fits among these distributions, the chi-square goodness-of-fit values have been calculated and tabulated. The results suggest that the binomial distribution with zeros and the Pascal distribution with zeros be used if one is to describe empirical data arising from populations having a contagious character. This is not only because the two distributions have provided better fits to all five types of data, but also because their maximum likelihood estimation procedures do not share the common disadvantages of the other distributions: that not every moment estimate allows the iteration process to converge, and that the probabilities must be recalculated after each iteration.
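The Newton-Raphson step used for the maximum likelihood equations can be sketched generically. The snippet below is a minimal illustration, not the thesis's Fortran program: it solves a single-parameter likelihood equation (here, for a hypothetical zero-truncated Poisson sample with mean 2.5) by Newton-Raphson iteration.

```python
import math

def newton_raphson(f, fprime, x0, tol=1e-10, max_iter=100):
    """Generic Newton-Raphson iteration for solving f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Hypothetical example: the likelihood equation for the rate of a
# zero-truncated Poisson with sample mean 2.5 is lam / (1 - exp(-lam)) = 2.5.
xbar = 2.5
f = lambda lam: lam / (1.0 - math.exp(-lam)) - xbar
fprime = lambda lam: ((1.0 - math.exp(-lam)) - lam * math.exp(-lam)) / (1.0 - math.exp(-lam)) ** 2

lam_hat = newton_raphson(f, fprime, x0=xbar)
```

Convergence is not guaranteed from an arbitrary starting point, which is why the abstract notes that a good moment estimate matters: it supplies the initial value for the iteration.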
32

Spectral Analysis of Time-Series Associated with Control Systems

Smith, Karl Leland 01 May 1965 (has links)
The progress of science is based to a large degree on experimentation. The scientist, engineer, or researcher is usually interested in the results of a single experiment only to the extent that he hopes to generalize the results to a class of similar experiments associated with an underlying phenomenon. The process by which this is done is called inductive inference and is always subject to uncertainty. The science of statistical inference can be used to make inductive inferences for which the degree of uncertainty can be measured in terms of probability. A second type of inference, called deductive inference, is conclusive. If the premises are true, deductive inference leads to true conclusions. Proving the theorems of mathematics is an example of deductive inference; in the empirical sciences, inductive inference is used to find new knowledge. In engineering and physical science, analytical, i.e., deterministic, techniques have been developed to provide deductive descriptions of the real world. Sometimes the assumptions required to make deterministic techniques appropriate are too restrictive, since no provision is made for the stochastic elements, or uncertainty, involved in real world situations. In these situations, the science of statistics provides a basis for generalizing the results of experiments associated with the phenomena of interest. In order to make statistical inference sound, the experimenter must decide in advance which factors must be controlled in the experiment. The factors which are unknown or which cannot be controlled must be controlled by the device of randomization. Uncontrolled factors express themselves as experimental error in the experiment.
Randomization is used to ensure that the experimental error satisfies the probability requirements specified in the statistical model for the experiment, thereby making it possible for the experimenter to generalize the results of his experiment using significance and confidence probability statements. Much of statistics is devoted to situations in which experiments are conducted according to schemes of restricted randomization. The experimental errors are then independent and are assumed to have a common, yet unknown, probability distribution that can be characterized by estimating the mean and the variance. However, there are certain other types of experimental situations for which it is desirable to observe a physical phenomenon with the observations ordered in time or space. The resulting observations can be called a time series. The experimental errors of a time series are likely to be correlated. Consequently, if an unknown probability distribution is to be characterized, covariances as well as the respective means and variances must be estimated. A time series resulting from observation of a given physical phenomenon may exhibit dominant deterministic properties if the experiment can be well controlled. Or, the time series may exhibit dominant statistical properties if it is impossible or impractical to isolate and control various influencing factors. Generally, in a real world situation, an experiment will consist of both deterministic and statistical elements in some degree. The procedures of analysis presented in Chapter III consider the statistical analysis of periodic and aperiodic digital (discrete) time series, in both the time and frequency domains, using Fourier analysis, covariance and correlation analysis, and the estimation of power and cross power spectral density functions. Time ordered observations are important in the analysis of engineering systems.
Certain characteristics of engineering systems are discussed in Chapter IV, and the input-output concept of control system engineering is introduced. The input-output technique is not limited to control system engineering problems, but may be applicable in other areas of science as well. A deterministic method of ascertaining the output performance of an engineering system consists of subjecting the system to a sinusoidal input function of time, and then measuring the output function of time. If the engineering system is linear, well-developed techniques are available for analysis; but if the system is nonlinear, then more specialized analysis procedures must be developed for specific problems. In a broad sense, the frequency-response approach consists of investigating the output of a linear system subjected to sinusoidal oscillations of the input. If the system is nonlinear, then the frequency-response approach must be modified; one such modification is the describing function technique. These techniques are also discussed in Chapter IV. Under actual experimental conditions, the deterministic approach of subjecting a system to a sinusoidal input function for purposes of analysis is likely to be complicated by nonlinearities of the system and statistical characteristics of the data. The physical characteristics of the data will undoubtedly be obscured by random measuring errors introduced by transducers and recording devices, and by uncontrollable environmental and manufacturing influences. Consequently, generalized procedures for analyzing nonlinear systems in the presence of statistical variation are likely to be required to estimate the input-output characteristics if one is to work with inferential models applied to recorded data. Such procedures are presented in Chapter III and Chapter V.
In Chapter V, the empirical determination from input-output rocket test data of a deterministic and statistical model for predicting rocket nozzle control system requirements is complicated by the fact that the control system is nonlinear and the nozzle data are non-stationary, consisting of both systematic and random variation. The analysis techniques developed are general enough for analysis of other types of nonlinear systems. If the nonlinear effect of coulomb friction can be estimated and the responses are adjusted accordingly, the nozzle system bears a close relationship to a linear second order differential equation consisting of an acceleration times moment of inertia component, a gas dynamic spring component, and a viscous friction component. In addition, vibration loading is present in the data. Consequently, estimation of autocorrelation and power spectral density functions is used to isolate these vibrations. Analysis of the control system data is also considered in terms of autocorrelations and power spectral density functions. Random input functions, rather than sinusoidal input functions, may be required under more general experimental conditions. Chapter VI numerically illustrates the analysis procedures. The actual rocket test data used in developing the analysis was classified; consequently, only fictitious data are used in this paper to illustrate the procedures. Chapter VII is concerned with illustrating the procedures of Chapter III utilizing various time series data. The last part of Chapter VII is concerned with estimation of the power spectral function using techniques of multiple regression, i.e., the model of the General Linear Hypothesis. A definite limitation is the model assumption concerning the residual error of the model. The assumption concerning the error of the model can probably be made more tenable by suitable transformation of either the original time series data or the autocovariances.
In any event, the spectral function developed by assuming the model for the General Linear Hypothesis gives the same spectral function as defined in Chapter III. However, such quantities as the variance of the spectral function can now be estimated, and tests of hypotheses performed, if the assumptions concerning residual error are valid. Chapter VIII summarizes the results of previous chapters.
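As a rough illustration of the time- and frequency-domain estimation discussed above, the sketch below computes a sample autocovariance and a raw periodogram for a discrete time series. It is a minimal stdlib-only sketch, not the procedures of Chapter III; in practice the raw periodogram would be smoothed or windowed to obtain a consistent power spectral density estimate.

```python
import math

def autocovariance(x, max_lag):
    """Sample autocovariances of a series at lags 0..max_lag."""
    n = len(x)
    m = sum(x) / n
    return [sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / n
            for k in range(max_lag + 1)]

def periodogram(x):
    """Raw periodogram: squared DFT magnitude of the mean-centered
    series, divided by n, at frequencies j/n for j = 0..n//2."""
    n = len(x)
    m = sum(x) / n
    xc = [v - m for v in x]
    out = []
    for j in range(n // 2 + 1):
        re = sum(xc[t] * math.cos(2 * math.pi * j * t / n) for t in range(n))
        im = sum(xc[t] * math.sin(2 * math.pi * j * t / n) for t in range(n))
        out.append((re * re + im * im) / n)
    return out
```

A pure sinusoid at frequency j/n produces a single spike at index j, which is the discrete analogue of the frequency-response idea: periodic structure in the series concentrates power at its driving frequencies, while correlated noise spreads power across the spectrum.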
33

A Test for Determining an Appropriate Model For Accelerated Life Data

Chen, Yuan-Who 01 May 1987 (has links)
The purpose of this thesis was to evaluate a method for testing the appropriateness of an accelerated life model. This method is based upon a polynomial approximation; the polynomial's parameters are estimated and used to test the appropriateness of the model. An example illustrates the polynomial method, and the method is applied to real data. Comparison with another method demonstrates that the polynomial method is much simpler and has comparable accuracy.
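The parameter-estimation step of a polynomial approximation might be sketched as an ordinary least-squares polynomial fit. The code below is a generic illustration with hypothetical data, assuming a Vandermonde design matrix and the normal equations; the thesis's actual model and test statistic are not reproduced here.

```python
def polyfit_normal(x, y, degree):
    """Least-squares polynomial fit by solving the normal equations
    with Gaussian elimination (no external libraries)."""
    m = degree + 1
    # Build X^T X and X^T y for the Vandermonde design matrix.
    A = [[sum(xi ** (i + j) for xi in x) for j in range(m)] for i in range(m)]
    b = [sum((xi ** i) * yi for xi, yi in zip(x, y)) for i in range(m)]
    # Forward elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coef = [0.0] * m
    for i in reversed(range(m)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, m))) / A[i][i]
    return coef  # coef[i] multiplies x**i

# Hypothetical data generated from y = 2 + 3x + x^2 is recovered exactly:
coef = polyfit_normal([0, 1, 2, 3, 4], [2, 6, 12, 20, 30], degree=2)
```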
34

The Use of Contingency Table Analysis as a Robust Technique for Analysis of Variance

Chiu, Mei-Eing 01 May 1982 (has links)
The purpose of this paper is to compare Analysis of Variance with Contingency Table Analysis when the data being analyzed do not satisfy the Analysis of Variance assumptions. The criteria for comparison are the powers of the standard variance-ratio and chi-square tests. The test statistics and powers were obtained by Monte Carlo simulation: (1) the test statistic was calculated for each of 100 trials, and this process was repeated 12 times, each time with a different combination of means and variances; (2) powers were then obtained for each of the 12 combinations of means and variances. Whether Analysis of Variance or Contingency Table Analysis is the better alternative depends on whether we are interested in equality of population means or in differences of population variances.
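A Monte Carlo power calculation of this kind can be sketched as follows. This is a generic illustration with hypothetical group means and normal errors, using a simulated null distribution to set the critical value; it is not the author's exact design of 100 trials across 12 configurations.

```python
import random
from statistics import mean

def f_statistic(groups):
    """One-way ANOVA variance-ratio (F) statistic."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))

def mc_power(means, sampler, trials=500, n=20, alpha=0.05, seed=1):
    """Monte Carlo power: simulate the null to get a critical value,
    then count rejections under the alternative given by `means`."""
    rng = random.Random(seed)
    null = sorted(f_statistic([[sampler(rng) for _ in range(n)] for _ in means])
                  for _ in range(trials))
    crit = null[int((1 - alpha) * trials)]
    hits = sum(f_statistic([[m + sampler(rng) for _ in range(n)] for m in means]) > crit
               for _ in range(trials))
    return hits / trials
```

Swapping `sampler` for a skewed or heavy-tailed error generator is how a robustness study of this kind probes behavior when the Analysis of Variance assumptions fail.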
35

Sensitivity Analyses for Tumor Growth Models

Mendis, Ruchini Dilinika 01 April 2019 (has links)
This study consists of sensitivity analyses for two previously developed tumor growth models: the Gompertz model and the quotient model. The two models are considered in both continuous and discrete time. In continuous time, model parameters are estimated using the least-squares method, while in discrete time, the partial-sum method is used. Moreover, frequentist and Bayesian methods are used to construct confidence intervals and credible intervals for the model parameters. We apply Markov Chain Monte Carlo (MCMC) techniques, namely the Random Walk Metropolis algorithm with a non-informative prior and the Delayed Rejection Adaptive Metropolis (DRAM) algorithm, to construct the parameters' posterior distributions and then obtain credible intervals.
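A minimal sketch of the Random Walk Metropolis step is shown below, assuming a toy posterior (flat prior with a unit-variance Gaussian likelihood centred at 3, so the posterior is N(3, 1)); the tumor-model likelihoods and the DRAM refinements are not reproduced here.

```python
import math
import random

def random_walk_metropolis(log_post, x0, step, n_samples, burn=1000, seed=0):
    """Random Walk Metropolis: Gaussian proposals, accept with
    probability min(1, exp(log_post(cand) - log_post(x)))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for i in range(burn + n_samples):
        cand = x + rng.gauss(0, step)
        lp_cand = log_post(cand)
        if lp_cand >= lp or rng.random() < math.exp(lp_cand - lp):
            x, lp = cand, lp_cand          # accept the proposal
        if i >= burn:
            samples.append(x)              # keep post-burn-in draws
    return samples

# Toy posterior: N(3, 1), i.e. flat prior times Gaussian likelihood.
log_post = lambda x: -0.5 * (x - 3.0) ** 2
draws = random_walk_metropolis(log_post, x0=0.0, step=1.0, n_samples=20000)
```

Credible intervals then come directly from quantiles of `draws`; DRAM extends this basic sampler by retrying rejected proposals and adapting the proposal scale as the chain runs.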
36

Comparing Performance of Gene Set Test Methods Using Biologically Relevant Simulated Data

Lambert, Richard M. 01 December 2018 (has links)
Today we know that there are many genetically driven diseases and health conditions. These problems often manifest only when a set of genes is either active or inactive. Recent technology allows us to measure the activity level of genes in cells, which we call gene expression. It is of great interest to society to be able to statistically compare the gene expression of a large number of genes between two or more groups. For example, we may want to compare the gene expression of a group of cancer patients with that of a group of non-cancer patients to better understand the genetic causes of that particular cancer. Understanding these genetic causes could potentially lead to improved treatment options. Initially, gene expression was tested for statistical difference on a per-gene level. In more recent years, it has been determined that grouping genes by biological process into gene sets and comparing groups at the gene set level probably makes more sense biologically. A number of gene set test methods have since been developed. It is critically important that we know whether these gene set test methods are accurate. In this research, we compare the accuracy of a group of popular gene set test methods across a range of biologically realistic scenarios. In order to measure accuracy, we need to know whether each gene set is differentially expressed or not. Since this is not possible in real gene expression data, we use simulated data. We develop a simulation framework that generates gene expression data representative of actual gene expression data and use it to test each gene set method over a range of biologically relevant scenarios. We then compare the power and false discovery rate of each method across these scenarios.
37

Exact Analysis of Variance with Unequal Variances

Yanagi, Noriaki 01 May 1980 (has links)
The purpose of this paper was to present the exact analysis of variance with unequal variances. Bishop presented a new procedure for the r-way layout ANOVA. In this paper, one- and two-way layout ANOVA were explained, and Bishop's method and the standard method were compared using a Monte Carlo method.
38

Parameter Estimation in Nonstationary M/M/S Queueing Models

Vajanaphanich, Pensri 01 May 1982 (has links)
If either the arrival rate or the service rate in an M/M/S queue exhibits variability over time, then no steady state solution is available for examining the system behavior. The arrival and service rates can, however, be represented through Fourier series approximations, which permits numerical approximation of the system characteristics over time. An example of an M/M/S representation of the operations of emergency treatment at Logan Regional Hospital is presented. It requires numerical integration of the differential equation for L(t), the expected number of customers in the system at time t.
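The numerical integration the abstract describes can be sketched by Euler-stepping the Kolmogorov forward equations for a truncated M(t)/M/s system. The rate functions below are placeholders, not the hospital data, and the crude Euler scheme stands in for whatever integrator the thesis used.

```python
import math

def mms_transient(lam, mu, s, cap, t_end, dt=0.001):
    """Euler integration of the Kolmogorov forward equations for an
    M(t)/M/s queue truncated at `cap` customers; returns L(t_end),
    the expected number in the system at time t_end."""
    p = [0.0] * (cap + 1)
    p[0] = 1.0                                   # start with an empty system
    for step in range(int(t_end / dt)):
        a = lam(step * dt)                       # arrival rate at current time
        dp = [0.0] * (cap + 1)
        for n in range(cap + 1):
            rate_out = (a if n < cap else 0.0) + mu * min(n, s)
            dp[n] -= rate_out * p[n]
            if n > 0:
                dp[n] += a * p[n - 1]            # arrival moves n-1 -> n
            if n < cap:
                dp[n] += mu * min(n + 1, s) * p[n + 1]  # departure n+1 -> n
        p = [p[n] + dt * dp[n] for n in range(cap + 1)]
    return sum(n * pn for n, pn in enumerate(p))

# The arrival rate can be any callable, e.g. a truncated Fourier series
# fitted to observed hourly arrivals (coefficients here are hypothetical):
# lam = lambda t: 2.0 + 0.5 * math.cos(2 * math.pi * t / 24)
```

As a sanity check, holding both rates constant should reproduce the familiar steady-state mean; for an M/M/1 system with traffic intensity 0.5, L converges to 1.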
39

The Effectiveness of Categorical Variables in Discriminant Function Analysis

Waite, Preston Jay 01 May 1971 (has links)
A preliminary study of the feasibility of using categorical variables in discriminant function analysis was performed. Data including both continuous and categorical variables were used and predictive results examined. The discriminant function techniques were found to be robust enough to include the use of categorical variables. Some problems were encountered with using the trace criterion for selecting the most discriminating variables when these variables are categorical. No monotonic relationship was found to exist between the trace and the number of correct predictions. This study did show that the use of categorical variables does have much potential as a statistical tool in classification procedures. (50 pages)
40

Statistical Analysis for Tolerances of Noxious Weed Seeds

Dodge, Yadolah 01 May 1971 (has links)
An analysis of the previous method for testing tolerances of noxious weed seeds was performed. Problems with the current techniques were discussed, and solutions to these problems were given. A new testing technique based on the sequential probability ratio test was developed, and the results examined. The sequential test was found to be useful enough to be included in determining tolerances for noxious weed seeds. This study showed that sequential tests have excellent potential and flexibility as a statistical tool for setting tolerances of noxious weed seeds. (75 pages)
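Wald's sequential probability ratio test, on which this kind of sequential tolerance testing is based, can be sketched for a seed-contamination proportion as follows; the tolerance levels p0 and p1 below are hypothetical, not the thesis's values.

```python
import math

def sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for a Bernoulli
    proportion, H0: p = p0 vs H1: p = p1 (with p1 > p0).
    Returns (decision, number of observations used)."""
    a = math.log(beta / (1 - alpha))      # lower stopping boundary
    b = math.log((1 - beta) / alpha)      # upper stopping boundary
    llr = 0.0                             # running log-likelihood ratio
    for i, x in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr <= a:
            return "accept H0", i
        if llr >= b:
            return "accept H1", i
    return "continue", len(observations)
```

The appeal for tolerance testing is that sampling stops as soon as the evidence is decisive: a heavily contaminated lot is rejected after only a few noxious seeds are found, rather than after a fixed-size inspection.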
