  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

The use of small scale fire test data for the hazard assessment of bulk materials

Foley, Marianne January 1995 (has links)
An experimental study of fire testing of solid materials has been carried out to investigate whether or not these tests yield useful data for the burning of materials stored in bulk, for example in warehouses. Tests were performed using the Cone Calorimeter, the HSE third scale room/corridor rig, BS 5852 part 2, and some nonstandard tests. The results have been compared and the problems with fire testing have been discussed with reference to the current literature and trends in fire testing. The additional complications of unusual material behaviour under exposure to heating have also been investigated. In the third scale room/corridor test, where vertical, parallel samples are used, the separation distance between the samples was found to play a significant part in whether ignition of fire-retarded samples could be achieved. A literature survey revealed a dearth of information on this subject. As this type of parallel configuration is found in warehouse storage as well as vertical ducts and cavities, an investigation was conducted into flames between vertical parallel walls. Measurements were made of total and radiative heat fluxes at the walls, flame and gas temperatures, and flame heights under a variety of conditions. It was found that the configuration of the system was very important, with the separation distance and fluid dynamics both having a major influence. Burner position, geometry and heat release rate were also varied and their influence assessed. Statistical methods were employed to correlate the heat flux data and temperatures with the other variables, with excellent correlation coefficients for the equations developed. These have been compared with previous expressions developed for flames against vertical walls. Results from CFD work on two of the parallel wall cases of special interest were analysed and discussed with reference to the experimental results.
The findings have implications for the fire testing of materials, and for the hazard assessment of materials stored in high rack storage. An understanding of potential exposure conditions in a real fire scenario is essential for the appropriate use of fire tests.
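The abstract describes correlating wall heat flux with variables such as separation distance and burner heat release rate. A least-squares fit of that general kind might look like the following sketch; the power-law form, variable names, ranges and data are all hypothetical illustrations, not taken from the thesis.

```python
# Hypothetical correlation of peak wall heat flux with burner heat
# release rate and wall separation, fitted by ordinary least squares
# on the log scale (all numbers are made up for illustration).
import numpy as np

rng = np.random.default_rng(0)
sep = rng.uniform(0.1, 0.5, size=40)      # wall separation [m]
hrr = rng.uniform(20.0, 100.0, size=40)   # burner heat release rate [kW]
# synthetic "measured" peak heat flux [kW/m^2] with multiplicative noise
flux = 30.0 * hrr**0.6 * sep**-0.4 * rng.lognormal(0.0, 0.05, size=40)

# Fit log(flux) = b0 + b1*log(hrr) + b2*log(sep) by least squares.
X = np.column_stack([np.ones_like(sep), np.log(hrr), np.log(sep)])
coef, *_ = np.linalg.lstsq(X, np.log(flux), rcond=None)

pred = X @ coef
resid = np.log(flux) - pred
r2 = 1 - resid @ resid / np.sum((np.log(flux) - np.log(flux).mean()) ** 2)
print("fitted exponents:", coef[1:], "R^2:", r2)
```

Fitting on the log scale turns the assumed power law into a linear model, which is a common way such engineering correlations are developed.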
2

A comparison of the power of the Wilcoxon test to that of the t-test under Lehmann's alternatives

Hwang, Chern-Hwang January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
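A power comparison of the kind this title describes can be run by simulation. Under a Lehmann alternative the second population has distribution function G = F^k, and if U is Uniform(0,1) then F⁻¹(U^(1/k)) has distribution F^k. The sketch below compares the Wilcoxon rank-sum test with the two-sample t-test; the baseline F = N(0,1), sample sizes, k and the 5% level are illustrative choices, not taken from the thesis.

```python
# Monte Carlo power comparison of the Wilcoxon rank-sum test and the
# two-sample t-test under a Lehmann alternative G = F^k (a sketch).
import numpy as np
from scipy import stats

def lehmann_sample(n, k, rng):
    """Draw n observations from G = F^k with F = standard normal:
    if U ~ Uniform(0,1), then F^{-1}(U^{1/k}) has CDF F^k."""
    u = rng.uniform(size=n)
    return stats.norm.ppf(u ** (1.0 / k))

def power(test, n=30, k=2.0, reps=2000, alpha=0.05, seed=0):
    """Fraction of replications in which the test rejects at level alpha."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        x = rng.standard_normal(n)       # sample from F
        y = lehmann_sample(n, k, rng)    # sample from F^k
        rejections += (test(x, y) < alpha)
    return rejections / reps

wilcoxon = lambda x, y: stats.mannwhitneyu(x, y, alternative="two-sided").pvalue
t_test   = lambda x, y: stats.ttest_ind(x, y).pvalue

print("Wilcoxon power:", power(wilcoxon))
print("t-test power:  ", power(t_test))
```

The same harness extends to other values of k or other baselines F by changing the arguments.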
3

On Some Test Statistics for Testing the Population Skewness and Kurtosis: An Empirical Study

Guo, Yawen 26 August 2016 (has links)
The purpose of this thesis is to propose some test statistics for testing the skewness and kurtosis parameters of a distribution, not limited to a normal distribution. Since a theoretical comparison is not possible, a simulation study has been conducted to compare the performance of the test statistics. We have compared both parametric methods (the classical method with a normality assumption) and non-parametric methods (bootstrap in the Bias Corrected Standard Method, Efron’s Percentile Method, Hall’s Percentile Method and the Bias Corrected Percentile Method). Our simulation results for testing the skewness parameter indicate that the power of the tests differs significantly across sample sizes, the alternative hypotheses considered, and the method chosen. For testing the kurtosis parameter, the simulation results suggest that the classical method performs well when the data come from normal and beta distributions, while bootstrap methods are useful for the uniform distribution, especially when the sample size is large.
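Two of the approaches the abstract compares can be sketched for testing H0: skewness = 0 — the classical z-test, which relies on the large-sample normal approximation with standard error sqrt(6/n), and Efron's percentile bootstrap, which rejects when zero falls outside the percentile interval. The sample size, replication count and 5% level below are illustrative, not from the thesis.

```python
# Classical skewness test vs. Efron's percentile bootstrap (a sketch).
import numpy as np
from scipy import stats

def classical_skew_test(x):
    """Two-sided p-value for z = g1 / sqrt(6/n) under normality."""
    n = len(x)
    g1 = stats.skew(x)
    z = g1 / np.sqrt(6.0 / n)
    return 2.0 * stats.norm.sf(abs(z))

def efron_percentile_test(x, B=2000, alpha=0.05, seed=0):
    """Reject H0 if 0 lies outside the percentile bootstrap interval
    for the sample skewness."""
    rng = np.random.default_rng(seed)
    boot = np.array([stats.skew(rng.choice(x, size=len(x), replace=True))
                     for _ in range(B)])
    lo, hi = np.quantile(boot, [alpha / 2, 1 - alpha / 2])
    return not (lo <= 0.0 <= hi)

rng = np.random.default_rng(1)
x = rng.exponential(size=100)            # strongly right-skewed data
print("classical p-value:", classical_skew_test(x))
print("bootstrap rejects H0:", efron_percentile_test(x))
```

Running both on many simulated samples, as the thesis does, would give empirical power curves for each method.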
4

Thresholding FMRI images

Pavlicova, Martina January 2004 (has links)
No description available.
5

Search for Dijet Resonances in sqrt(s)=7 TeV Proton-Proton Collisions with the ATLAS Detector at the LHC

Cheung, Sing Leung 05 January 2012 (has links)
A search for new heavy resonances in two-jet final states is described in this thesis. The data were collected by the ATLAS detector in proton-proton collisions at sqrt(s) = 7 TeV and correspond to a time-integrated luminosity of 6.1 pb−1. The background-only hypothesis was tested on the observed data using the BumpHunter test statistic. Consistency was found between the observed data and the background-only prediction. No resonant features were observed. A Bayesian approach using binned maximum likelihood was used to set upper limits on the product of cross section and detector acceptance for excited-quark (q*) production as a function of q* mass. At 95% credibility level (CL), the q* mass in the interval of 0.50 TeV < mq* < 1.62 TeV is excluded, extending the reach of previous experiments.
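The BumpHunter statistic mentioned in the abstract can be illustrated in miniature: slide windows over a binned spectrum, compute the Poisson p-value of each window's excess over the background prediction, keep the most significant one, and calibrate it with background-only pseudo-experiments. The sketch below follows that general recipe with a toy spectrum; it is not the ATLAS implementation, and the bin contents and window widths are invented.

```python
# Minimal BumpHunter-style scan on a toy binned spectrum (a sketch).
import numpy as np
from scipy import stats

def bump_hunter_stat(data, background, min_width=1, max_width=3):
    """-log of the smallest Poisson p-value P(N >= observed | background)
    over all contiguous windows of the spectrum."""
    best_p = 1.0
    n = len(data)
    for width in range(min_width, max_width + 1):
        for start in range(n - width + 1):
            d = data[start:start + width].sum()
            b = background[start:start + width].sum()
            p = stats.poisson.sf(d - 1, b)   # P(N >= d) for mean b
            best_p = min(best_p, p)
    return -np.log(best_p)

background = np.array([100.0, 80.0, 60.0, 40.0, 20.0, 10.0])
data       = np.array([105,   78,   90,   41,   19,   11])  # excess in bin 2
t_obs = bump_hunter_stat(data, background)

# Global significance from background-only pseudo-experiments, which
# accounts for the look-elsewhere effect of scanning many windows.
rng = np.random.default_rng(0)
toys = [bump_hunter_stat(rng.poisson(background), background)
        for _ in range(2000)]
p_global = np.mean([t >= t_obs for t in toys])
print("test statistic:", t_obs, "global p-value:", p_global)
```

The pseudo-experiment step is what distinguishes a bump hunt from quoting the best local p-value directly.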
7

Corrected LM goodness-of-fit tests with application to stock returns

Percy, Edward Richard, January 2005 (has links)
Thesis (Ph. D.)--Ohio State University, 2005. / Title from first page of PDF file. Includes bibliographical references (p. 263-266).
8

Healthcare providers' experience of chronic grief in a pediatric subacute facility

Sacks, William Andrew 01 January 2001 (has links)
The purpose of this study was: (1) to evaluate the level of grief experienced by healthcare providers in a pediatric subacute facility, (2) to compare the levels of grief between different groups of healthcare providers (Certified Nurses' Aides, Licensed Nurses, and Respiratory Care Practitioners), and (3) to describe the personality/demographic factors that influence a healthcare provider's ability to cope effectively with compound grief.
9

The Comparative Effects of Varying Cell Sizes on McNemar's Test with the χ² Test of Independence and t Test for Related Samples

Black, Kenneth U. 08 1900 (has links)
This study compared the results for McNemar's test, the t test for related measures, and the chi-square test of independence as cell sizes varied in a two-by-two frequency table. In this study, the probability results for McNemar's test, the t test for related measures, and the chi-square test of independence were compared for 13,310 different combinations of cell sizes in a two-by-two design. Several conclusions were reached. With very few exceptions, the t test for related measures and McNemar's test yielded probability results within .002 of each other. The chi-square test seemed to equal the other two tests consistently only when low probabilities less than or equal to .001 were attained. It is recommended that the researcher consider using the t test for related measures as a viable option for McNemar's test except when the researcher is certain he/she is only interested in 'changes'. The chi-square test of independence not only tests a different hypothesis than McNemar's test, but it often yields greatly differing results from McNemar's test.
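The three tests the study compares can be run on a single illustrative 2x2 paired table (the cell counts below are made up; a = both yes, b and c = the discordant cells, d = both no). The close agreement between McNemar's test and the paired t test, and the divergence of the independence chi-square, match the pattern the abstract reports.

```python
# McNemar's test, the t test for related measures, and the chi-square
# test of independence on one hypothetical 2x2 paired table.
import numpy as np
from scipy import stats

a, b, c, d = 20, 15, 5, 10          # yes/yes, yes/no, no/yes, no/no
n = a + b + c + d

# McNemar's test (without continuity correction): only the discordant
# cells b and c carry information about change.
chi2_mcnemar = (b - c) ** 2 / (b + c)
p_mcnemar = stats.chi2.sf(chi2_mcnemar, df=1)

# t test for related measures on the per-pair differences (+1, 0, -1).
diffs = np.concatenate([np.ones(b), -np.ones(c), np.zeros(a + d)])
p_ttest = stats.ttest_1samp(diffs, 0.0).pvalue

# The chi-square test of independence treats the table as unpaired,
# so it tests a different hypothesis.
p_chi2 = stats.chi2_contingency([[a, b], [c, d]])[1]

print(f"McNemar p={p_mcnemar:.4f}  paired t p={p_ttest:.4f}  chi2 p={p_chi2:.4f}")
```

For this table the McNemar and paired-t p-values differ by well under .002, while the independence test gives a much larger p-value.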
10

Bayesian Model Selection for High-dimensional High-throughput Data

Joshi, Adarsh 2010 May 1900 (has links)
Bayesian methods are often criticized on the grounds of subjectivity. Furthermore, misspecified priors can have a deleterious effect on Bayesian inference. Noting that model selection is effectively a test of many hypotheses, Dr. Valen E. Johnson sought to eliminate the need for prior specification by computing Bayes' factors from frequentist test statistics. In pioneering work published in 2005, Dr. Johnson proposed using so-called local priors for computing Bayes' factors from test statistics. Dr. Johnson and Dr. Jianhua Hu used Bayes' factors for model selection in a linear model setting. In an independent work, Dr. Johnson and another colleague, David Rossell, investigated two families of non-local priors for testing the regression parameter in a linear model setting. These non-local priors enable greater separation between the null and alternative hypotheses. In this dissertation, I extend model selection based on Bayes' factors and use non-local priors to define Bayes' factors based on test statistics. With these priors, I have been able to reduce the problem of prior specification to setting just one scaling parameter. That scaling parameter can be easily set, for example, on the basis of frequentist operating characteristics of the corresponding Bayes' factors. Furthermore, the loss of information from basing a Bayes' factor on a test statistic is minimal. Along with Dr. Johnson and Dr. Hu, I used the Bayes' factors based on the likelihood ratio statistic to develop a method for clustering gene expression data. This method has performed well in both simulated examples and real datasets. An outline of that work is also included in this dissertation. Further, I extend the clustering model to a subclass of the decomposable graphical model class, which is more appropriate for genotype data sets, such as single-nucleotide polymorphism (SNP) data.
Efficient FORTRAN programming has enabled me to apply the methodology to hundreds of nodes. For problems that produce computationally harder probability landscapes, I propose a modification of the Markov chain Monte Carlo algorithm to extract information regarding the important network structures in the data. This modified algorithm performs well in inferring complex network structures. I use this method to develop a prediction model for disease based on SNP data. My method performs well in cross-validation studies.
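The dissertation builds Bayes' factors from frequentist test statistics under non-local priors. As a far simpler stand-in for that general idea, the sketch below uses the classical BIC approximation, which turns a likelihood-ratio statistic LR = 2 log(L1/L0) into an approximate Bayes factor exp((LR − Δk·log n)/2); this is not Johnson's construction, and the regression data are simulated for illustration.

```python
# BIC approximation to a Bayes factor computed from a likelihood-ratio
# statistic, for two nested Gaussian linear models (a sketch).
import numpy as np

def bf_from_lr(lr_stat, delta_k, n):
    """BIC approximation to the Bayes factor of H1 over H0, where
    delta_k is the difference in parameter counts."""
    return np.exp(0.5 * (lr_stat - delta_k * np.log(n)))

def gaussian_loglik(y, X):
    """Profile log-likelihood of an OLS fit with unknown error variance."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    sigma2 = resid @ resid / n
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

rng = np.random.default_rng(0)
n = 200
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)      # the alternative is true here

X0 = np.ones((n, 1))                      # null model: intercept only
X1 = np.column_stack([np.ones(n), x])     # alternative: intercept + slope

lr = 2 * (gaussian_loglik(y, X1) - gaussian_loglik(y, X0))
bf = bf_from_lr(lr, delta_k=1, n=n)
print("LR statistic:", lr, "approx BF(H1:H0):", bf)
```

The appeal of this route, as in the dissertation, is that only the test statistic and the model dimensions are needed, with no subjective prior elicitation.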
