901 |
Application of fracture mechanics in analyzing delamination of cyclically loaded paperboard core. Ilomäki, M. (Marko), 27 August 2004 (has links)
Abstract
The primary objective of this work is to study and model the
fracture process and durability of paperboard cores under cyclic loading.
The results are used to create an analytic model for estimating the
lifetime of cores in the printing industry. Lifetime here means the maximum
number of winding-unwinding cycles before the core delaminates. The
study also serves as an example of the use of board as a structural
engineering material.
Board is a complicated material: fibrous, porous, hygroscopic,
time-dependent and statistically variable. Core board grades are
typically made of recycled fibers. The material model in this work is
linear-elastic, homogeneous and orthotropic.
The material characteristics, namely the elastic and strength
properties, are studied first. The material is then examined from the
viewpoints of fracture and fatigue mechanics. Some of the analysis and
test methods were originally developed for fiber composites but are
successfully applied here to laminated board specimens as well. An
interesting finding is that the Scott Bond value correlates well with the
sum of the mode I and mode II critical strain energy release rates. It
was also possible to apply Paris' law and Miner's cumulative damage
theory in the studied example situations.
The creation of the lifetime model starts with FEM analysis of
cracked and uncracked cores in a typical loading situation, using the
linear-elastic material model. The calculated stresses are used in an
analytic J-integral model. The agreement between the analytic and
numerical J-integral estimates is good.
The analytic lifetime model combines the analytic J-integral
model, Miner's cumulative damage theory and analytically formulated
Wöhler curves constructed by applying Paris' law. Wöhler curves were
also constructed from core tests to validate the theoretical results,
and the testing conditions themselves were validated by FEM analysis.
The cores heat up when tested or used with non-expanding chucks,
so a temperature correction was added to the lifetime model. Depending
on the case studied, either a single-crack or a multi-crack model was
used. The calculated and experimentally determined durability prediction
curves show good correspondence. Finally, the results are reduced to
correspond to a chosen confidence level.
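The combination of Paris' law and Miner's cumulative damage theory described above can be illustrated with a minimal sketch. All constants below (C, m, crack sizes, stress ranges) are illustrative placeholders, not values from the thesis, and the generic stress-intensity form of Paris' law is used rather than the thesis's strain-energy-release-rate (J-integral) formulation.

```python
import math

def paris_life(delta_sigma, C=1e-11, m=3.0, a0=0.001, ac=0.010, Y=1.0):
    """Cycles to grow a crack from a0 to ac under Paris' law
    da/dN = C * (Y * delta_sigma * sqrt(pi * a))**m
    using the closed-form integral (valid for m != 2)."""
    fac = C * (Y * delta_sigma * math.sqrt(math.pi)) ** m
    expo = 1.0 - m / 2.0
    return (ac ** expo - a0 ** expo) / (fac * expo)

def miner_damage(load_blocks):
    """Miner's rule: accumulate n_i / N_i over load blocks;
    failure is predicted when the sum reaches 1.
    Each block is (applied_cycles, stress_range)."""
    return sum(n / paris_life(ds) for n, ds in load_blocks)
```

In this spirit, a winding-unwinding history with several stress levels is predicted to delaminate the core when the accumulated damage fraction reaches one.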
|
902 |
Statistical considerations in the design and analysis of cross-over trials. Morrey, Gilbert Heneage, January 1991 (has links)
No description available.
|
903 |
Statistical histogram characterization and modeling: theory and applications. Choy, Siu Kai, 01 January 2008 (has links)
No description available.
|
904 |
Complete spatial randomness tests, intensity-dependent marking and neighbourhood competition of spatial point processes with applications to ecology. Ho, Lai Ping, 01 January 2006 (has links)
No description available.
|
905 |
An application of factor analysis on a 24-item scale on the attitudes towards AIDS precautions using Pearson, Spearman and polychoric correlation matrices. Abdalmajid, Mohammed Babekir Elmalik, January 2006 (has links)
Magister Scientiae - MSc / The 24-item scale has been used extensively to assess attitudes towards AIDS precautions. This study investigated the usefulness and validity of the instrument in a South African setting, fourteen years after its development. If a new factor structure could be established statistically, HIV/AIDS prevention strategies could be made more effective in campaigns to change attitudes and sexual behaviour. / South Africa
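The comparison of factor analysis across correlation matrices can be sketched minimally. The code below extracts factors by principal components from a Pearson or Spearman correlation matrix with the Kaiser criterion; the polychoric case is omitted, since it requires a maximum-likelihood estimation step not reproduced here, and none of this reflects the thesis's actual software or 24-item data.

```python
import numpy as np

def factor_loadings(X, method="pearson"):
    """Principal-component factor extraction from a correlation matrix.
    method='spearman' rank-transforms the columns first, so the Pearson
    correlation of the ranks equals the Spearman correlation (ignoring ties)."""
    if method == "spearman":
        X = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    R = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]              # sort eigenvalues descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    k = int(np.sum(eigvals > 1.0))                 # Kaiser criterion: retain eigenvalues > 1
    loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
    return k, loadings
```

Running this on the same item data with both methods shows directly whether the retained structure depends on the choice of correlation matrix.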
|
906 |
Univariate parametric and nonparametric statistical quality control techniques with estimated process parameters. Human, Schalk William, 17 October 2009 (has links)
Chapter 1 gives a brief introduction to statistical quality control (SQC) and provides background for the research conducted in this thesis.

We begin Chapter 2 with the design of Shewhart-type Phase I S², S and R control charts for the situation where the mean and the variance are both unknown and are estimated from m independent rational subgroups, each of size n, drawn from a normally distributed process. The derivations recognize that in Phase I (with unknown parameters) the signaling events are dependent and that more than one comparison is made against the same estimated limits simultaneously; this leads to working with the joint distribution of a set of dependent random variables. Using intensive computer simulations, tables of charting constants are provided for each chart for a given false alarm probability. Second, an overview of the literature on Phase I parametric control charts for univariate variables data is given, assuming that the form of the underlying continuous distribution is known. The overview presents the current state of the art and the challenges that remain. It is pointed out that, because the Phase I signaling events are dependent and multiple signaling events must be dealt with simultaneously (in making an in-control or not-in-control decision), the joint distribution of the charting statistics needs to be used, and the recommendation is to control the probability of at least one false alarm when setting up the charts.

In Chapter 3 we derive and evaluate expressions for the run-length distributions of the Phase II Shewhart-type p-chart and c-chart when the parameters are estimated. We then examine the effect of estimating p and c on the performance of these charts via their run-length distributions and associated characteristics such as the average run-length, the false alarm rate and the probability of a “no-signal”. An exact approach based on the binomial and the Poisson distributions is used to derive expressions for the Phase II run-length distributions and the related Phase II characteristics using expectation by conditioning (see e.g. Chakraborti (2000)). We first obtain the characteristics of the run-length distributions conditioned on point estimates from Phase I and then find the unconditional characteristics by averaging over the distributions of the point estimators. Both the in-control and the out-of-control properties of the charts are examined. The results are used to discuss the appropriateness of the widely followed empirical rules for choosing the size of the Phase I sample used to estimate the unknown parameters, including the number of reference samples m and the sample size n.

Chapter 4 focuses on distribution-free control charts and considers a new class of nonparametric charts with runs-type signaling rules (i.e. runs of the charting statistics above and below the control limits), for both the scenario where the percentile of interest of the distribution is known and the scenario where it is unknown. In the former situation (Case K) the charts are based on the sign test statistic and enhance the sign chart proposed by Amin et al. (1995); in the latter (Case U) the charts are based on the two-sample median test statistic and improve on the precedence charts of Chakraborti et al. (2004). A Markov chain approach (see e.g. Fu and Lou (2003)) is used to derive the run-length distributions, the average run-lengths, the standard deviation of the run-lengths, and so on, for the runs-rule-enhanced charts. In some cases we also draw on results for the geometric distribution of order k (see e.g. Chapter 2 of Balakrishnan and Koutras (2002)) to obtain closed-form, explicit expressions for the run-length distributions and/or their associated performance characteristics. Tables are provided for implementing the charts, and examples illustrate their application and usefulness. The in-control and out-of-control performance of the charts is studied and compared to existing nonparametric charts using criteria such as the average run-length, the standard deviation of the run-length, the false alarm rate and some percentiles of the run-length, including the median. It is shown that the proposed “runs rules enhanced” sign charts offer more practically desirable in-control average run-lengths and false alarm rates and perform better for some distributions.

Chapter 5 wraps up the thesis with a summary of the research carried out and offers concluding remarks on open questions and future research opportunities. / Thesis (PhD)--University of Pretoria, 2009. / Mathematics and Applied Mathematics / unrestricted
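The Markov chain approach mentioned for the runs-rule charts can be sketched for the simplest runs-type rule: a signal when k consecutive charting statistics fall beyond a control limit. The rule and the per-point exceedance probability p below are illustrative; the thesis's sign and precedence charts use more elaborate signaling rules and statistics.

```python
import numpy as np

def arl_consecutive_rule(p, k):
    """Average run-length when the chart signals on k consecutive points
    beyond the limit; p is the per-point exceedance probability.
    Transient states track the current run of consecutive exceedances (0..k-1)."""
    Q = np.zeros((k, k))
    for i in range(k):
        Q[i, 0] = 1 - p          # a point inside the limits resets the run
        if i + 1 < k:
            Q[i, i + 1] = p      # a point outside the limits extends the run
    N = np.linalg.inv(np.eye(k) - Q)   # fundamental matrix of the absorbing chain
    return N[0].sum()                  # expected steps to a signal from an empty run
```

For k = 1 this reduces to the familiar ARL = 1/p; for k = 2 it reproduces the known waiting time (1 + p)/p² for a run of two successes.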
|
907 |
Model selection for cointegrated relationships in small samples. He, Wei, January 2008 (has links)
Vector autoregression models have become widely used research tools in the analysis of macroeconomic time series, and cointegration techniques are an essential part of empirical macroeconomic research: they allow inference about causal long-run relationships between nonstationary variables. In this study, six information criteria were reviewed and compared, with the focus on determining which criterion best detects the correct lag structure of a two-variable cointegrated process.
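Lag-order selection by information criteria can be sketched for a bivariate VAR estimated by OLS. The sketch below uses the standard AIC and BIC forms on a common estimation sample; these are textbook definitions, not necessarily two of the six criteria compared in the thesis, and the small-sample corrections studied there are omitted.

```python
import numpy as np

def var_ic(y, max_lag):
    """Select the VAR lag order by AIC and BIC.
    y is a (T, K) array; all candidate lags are fit on the same
    T - max_lag observations so the criteria are comparable."""
    T_full, K = y.shape
    T = T_full - max_lag
    results = {}
    for p in range(1, max_lag + 1):
        Y = y[max_lag:]
        X = np.hstack([np.ones((T, 1))] +
                      [y[max_lag - j:T_full - j] for j in range(1, p + 1)])
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS equation by equation
        E = Y - X @ B
        Sigma = (E.T @ E) / T                       # residual covariance
        n_par = K * (K * p + 1)                     # coefficients incl. intercepts
        ll_term = np.log(np.linalg.det(Sigma))
        results[p] = (ll_term + 2 * n_par / T,           # AIC
                      ll_term + n_par * np.log(T) / T)   # BIC
    aic_lag = min(results, key=lambda p: results[p][0])
    bic_lag = min(results, key=lambda p: results[p][1])
    return aic_lag, bic_lag
```

The thesis's point is precisely that in small samples such criteria can disagree, so which one to trust for the cointegrated case is an empirical question.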
|
908 |
Tolerance intervals for variance component models using a Bayesian simulation procedure. Sarpong, Abeam Danso, January 2013 (has links)
The estimation of variance components serves as an integral part of the evaluation of variation and is of interest and required in a variety of applications (Hugo, 2012). Estimation of the among-group variance components is often desired for quantifying the variability and effectively understanding these measurements (Van Der Rijst, 2006). The methodology for determining Bayesian tolerance intervals for the one-way random effects model was originally proposed by Wolfinger (1998), using both informative and non-informative prior distributions (Hugo, 2012); Wolfinger (1998) also provided relationships with frequentist methodologies. From a Bayesian point of view, it is important to investigate and compare the effect on coverage probabilities if negative variance components are either replaced by zero or completely disregarded from the simulation process. This research presents a simulation-based approach for determining Bayesian tolerance intervals in variance component models under both of these treatments of negative variance components. The approach handles different kinds of tolerance intervals in a straightforward fashion: it uses a computer-generated sample (a Monte Carlo process) from the joint posterior distribution of the mean and variance parameters to construct samples from the other relevant posterior distributions. Only non-informative Jeffreys' prior distributions are used, with three Bayesian simulation methods. Comparative results for the different tolerance intervals obtained when negative variance components are either replaced by zero or completely disregarded are investigated and discussed.
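The two treatments of negative variance components can be sketched in a simulation of the kind the abstract describes. The sketch below computes an upper tolerance bound (95% content, 95% credibility) for a balanced one-way random-effects model, loosely in the spirit of Wolfinger (1998); the posterior draws are simplified Jeffreys-type approximations, not the thesis's three exact simulation methods.

```python
import numpy as np

def bayes_upper_tolerance(y_groups, n_draws=4000, negative="zero", seed=0):
    """Simulation-based Bayesian upper tolerance bound for a balanced
    one-way random-effects model. `negative` chooses how negative
    between-group variance draws are handled: 'zero' truncates them
    at zero, anything else discards the draw entirely."""
    rng = np.random.default_rng(seed)
    a, n = len(y_groups), len(y_groups[0])
    group_means = np.array([np.mean(g) for g in y_groups])
    ybar = group_means.mean()
    sse = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in y_groups)
    ssa = n * ((group_means - ybar) ** 2).sum()
    draws = []
    while len(draws) < n_draws:
        s2e = sse / rng.chisquare(a * (n - 1))        # error-variance posterior draw
        s2a = (ssa / rng.chisquare(a - 1) - s2e) / n  # between-group variance draw
        if s2a < 0:
            if negative == "zero":
                s2a = 0.0        # option 1: replace negative draws by zero
            else:
                continue         # option 2: disregard negative draws
        mu = rng.normal(ybar, np.sqrt((s2e + n * s2a) / (a * n)))
        draws.append(mu + 1.6449 * np.sqrt(s2a + s2e))  # 95%-content bound given the draw
    return float(np.percentile(draws, 95))              # 95% credibility upper bound
```

Comparing the bound under the two settings on the same data shows directly how the treatment of negative draws shifts the interval, which is the coverage question the thesis investigates.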
|
909 |
Randomization in a two-armed clinical trial: an overview of different randomization techniques. Batidzirai, Jesca Mercy, January 2011 (has links)
Randomization is the key element of any sensible clinical trial: it is the only way to be sure that patients have been allocated to the treatment groups without bias and that the treatment groups are closely comparable before the start of the trial. The randomization scheme used to allocate patients plays a central role in achieving this goal. This study uses SAS simulations and categorical data analysis to compare the two main classes of randomization schemes, unrestricted and restricted randomization, represented here by simple randomization and the minimization method respectively, in dental studies with small samples. Results show that minimization produces almost equally sized treatment groups, whereas simple randomization is weak at balancing prognostic factors; nevertheless, simple randomization can by chance produce balanced groups even in small samples. Statistical power is also better under minimization than under simple randomization, although larger samples may be needed to boost the power.
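The contrast between the two schemes can be made concrete with a minimal sketch of minimization in the Pocock-Simon style for a two-arm trial. This is an illustrative implementation, not the exact scheme or SAS code used in the thesis: each patient is assigned to whichever arm reduces the total marginal imbalance over the prognostic factors, with ties broken at random (which is what simple randomization would do for every patient).

```python
import random

def minimization_assign(patients, factors, seed=0):
    """Assign each patient (a dict of factor -> level) to arm 'A' or 'B'
    so as to minimize the summed marginal imbalance across factors."""
    rng = random.Random(seed)
    counts = {arm: {f: {} for f in factors} for arm in ("A", "B")}
    assignment = []
    for patient in patients:
        imbalance = {}
        for arm in ("A", "B"):
            score = 0  # total imbalance if this patient joined `arm`
            for f in factors:
                level = patient[f]
                cA = counts["A"][f].get(level, 0) + (arm == "A")
                cB = counts["B"][f].get(level, 0) + (arm == "B")
                score += abs(cA - cB)
            imbalance[arm] = score
        if imbalance["A"] < imbalance["B"]:
            arm = "A"
        elif imbalance["B"] < imbalance["A"]:
            arm = "B"
        else:
            arm = rng.choice(("A", "B"))   # tie: randomize, as in simple randomization
        for f in factors:
            level = patient[f]
            counts[arm][f][level] = counts[arm][f].get(level, 0) + 1
        assignment.append(arm)
    return assignment
```

Because the imbalance score is recomputed per patient, the scheme keeps every factor level nearly balanced between arms even in the small dental samples the study considers.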
|
910 |
Estimation of measurement uncertainty in the sampling of contaminated land. Argyraki, Ariadni, January 1997 (has links)
No description available.
|