471. Multivariate morphometric analysis of seasonal changes in overwintering arctic charr (Salvelinus alpinus L.). Idrus, Muhammad Rijal. January 1996.
This study developed a robust technique for the assessment of morphometric differences among overwintering northern fish populations. Arctic charr were sampled shortly before freeze-up and just after ice break-up at two subarctic Quebec lakes. A homogeneous sample of 397 fish was used. Regression analyses of the length-weight relationships and their derived condition indices were insufficient, due to their inherent limitations, to recognize the differences between sampling groups. A series of multivariate analyses (canonical, stepwise and discriminant analysis), based on eleven morphometric characters of the fish, provided a better assessment. The analyses distinguished between sampling groups, correctly classified 70-100% of the fish into their appropriate groupings, and indicated that body height measured at the anal opening was the most discriminatory variable. Landmark variables related to shape differences were effective in discriminating fish according to their lake of origin, whereas length and weight variables, which closely reflected size differences, were better at distinguishing seasonal changes. The study provides a simple, efficient assessment method based on phenotypic variation to explain the different survival strategies, and the associated life history traits, adopted by fish.
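
As a rough illustration of the kind of discriminant classification described above, the sketch below fits a linear discriminant model to synthetic data with the dimensions reported in the abstract (397 fish, eleven characters). The scikit-learn API, the random data, and the group labels are assumptions for illustration, not the thesis's actual variables or software.

```python
# A minimal sketch, assuming scikit-learn and synthetic data; the
# 397 x 11 dimensions follow the abstract, but the variables, group
# labels, and classifier settings are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_fish, n_chars = 397, 11
X = rng.normal(size=(n_fish, n_chars))      # 11 morphometric characters
groups = rng.integers(0, 4, size=n_fish)    # lake x season sampling groups

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, groups, cv=5)   # classification rate,
print(f"mean CV accuracy: {accuracy.mean():.2f}")  # cf. the 70-100% reported

# After fitting, coefficients with large absolute values flag the most
# discriminatory characters (body height at the anal opening, above).
lda.fit(X, groups)
print(np.abs(lda.coef_).mean(axis=0))
```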

472. Interpretation of results from simplified principal components. Uddin, Mudassir. January 1999.
Linear multivariate statistical methods are widely used for analysing data sets which consist of a large number of variables. These techniques, which include principal component analysis, factor analysis, canonical correlation analysis, redundancy analysis and discriminant analysis, all produce a set of new variables, commonly called 'factors', according to some criterion which differs between techniques. Among them, principal component analysis (PCA) is one of the most popular for reducing the dimensions of a multivariate data set. In many applications, when PCA is performed on a large number of variables, interpretation of the results is not simple: the derived eigenvectors of the sample covariance or correlation matrix are not necessarily in a simple form, with all coefficients either 'large' or 'negligible'. To aid interpretation, it is fairly common practice to rotate the retained set of components, often using orthogonal rotation. The purpose of rotation is to simplify structure, and thus to make it easier to interpret the low-dimensional space represented by the retained components. Simplification is thus a two-step process: the first step extracts the features from the data, called components, while the second applies a rotation method to simplify the structure. One of the two main purposes of this thesis is to combine these two separate stages of dimension reduction (finding the components) and simplification (rotation) into one step. This goal is achieved by combining the two objectives in a single function, leading to what we call Simplified Components (SCs). Another objective is to discover which of the many criteria suggested in factor analysis can be adopted in the proposed SC procedure. Thus, a simplified one-step procedure of SCs is proposed, using four measures of simplicity, namely the varimax, quartimax, orthomax and equamax indices.
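
The SC procedure itself folds these two stages into a single objective; as a point of reference, the sketch below shows the conventional two-stage pipeline (PCA, then an orthomax-family rotation) that SCs are designed to replace. The rotation routine and the synthetic data are assumptions, not the thesis's code; gamma selects among the simplicity indices named above.

```python
# A sketch of the conventional two-stage approach (extract, then rotate),
# assuming numpy/scikit-learn and synthetic data. gamma picks the
# orthomax-family index: 0 = quartimax, 1 = varimax, k/2 = equamax.
import numpy as np
from sklearn.decomposition import PCA

def orthomax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    p, k = loadings.shape
    R = np.eye(k)
    obj = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt                      # best orthogonal rotation so far
        if s.sum() < obj * (1 + tol):   # converged: criterion stopped rising
            break
        obj = s.sum()
    return loadings @ R

X = np.random.default_rng(1).normal(size=(200, 10))   # hypothetical data
pca = PCA(n_components=3).fit(X)
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
rotated = orthomax(loadings)  # coefficients pushed toward 'large' or 'negligible'
```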

473. Sufficiency criterion in statistical inference. Bookmyer, Lloyd D. January 1976.
In statistical inference one of the most important properties that an estimator of an unknown parameter can possess is the property of sufficiency. The use of sufficient statistics has been prescribed in the past, and their use today plays an ever increasing role in modern statistical inference. Because of the utmost importance of sufficient statistics, it is the goal of this thesis to study the various aspects of sufficiency in connection with the estimation of parameters.

This study utilizes a blend of a geometric and an analytic approach to sufficiency. This is done to show the power of each and to demonstrate how the two approaches complement each other and combine to give better insight into the concept of sufficiency.

In this thesis the author also establishes some new and important results on sufficiency and minimal sufficiency. These results, taken as a whole, constitute a unified presentation of some of the most important aspects of sufficiency and non-sufficiency.
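
For readers wanting the formal statement behind the analytic approach, the usual route to sufficiency is the Fisher-Neyman factorization theorem; the Bernoulli example below is a standard textbook illustration, not taken from the thesis itself.

```latex
% Fisher--Neyman factorization: a statistic $T(X)$ is sufficient for
% $\theta$ iff the likelihood factors as
f(x;\theta) = g\bigl(T(x);\theta\bigr)\, h(x).
% Example: for $X_1,\dots,X_n$ i.i.d.\ Bernoulli($\theta$),
f(x_1,\dots,x_n;\theta)
  = \prod_{i=1}^{n}\theta^{x_i}(1-\theta)^{1-x_i}
  = \underbrace{\theta^{T}(1-\theta)^{n-T}}_{g(T;\theta)}
    \cdot \underbrace{1}_{h(x)},
\qquad T=\sum_{i=1}^{n}x_i,
% so the sample total $T$ is sufficient for $\theta$.
```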

474. Predicting Femoral Geometry from Anatomical Features. Grondin Lazazzera, Jerome. 30 April 2014.
Knee replacement surgery is a common orthopaedic procedure that greatly benefits from a three-dimensional geometric representation of a patient's knee bone obtained from MR or CT data. The use of these imaging modalities poses the following challenges: (i) high imaging cost; (ii) long wait times; (iii) limited availability; and (iv) for CT, substantial exposure to ionizing radiation. Traditional approaches based on planar X-ray radiography are significantly less prone to these issues; however, they only provide two-dimensional information.
This work presents a proof-of-concept study for generating patient-specific femoral bone shapes from a statistical shape atlas using anatomical features acquired from calibrated X-ray radiographs. Our hypothesis was that three-dimensional geometry can be reconstructed, within 2 millimeters RMS, by identifying features on two calibrated radiographs. We illustrate the feasibility of our approach with regard to acquiring features and the viability of reconstructing patient-specific bony anatomy. A set of reliable and relevant features is identified, for which an acquisition protocol and user interface were devised to minimize inter-observer variability. Both the data and the methods used to construct the atlas are discussed, as well as the generation of shapes from features. The reconstruction accuracy was comparable to, albeit lower than, that of competing approaches that rely on two-dimensional bony contours. Thesis (Master, Computing), Queen's University, 2014.
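
A statistical shape atlas of this kind is typically a mean shape plus principal modes of variation, with reconstruction reducing to a least-squares fit of the mode weights to the observed features. The sketch below illustrates that idea under assumed array layouts; it is not the thesis's actual pipeline, which also involves calibrated radiograph geometry.

```python
# A sketch of atlas-based reconstruction, assuming a PCA shape model:
# mean shape plus modes of variation, with mode weights fit by least
# squares to the observed feature coordinates. Array layouts, the
# feature indexing, and the RMS check are illustrative assumptions.
import numpy as np

def reconstruct(mean_shape, modes, feature_idx, feature_obs):
    """mean_shape: (3m,) stacked xyz coordinates of m atlas vertices;
    modes: (3m, k) principal modes; feature_idx: coordinate indices
    observed on the radiographs; feature_obs: their measured values."""
    A = modes[feature_idx]                     # modes restricted to features
    r = feature_obs - mean_shape[feature_idx]  # offset from the mean shape
    b, *_ = np.linalg.lstsq(A, r, rcond=None)  # least-squares mode weights
    return mean_shape + modes @ b              # full reconstructed femur

def rms_error(a, b):
    """Root-mean-square vertex distance, cf. the 2 mm RMS hypothesis."""
    d = a.reshape(-1, 3) - b.reshape(-1, 3)
    return np.sqrt((d**2).sum(axis=1).mean())
```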

475. Statistical Assessment of Peer-to-Peer Botnet Features. Godkin, Teghan. 17 April 2013.
Botnets are collections of compromised machines which are controlled by a remotely located adversary. Botnets are of significant interest to cybersecurity researchers as they are a core mechanism that allows adversarial groups to gain control over large-scale computing resources. Recent botnets have become increasingly complex, relying on Peer-to-Peer (P2P) protocols for botnet command and control (C&C). In this work, a packet-level simulation of a Kademlia-based P2P botnet is used in conjunction with a statistical analysis framework to investigate how measured botnet features change over time and across an ensemble of simulations. The simulation results include non-stationary and non-ergodic behaviours, illustrating the complex nature of botnet operation and highlighting the need for rigorous statistical analysis as part of the engineering process.
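
One way to see why the ensemble matters here: ergodicity would let a single long run stand in for the whole ensemble, so comparing time averages within runs against ensemble averages across runs is a natural first check. The sketch below does this on synthetic drifting data; the real features would come from the packet-level Kademlia simulation.

```python
# A sketch of an ensemble-vs-time check, assuming synthetic data with
# deliberate drift; real features would come from the packet-level
# Kademlia simulation described above.
import numpy as np

runs, steps = 50, 1000
rng = np.random.default_rng(2)
feature = rng.normal(size=(runs, steps)).cumsum(axis=1)  # drifting feature

ensemble_mean = feature.mean(axis=0)   # across runs at each time step
time_means = feature.mean(axis=1)      # within each run over all time

# A stationary, ergodic process would give a flat ensemble mean and
# per-run time means tightly clustered around it; drift or wide spread
# signals the non-stationary, non-ergodic behaviour reported above.
print(ensemble_mean[[0, steps // 2, -1]])
print(time_means.mean(), time_means.std())
```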

476. How large should a clinical trial be? Pezeshk, Hamid. January 2000.
One of the most important questions in the planning of medical experiments to assess the performance of new drugs or treatments is how big to make the trial. The problem, in its statistical formulation, is to determine the optimal size of a trial. The most frequently used methods of determining sample size in clinical trials are based on the required p-value and the required power of the trial for a specified treatment effect. In contrast to the Bayesian decision-theoretic approach, there is no explicit balancing of the cost of a possible increase in the size of the trial against the benefit of the more accurate information which it would give. In this work we consider a fully Bayesian (decision-theoretic) approach to sample size determination in which the number of subsequent users of the therapy under investigation, and hence also the total benefit resulting from the trial, depend on the strength of the evidence provided by the trial. Our procedure differs from the usual Bayesian decision theory methodology, which assumes a single decision maker, by recognizing the existence of three decision makers, namely: the pharmaceutical company conducting the trial, which decides on its size; the regulator, whose approval is necessary for the drug to be licensed for sale; and the public at large, who determine the ultimate usage. Moreover, we model the subsequent usage by plausible assumptions about actual behaviour, rather than assuming that it represents decisions which are in some sense optimal. For this reason the procedure may be called "Behavioural Bayes" (or BeBay for short), the word Bayes referring to the optimization of the sample size. In the BeBay methodology the total expected benefit from carrying out the trial, minus the cost of the trial, is maximized. For any additional sales to occur as a result of the trial, it must provide sufficient evidence both to convince the regulator to issue the necessary licence and to convince potential users that they should use the new treatment. The necessary evidence takes the form of a high posterior probability that the new treatment achieves a clinically relevant improvement over the alternative treatment. The regulator is assumed to start from a more sceptical and less well-informed view of the likely performance of the treatment than the company carrying out the trial. The total benefit from a conclusively favourable trial is assessed on the basis of the size of the potential market, aggregated over the anticipated lifetime of the product, with appropriate discounting for future years.
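
As a toy illustration of the BeBay idea (not the thesis's actual model), the sketch below chooses the trial size n that maximizes expected total benefit minus trial cost, with benefit accruing only when the posterior probability of a clinically relevant improvement clears a licensing threshold, and usage then scaling with the strength of the evidence. Every number (priors, market size, costs, thresholds) is an invented assumption.

```python
# A toy sketch of the BeBay calculation, assuming normal outcomes with
# known SD and a conjugate normal prior; all constants are illustrative.
import numpy as np
from scipy import stats

delta_min = 0.2                # clinically relevant improvement
market, cost_per_pt = 1e6, 500.0
prior_mu, prior_sd = 0.3, 0.3  # company's prior on the treatment effect
sigma = 1.0                    # known SD of patient outcomes

def expected_net_benefit(n):
    rng = np.random.default_rng(3)                   # common random numbers
    theta = rng.normal(prior_mu, prior_sd, 20000)    # simulated true effects
    xbar = rng.normal(theta, sigma / np.sqrt(n))     # simulated trial results
    post_var = 1 / (1 / prior_sd**2 + n / sigma**2)  # conjugate update
    post_mu = post_var * (prior_mu / prior_sd**2 + n * xbar / sigma**2)
    p_relevant = 1 - stats.norm.cdf(delta_min, post_mu, np.sqrt(post_var))
    usage = (p_relevant > 0.975) * p_relevant        # licence gate, then uptake
    return (market * usage).mean() - cost_per_pt * n

best_n = max(range(50, 2001, 50), key=expected_net_benefit)
print(best_n)
```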

477. The development of a statistical computer software resource for medical research. Buchan, Iain Edward. January 2000.
Medical research is often weakened by poor statistical practice, and inappropriate use of statistical computer software is part of this problem. The statistical knowledge that medical researchers require has traditionally been gained in both dedicated and ad hoc learning time, often separate from the research processes in which the statistical methods are applied. Computer software, however, can be written to flexibly support statistical practice. The work of this thesis was to explore the possibility of, and if possible to create, a resource supporting medical researchers in statistical knowledge and calculation at the point of need. The work was carried out over eleven years and was directed towards the medical research community in general. Statistical and software engineering methods were used to produce a unified statistical computational and knowledge support resource. Mathematically and computationally robust approaches to statistical methods were continually sought in the current literature. The evaluation undertaken was formative; it included monitoring uptake of the software and feedback from its users, comparisons with other software, reviews in peer-reviewed publications, and testing of results against classical and reference data. Large-scale opportunistic feedback from users was employed in the continuous improvement of the resource. The software resulting from the work of this thesis is provided herein as supporting evidence, and results of applying it to classical reference data are shown in the written thesis. The scope and presentation of statistical methods are considered in a comparison of the software with common statistical software resources; this comparison showed that the software written for this thesis more closely matched the statistical methods commonly used in medical research, and contained more statistical knowledge support materials. Up to 31 October 2000, uptake of the software had been recorded for 5621 separate instances by individuals or institutions, and the development has been self-sustaining. Medical researchers need sufficient statistical understanding, just as statistical researchers need to sufficiently understand the nature of data. Statistical software tools may damage statistical practice if they distract attention from statistical goals and tasks onto the tools themselves. The work of this thesis provides a practical computing framework supporting statistical knowledge and calculation in medical research, and has shown that sustainable software can be engineered to improve statistical appreciation and practice in ways that are beyond the reach of traditional medical statistical education.

478. Stagewise and stepwise regression techniques in meteorological forecasting. Hess, H. Allen. January 1978.
No description available.

479. Patterns of performance: implications for the Rey Auditory Verbal Learning Test. Hardman, Marie. January 2001.
Three studies investigated patterns of performance as demonstrated by serial position effects on the Rey Auditory Verbal Learning Test (RAVLT). Patterns of performance were explored in a sample of genuine traumatic brain injured subjects who were litigating (TBI-LIT; N = 22) and compared to a sample of genuine traumatic brain injured subjects who were not in litigation (TBI-NONLIT; N = 22). Comparisons were also made to a sample of subjects who were depressed but not neurologically compromised (PSY-DEP; N = 24). Results demonstrated that when time for loss of consciousness was controlled for, no difference existed between the litigating and non-litigating groups on any serial position. With this in mind, the TBI-LIT and TBI-NONLIT groups were collapsed to form one traumatic brain injured group (TBI; N = 44). Patterns of performance were then compared between the TBI group, the PSY-DEP group and a normal control group (NC; N = 68). No differences were demonstrated between the TBI and PSY-DEP groups on any serial position; however, the NC group demonstrated significantly different primacy effects from the TBI group and significantly different recency effects from both the TBI and PSY-DEP groups (Study 1). Patterns of performance relative to serial position were also compared in groups of Alzheimer's disease (AD; N = 20) and dementia (DEM; N = 20) subjects. Results indicated that the DEM group demonstrated a greater primacy effect than the AD group, with both groups demonstrating a greater recency effect than primacy effect, though not significantly so. Patterns of performance were also explored in a group of Huntington's disease subjects (HD; N = 14), with this group demonstrating a significantly reduced primacy effect compared to the recency effect (Study 2). In the third study, patterns of performance were compared in groups of subjects having sustained frontal lobe (FL; N = 21) and posterior lobe (PL; N = 21) lesions to the brain. Subjects with PL lesions demonstrated a significantly greater primacy effect than the FL group, with both groups demonstrating a reduced recency effect. Comparisons were also made between the FL and PL groups and normal control groups (FL-NC; N = 21; PL-NC; N = 21); results indicated that the FL group demonstrated significantly reduced primacy and recency effects compared to its normal control group, while the PL group demonstrated a significantly reduced recency effect compared to normal controls. Patterns of performance were also explored in a small sample of subjects with diffuse damage to the brain (DIFF; N = 6); this group displayed a reduced recency effect compared to the primacy effect (Study 3). Overall, when examining serial position effects across all experimental groups, subjects who had sustained a traumatic injury to the brain or who were depressed all demonstrated a greater primacy effect than recency effect, recalling more words in those positions. This contrasted with the pattern which emerged with various dementing processes, where more words were recalled in the recency position than the primacy position. Results for all studies were analysed using MANOVA followed by the Scheffé procedure.
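
For concreteness, serial-position scoring of the kind used in these studies can be sketched as follows: per-subject recall of the 15-item RAVLT list is scored by list position, and primacy and recency effects are recall proportions over the first and last few positions. The position windows and the synthetic data below are assumptions.

```python
# A minimal sketch, assuming synthetic recall data; the 3-item primacy
# and recency windows are a common convention, not necessarily the one
# used in these studies.
import numpy as np

rng = np.random.default_rng(4)
recalled = rng.random((44, 15)) < 0.5       # subjects x 15 list positions
primacy = recalled[:, :3].mean(axis=1)      # recall rate, positions 1-3
recency = recalled[:, -3:].mean(axis=1)     # recall rate, positions 13-15

# The studies compared such scores across groups with MANOVA followed
# by Scheffe tests; here we just summarise the two effects.
print(f"primacy {primacy.mean():.2f}, recency {recency.mean():.2f}")
```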

480. Phase structure and phase transitions in semicrystalline isotactic polystyrene. Xu, Hui. January 2005.
Thesis (Ph.D.), Tufts University, 2005. Adviser: Peggy Cebe. Submitted to the Dept. of Physics. Includes bibliographical references. Access restricted to members of the Tufts University community. Also available via the World Wide Web.