1. Analysis of Sinusoidal and Helical Buckling of Drill String in Horizontal Wells Using Finite Element Method. Arpaci, Erdogan, 01 August 2009.
The number of horizontal wells is increasing rapidly all over the world with the growth of new technological developments. During horizontal well drilling, much more complex problems occur than in vertical well drilling, such as decreased load transfer to the bit, tubular failure, tubular fatigue, and tubular lock-up. This makes selecting the appropriate tubular and producing the right drill string design more important. As the total compression load on the horizontal section increases, the behavior of the tubular changes from straight to sinusoidal buckling, and if the total compression load continues to increase, the behavior changes to helical buckling. Determination of critical buckling loads with the finite element method (FEM) in horizontal wells is the main objective of this study. Initially, a computer program (ANSYS) that uses FEM is employed to simulate different tubular and well conditions. Four different pipe sizes, four different wellbore sizes, and three different torque values are used to model the cases. Critical buckling load values corresponding to significant variables are collected from these simulated cases. The results are classified into different buckling modes according to the applied weight-on-bit values and the main properties of the simulated model, such as modulus of elasticity, moment of inertia of the tubular cross section, weight per unit length of the tubular, and radial clearance between the wellbore and the tubular. Then, the boundary equations between the buckling modes are obtained. The equations developed in this thesis by simulating the cases for the specific tubular sizes are used to compare the critical buckling load values from the models in the literature and this work. It is observed that the results of this work fit the literature models as the tubular size increases. The influence of torque on critical buckling load values is investigated, and it is observed that torque has only a slight effect. The applicability of ANSYS to buckling problems is also demonstrated by comparing the ANSYS results with the literature models' results and an experimental study in the literature.
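As a point of reference for the kind of buckling-mode boundaries the thesis fits from its FEM runs, the sketch below evaluates the classical Dawson-Paslay (sinusoidal onset) and Chen-Lin-Cheatham (helical onset) expressions for a pipe in a horizontal hole. It is not the thesis's fitted equations, and all pipe and hole dimensions are illustrative assumptions.

```python
# Classical critical buckling loads for a pipe in a horizontal wellbore.
# A minimal sketch for comparison only: the thesis fits its own boundary
# equations from ANSYS simulations; these are the widely used Dawson-Paslay
# (sinusoidal) and Chen-Lin-Cheatham (helical) expressions.
# All numerical inputs below are illustrative assumptions, not thesis data.
import math

def sinusoidal_critical_load(E, I, w, r):
    """Dawson-Paslay critical load for onset of sinusoidal buckling (horizontal well)."""
    return 2.0 * math.sqrt(E * I * w / r)

def helical_critical_load(E, I, w, r):
    """Chen-Lin-Cheatham critical load for onset of helical buckling (horizontal well)."""
    return 2.0 * (2.0 * math.sqrt(2.0) - 1.0) * math.sqrt(E * I * w / r)

# Example (SI units): 127 mm (5 in) drill pipe in a 216 mm (8.5 in) hole, assumed values.
E = 210e9                              # Pa, modulus of elasticity of steel
od, id_ = 0.127, 0.1086                # m, outer and inner pipe diameters (assumed)
I = math.pi / 64.0 * (od**4 - id_**4)  # m^4, moment of inertia of the cross section
w = 286.0                              # N/m, buoyed weight per unit length (assumed)
r = (0.216 - od) / 2.0                 # m, radial clearance between wellbore and pipe

print(f"Sinusoidal onset: {sinusoidal_critical_load(E, I, w, r)/1e3:.1f} kN")
print(f"Helical onset:    {helical_critical_load(E, I, w, r)/1e3:.1f} kN")
```

Both expressions scale with the square root of EIw/r, which is why the abstract lists modulus of elasticity, moment of inertia, unit weight, and radial clearance as the governing properties.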
2. Robustness Against Non-Normality: Evaluating LDA and QDA in Simulated Settings Using Multivariate Non-Normal Distributions. Gånheim, Viktor and Åslund, Isak, January 2023.
Evaluating classifiers in controlled settings is essential for empirical applications, as extensive knowledge of model behaviour is needed for accurate predictions. This thesis investigates the robustness against non-normality of two prominent classifiers, LDA and QDA. Through simulation, errors in leave-one-out cross-validation are compared for data generated by different multivariate distributions, while also controlling for covariance structure, class separation, and sample size. Unexpectedly, the classifiers perform better on data generated by heavy-tailed symmetric distributions than on data generated by the normal distribution. Possible explanations are proposed, but the cause remains unknown. Further studies, investigating more settings as well as mathematical properties, are needed to verify and understand these results.
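A minimal sketch of this kind of simulation, assuming scikit-learn and NumPy and using a multivariate t distribution as the heavy-tailed symmetric alternative; the dimension, class separation, covariance structure, sample sizes, and degrees of freedom below are illustrative choices, not the thesis's settings.

```python
# Compare LDA and QDA leave-one-out error on normal vs heavy-tailed data.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
p, n_per_class, df = 4, 50, 3            # dimension, class size, t degrees of freedom (assumed)
mu0, mu1 = np.zeros(p), np.full(p, 1.0)  # class means (controls separation)
cov = np.eye(p)

def sample_class(mean, n, heavy_tailed):
    z = rng.multivariate_normal(np.zeros(p), cov, size=n)
    if heavy_tailed:                     # multivariate t via normal / chi-square mixture
        g = rng.chisquare(df, size=n) / df
        z = z / np.sqrt(g)[:, None]
    return mean + z

for heavy in (False, True):
    X = np.vstack([sample_class(mu0, n_per_class, heavy),
                   sample_class(mu1, n_per_class, heavy)])
    y = np.repeat([0, 1], n_per_class)
    for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                      ("QDA", QuadraticDiscriminantAnalysis())]:
        acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
        dist = "multivariate t" if heavy else "normal"
        print(f"{name} on {dist:>14} data: LOO error = {1 - acc:.3f}")
```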
3. Comparison of Discrimination between Logistic Model with Distance Indicator and Regularized Function for Cardiology Ultrasound in Left Ventricle. Kao, Li-wen, 08 July 2011.
Most cardiac structural abnormalities are examined by echocardiography. With increasing understanding of heart disease, it is commonly recognized that heart failure is closely related to left ventricular systolic and diastolic function. This work discusses the association between gray-scale differences in the ultrasound images and the risk of heart disease, based on the changes in the left ventricle between systole and diastole. Owing to the large dimension of the data matrix, following Chen (2011), we simplify the influence factors by factor analysis and calculate factor scores to represent the characteristics of the subjects. Two kinds of classification criteria are used in this work, namely the logistic model with a distance indicator and the discriminant function. Following Guo et al. (2001), we calculate the Mahalanobis distance from each subject to the centers of the normal and abnormal groups, and then fit a logistic model to the distances for classification; this is called the logistic model with a distance indicator. For the discriminant analysis, the regularized method of Friedman (1989) is used to estimate the covariance matrix, which is more flexible and can improve the covariance matrix estimates when the sample size is small. For the cut-point of the ROC curve, following the approach of Hanley et al. (1982), we find the most appropriate cut-point, one that performs well for both sensitivity and specificity under the same classification criterion. The regularized method and the ROC cut-point are then combined into a new classification criterion, and the results under this new criterion are presented for classifying the normal and abnormal groups.
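A minimal sketch of the "logistic model with distance indicator" step, assuming scikit-learn and SciPy: Mahalanobis distances from each subject to the normal and abnormal group centres are computed and then fed to a logistic regression. The factor-score matrix and group labels are random placeholders, and the ROC cut-point selection and regularized discriminant analysis are omitted.

```python
# Mahalanobis distances to the two group centres used as logistic-regression covariates.
import numpy as np
from scipy.spatial.distance import mahalanobis
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 3))             # e.g. factor scores per subject (placeholder)
y = rng.integers(0, 2, size=80)          # 0 = normal, 1 = abnormal (placeholder)

def distances_to_centres(X, y):
    feats = []
    for g in (0, 1):
        centre = X[y == g].mean(axis=0)
        VI = np.linalg.inv(np.cov(X[y == g], rowvar=False))  # inverse group covariance
        feats.append([mahalanobis(x, centre, VI) for x in X])
    return np.column_stack(feats)        # (distance to normal, distance to abnormal)

D = distances_to_centres(X, y)
model = LogisticRegression().fit(D, y)   # fit the logistic model to the distances
print("In-sample accuracy:", model.score(D, y))
```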
4. Classification of Carpiodes Using Fourier Descriptors: A Content Based Image Retrieval Approach. Trahan, Patrick, 06 August 2009.
Taxonomic classification has always been important to the study of any biological system. At the current rate of classification, many biological species will go unclassified and be lost forever. The current state of computer technology makes image storage and retrieval possible on a global level; as a result, computer-aided taxonomy is now possible. Content based image retrieval techniques utilize visual features of an image for classification. By utilizing image content and computer technology, the gap between taxonomic classification and species destruction is shrinking. This content based study utilizes the Fourier Descriptors of fifteen known landmark features on three Carpiodes species: C. carpio, C. velifer, and C. cyprinus. Classification analysis involves both unsupervised and supervised machine learning algorithms. Fourier Descriptors of the fifteen known landmarks provide strong classification power on image data, and feature reduction analysis indicates that feature reduction is possible, which proves useful for increasing the generalization power of classification.
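A minimal sketch of how Fourier descriptors of a closed outline can be computed with NumPy: the boundary points are treated as complex numbers, transformed with the FFT, and normalised for translation, scale, rotation, and starting point. The contour here is a synthetic placeholder rather than landmark data from the Carpiodes images.

```python
# Fourier descriptors of a closed 2-D contour via the FFT of complex coordinates.
import numpy as np

def fourier_descriptors(points, n_keep=15):
    z = points[:, 0] + 1j * points[:, 1]   # contour as a complex-valued signal
    coeffs = np.fft.fft(z)
    coeffs[0] = 0.0                        # drop DC term -> translation invariance
    mags = np.abs(coeffs)                  # magnitudes -> rotation / start-point invariance
    mags = mags / mags[1]                  # divide by first harmonic -> scale invariance
    return mags[1:n_keep + 1]              # keep the first n_keep harmonics

# Synthetic example: a perturbed ellipse standing in for a fish outline.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
contour = np.column_stack([2.0 * np.cos(t), np.sin(t) + 0.1 * np.sin(3 * t)])
print(fourier_descriptors(contour, n_keep=10))
```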
5. A Study of Several Statistical Methods for Classification with Application to Microbial Source Tracking. Zhong, Xiao, 30 April 2004.
With the advent of computers and the information age, vast amounts of data generated in many fields of science and industry require statisticians to explore further. In particular, statistical and computational problems in biology and medicine have created the new field of bioinformatics, which is attracting more and more statisticians, computer scientists, and biologists. Several procedures have been developed for tracing the source of fecal pollution in water resources based on certain characteristics of certain microorganisms; the use of this collection of techniques has been termed microbial source tracking (MST). Most current methods for MST are based on patterns of either phenotypic or genotypic variation in indicator organisms. Studies also suggest that patterns of genotypic variation may be more reliable because they are less associated with environmental factors than patterns of phenotypic variation. Among the genotypic methods for source tracking, fingerprinting via rep-PCR is the most common. Thus, identifying the specific pollution sources in contaminated waters based on rep-PCR fingerprinting techniques, viewed as a classification problem, has become an increasingly popular research topic in bioinformatics. In this project, several statistical methods for classification were studied, including linear discriminant analysis, quadratic discriminant analysis, logistic regression, k-nearest-neighbor rules, neural networks, and support vector machines. This report summarizes each of these methods and the relevant statistical theory. In addition, an application of these methods to a particular set of MST data is presented and comparisons are made.
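A minimal sketch of the comparison described above, assuming scikit-learn: the listed classifiers are evaluated by cross-validation on a placeholder feature matrix standing in for rep-PCR fingerprints and source labels.

```python
# Cross-validated comparison of the classifiers named in the abstract.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 10))           # placeholder fingerprint features
y = rng.integers(0, 4, size=120)         # placeholder pollution-source classes

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "k-nearest neighbours": KNeighborsClassifier(n_neighbors=5),
    "Neural network": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000),
    "SVM": SVC(kernel="rbf"),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:<22} mean CV accuracy = {scores.mean():.3f}")
```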
6. Chemical Analysis, Databasing, and Statistical Analysis of Smokeless Powders for Forensic Application. Dennis, Dana-Marie, 01 January 2015.
Smokeless powders are a set of energetic materials, known as low explosives, which are typically utilized for reloading ammunition. There are three types, which differ in their primary energetic materials: single base powders contain nitrocellulose as their primary energetic material, double and triple base powders contain nitroglycerin in addition to nitrocellulose, and triple base powders also contain nitroguanidine. Additional organic compounds, while not proprietary to specific manufacturers, are added to the powders in varied ratios during the manufacturing process to optimize their ballistic performance; these compounds function as stabilizers, plasticizers, flash suppressants, deterrents, and opacifiers. Of the three smokeless powder types, single and double base powders are commercially available and have been heavily utilized in the manufacture of improvised explosive devices. Forensic smokeless powder samples are currently analyzed using multiple analytical techniques: combined microscopic, macroscopic, and instrumental techniques are used to evaluate the sample, and the information obtained is used to generate a list of potential distributors. Gas chromatography-mass spectrometry (GC-MS) is arguably the most useful of the instrumental techniques, since it distinguishes single and double base powders and provides additional information about the relative ratios of all the analytes present in the sample. However, forensic smokeless powder samples are still limited to being classified as either single or double base powders, based on the absence or presence of nitroglycerin, respectively. In this work, the goal was to develop statistically valid classes, beyond the single and double base designations, based on multiple organic compounds which are commonly encountered in commercial smokeless powders. Several chemometric techniques were applied to smokeless powder GC-MS data to determine the classes and to assign test samples to these novel classes. The total ion spectrum (TIS), calculated from the GC-MS data for each sample, is obtained by summing the intensities of each mass-to-charge (m/z) ratio across the entire chromatographic profile. A TIS matrix comprising data for 726 smokeless powder samples was subjected to agglomerative hierarchical cluster (AHC) analysis, and six distinct classes were identified. Within each class, a single m/z ratio had the highest intensity for the majority of samples, though that m/z ratio was not always unique to the specific class. Based on these observations, a new classification method known as the Intense Ion Rule (IIR) was developed and used for the assignment of test samples to the AHC-designated classes. Discriminant models were also developed for this assignment using k-Nearest Neighbors (kNN) and linear and quadratic discriminant analyses (LDA and QDA, respectively). Each model was optimized using leave-one-out (LOO) and leave-group-out (LGO) cross-validation, and model performance was evaluated by calculating correct classification rates for assignment of the cross-validation (CV) samples to the AHC-designated classes. The optimized models were then utilized to assign test samples to the classes. Overall, the QDA LGO model achieved the highest correct classification rates for assignment of both the CV samples and the test samples.
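A minimal sketch of the AHC step, assuming SciPy: each row of the matrix is a total ion spectrum, Ward linkage builds the hierarchy, and the tree is cut into six classes as in the text. The data here are random placeholders for the 726-sample TIS matrix, and reporting the most intense mean ion per class echoes the observation behind the Intense Ion Rule.

```python
# Agglomerative hierarchical clustering of a TIS-like matrix, cut into six classes.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
tis = rng.random(size=(100, 200))                # placeholder TIS matrix (samples x m/z bins)
tis = tis / tis.sum(axis=1, keepdims=True)       # normalise each spectrum

Z = linkage(tis, method="ward")                  # build the hierarchy
classes = fcluster(Z, t=6, criterion="maxclust") # cut the tree into six classes
for c in np.unique(classes):
    members = np.flatnonzero(classes == c)
    top_ion = tis[members].mean(axis=0).argmax() # most intense mean "ion" per class
    print(f"class {c}: {len(members)} samples, most intense ion index {top_ion}")
```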
In forensic application, the goal of an explosives analyst is to ascertain the manufacturer of a smokeless powder sample. Knowledge about the probability of a forensic sample having been produced by a specific manufacturer could decrease the time invested by an analyst during an investigation by providing a shorter list of potential manufacturers. In this work, Bayes' Theorem and Bayesian networks were investigated as an additional tool for forensic casework. Bayesian networks were generated and used to calculate the posterior probability of a test sample belonging to each specific manufacturer. The networks were designed to include manufacturer-controlled powder characteristics such as shape, color, and dimension, as well as the relative intensities of the class-associated ions determined from the cluster analysis. Samples were predicted to belong to the manufacturer with the highest posterior probability. Overall percent correct rates were determined by calculating the percentage of correct predictions, that is, those where the known and predicted manufacturer were the same. The initial overall percent correct rate was 66%. The dimensions of the smokeless powders were then added to the network as average diameter and average length nodes, which raised the overall prediction rate to 70%.
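A minimal sketch of the Bayes' theorem update that a network of this kind performs for a single characteristic node, with made-up priors and conditional probabilities; the actual networks combine several nodes (shape, color, dimensions, class-ion intensities) learned from the powder database.

```python
# Posterior probability of each manufacturer given one observed characteristic.
import numpy as np

manufacturers = ["A", "B", "C"]          # hypothetical manufacturers
prior = np.array([0.5, 0.3, 0.2])        # P(manufacturer), assumed values
likelihood = np.array([0.7, 0.2, 0.4])   # P(observed shape | manufacturer), assumed values

posterior = prior * likelihood
posterior /= posterior.sum()             # Bayes' theorem, normalised over manufacturers
for m, p in zip(manufacturers, posterior):
    print(f"P({m} | shape) = {p:.3f}")
print("Predicted manufacturer:", manufacturers[int(posterior.argmax())])
```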