61

Predicting mutation score using source code and test suite metrics

Jalbert, Kevin 01 September 2012 (has links)
Mutation testing has traditionally been used to evaluate the effectiveness of test suites and provide confidence in the testing process. Mutation testing involves the creation of many versions of a program, each with a single syntactic fault. A test suite is evaluated against these program versions (i.e., mutants) in order to determine the percentage of mutants a test suite is able to identify (i.e., mutation score). A major drawback of mutation testing is that even a small program may yield thousands of mutants, potentially making the process cost prohibitive. To improve the performance and reduce the cost of mutation testing, we propose a machine learning approach to predict mutation score based on a combination of source code and test suite metrics. We conducted an empirical evaluation of our approach to evaluate its effectiveness using eight open source software systems. / UOIT
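As a rough illustration of the approach this abstract describes (not the thesis's actual feature set or learner), a mutation-score predictor can be framed as supervised regression over per-unit metrics. In the sketch below, the metric names, the synthetic data, and the choice of a random forest are all assumptions for demonstration:

```python
# Minimal sketch: predicting mutation score from code/test metrics.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical per-unit metrics: lines of code, cyclomatic complexity,
# number of covering tests, statement coverage.
X = np.column_stack([
    rng.integers(10, 500, n),   # LOC
    rng.integers(1, 30, n),     # cyclomatic complexity
    rng.integers(0, 50, n),     # number of covering tests
    rng.uniform(0.0, 1.0, n),   # statement coverage
])
# Placeholder target: mutation score in [0, 1].
y = rng.uniform(0.0, 1.0, n)

model = RandomForestRegressor(n_estimators=100, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```

With real metrics extracted per method or class, the same pipeline would let a team skip full mutation analysis for units whose score can be predicted reliably.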
62

Protein Secondary Structure Prediction Using Support Vector Machines, Neural Networks and Genetic Algorithms

Reyaz-Ahmed, Anjum B 03 May 2007 (has links)
Bioinformatics techniques for protein secondary structure prediction mostly depend on the information available in the amino acid sequence. Support vector machines (SVM) have shown strong generalization ability in a number of application areas, including protein structure prediction. In this study, a new sliding window scheme is introduced that uses multiple windows to form the protein data for training and testing the SVM. An orthogonal encoding scheme coupled with the BLOSUM62 matrix is used to make the prediction. First, the predictions of binary classifiers using multiple windows are compared with the single window scheme; the results show that the single window is not best in all cases. Two new classifiers are introduced for effective tertiary classification. These new classifiers use neural networks and genetic algorithms to optimize the accuracy of the tertiary classifier. The accuracy levels of the new architectures are determined and compared with other studies. The tertiary architecture is better than most available techniques.
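To make the sliding-window set-up concrete, here is a minimal sketch of orthogonal (one-hot) window encoding feeding a binary SVM (e.g., helix vs. not). The window width, the random toy data, and the labels are illustrative assumptions, not the study's configuration:

```python
# One-hot ("orthogonal") encoding of amino-acid windows for a binary SVM.
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"
IDX = {a: i for i, a in enumerate(AA)}
W = 13  # assumed window width; the central residue is the one classified

def encode_window(window):
    """One-hot encode a length-W window into a flat 20*W vector."""
    v = np.zeros((W, len(AA)))
    for i, a in enumerate(window):
        v[i, IDX[a]] = 1.0
    return v.ravel()

# Toy data: random sequences with random helix/non-helix labels.
rng = np.random.default_rng(0)
windows = ["".join(rng.choice(list(AA), W)) for _ in range(100)]
labels = rng.integers(0, 2, 100)

X = np.array([encode_window(w) for w in windows])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:5]))
```

The multiple-window variant described in the abstract would concatenate several such encoded windows per residue before training.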
63

Classification of Genotype and Age of Eyes Using RPE Cell Size and Shape

Yu, Jie 18 December 2012 (has links)
The retinal pigment epithelium (RPE) is a principal site of pathogenesis in age-related macular degeneration (AMD). AMD is a major cause of vision loss, and even blindness, in the elderly, and there is currently no effective treatment. Our aim is to describe the relationship between the morphology of RPE cells and the age and genotype of the eyes. We use principal component analysis (PCA) or the functional principal component method (FPCA), support vector machines (SVM), and random forests (RF) to analyze the morphological data of RPE cells in mouse eyes and classify their age and genotype. Our analyses show that amongst all morphometric measures of RPE cells, cell shape measurements (eccentricity and solidity) are good for classification, but the combination of cell shape and size (perimeter) provides the best classification.
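A minimal sketch of this kind of pipeline — PCA on shape measurements feeding SVM and random-forest classifiers — is shown below. The feature columns and synthetic data are placeholders, not the study's measurements:

```python
# PCA + classifier pipeline over cell morphometrics (eccentricity,
# solidity, perimeter), evaluated by cross-validation.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# One row per eye; columns are assumed aggregate morphometrics.
X = np.column_stack([
    rng.uniform(0, 1, 120),      # eccentricity
    rng.uniform(0.5, 1, 120),    # solidity
    rng.uniform(30, 120, 120),   # perimeter
])
y = rng.integers(0, 2, 120)      # e.g. young vs. old, or wild-type vs. mutant

for clf in (SVC(kernel="rbf"), RandomForestClassifier(random_state=0)):
    pipe = make_pipeline(StandardScaler(), PCA(n_components=2), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(type(clf).__name__, acc)
```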
64

Spam filter for SMS-traffic

Fredborg, Johan January 2013 (has links)
Communication through text messaging, SMS (Short Message Service), is nowadays a huge industry with billions of active users. Because of the huge user base, it has attracted many companies trying to market themselves through unsolicited messages in this medium, in the same way as was previously done through email. This is such a common phenomenon that SMS spam has now become a plague in many countries. This report evaluates several established machine learning algorithms to see how well they can be applied to the problem of filtering unsolicited SMS messages. Each filter is mainly evaluated by analyzing its accuracy on stored message data. The report also discusses and compares hardware requirements versus performance, measured by how many messages can be evaluated in a fixed amount of time. The results from the evaluation show that a decision tree filter is the best choice of the filters evaluated: it has the highest accuracy as well as a high enough message processing rate to be applicable. The decision tree filter found to be most suitable for the task in this environment has been implemented, and the accuracy of this new implementation is shown to be as high as that of the implementation used in the evaluation. Though the decision tree filter is shown to be the best choice of the filters evaluated, it turned out that the accuracy is not high enough to meet the specified requirements. It does, however, show promising results for further testing in this area using improved methods on the best performing algorithms.
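A rough sketch of a decision-tree SMS spam filter of the kind evaluated is shown below: bag-of-words features feeding a decision tree. The example messages are made up, and the thesis's actual feature extraction may differ:

```python
# Bag-of-words decision-tree spam filter (toy example).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

messages = [
    "WINNER! Claim your free prize now",
    "Call now for a free ringtone",
    "Are we still meeting for lunch?",
    "Running late, see you at 6",
]
labels = ["spam", "spam", "ham", "ham"]

spam_filter = make_pipeline(CountVectorizer(), DecisionTreeClassifier(random_state=0))
spam_filter.fit(messages, labels)
print(spam_filter.predict(["Free prize, call now!"]))
```

Once trained, a tree classifies a message with a handful of feature lookups, which is consistent with the high processing rate the report measures.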
65

Detecting Land Cover Change over a 20 Year Time Period in the Niagara Escarpment Plan Using Satellite Remote Sensing

Waite, Holly January 2009 (has links)
The Niagara Escarpment is one of Southern Ontario’s most important landscapes. Due to the nature of the landform and its location, the Escarpment is subject to various development pressures including urban expansion, mineral resource extraction, agricultural practices and recreation. In 1985, Canada’s first large scale environmentally based land use plan was put in place to ensure that only development that is compatible with the Escarpment occurred within the Niagara Escarpment Plan (NEP). The southern extent of the NEP is of particular interest in this study, since a portion of the Plan is located within the rapidly expanding Greater Toronto Area (GTA). The Plan areas located in the Regional Municipalities of Hamilton and Halton represent urban and rural geographical areas respectively, and both are experiencing development pressures and subsequent changes in land cover. Monitoring initiatives on the NEP have been established, but have done little to identify consistent techniques for monitoring land cover on the Niagara Escarpment. Land cover information is an important part of planning and environmental monitoring initiatives, and remote sensing has the potential to provide frequent and accurate land cover information over various spatial scales. The goal of this research was to examine land cover change in the Regional Municipalities of Hamilton and Halton portions of the NEP. This was achieved through the creation of land cover maps for each region using Landsat 5 Thematic Mapper (TM) remotely sensed data. These maps aided in determining the qualitative and quantitative changes that had occurred in the Plan area over the 20 year period from 1986 to 2006. Change was also examined based on the NEP’s land use designations, to determine whether Plan policy has been effective in protecting the Escarpment. To obtain land cover maps, five different supervised classification methods were explored: Minimum Distance, Mahalanobis Distance, Maximum Likelihood, Object-oriented and Support Vector Machine (SVM). Seven land cover classes were mapped (forest, water, recreation, bare agricultural fields, vegetated agricultural fields, urban and mineral resource extraction areas) at a regional scale. SVM proved most successful at mapping land cover on the Escarpment, providing classification maps with an average accuracy of 86.7%. Land cover change analysis showed promising results, with an increase in the forested class and only slight increases in the urban and mineral resource extraction classes; on the negative side, there was an overall decrease in agricultural land. An examination of land cover change based on the NEP land use designations showed little change other than change that is regulated under Plan policies, demonstrating the success of the NEP in protecting vital Escarpment lands insofar as this can be revealed through remote sensing. Land cover should be monitored in the NEP consistently over time to ensure changes in the Plan area are compatible with the Niagara Escarpment. Remote sensing is a tool that can provide this information to the Niagara Escarpment Commission (NEC) in a timely, comprehensive and cost-effective way, and the information gained from remotely sensed data can aid in environmental monitoring and policy planning into the future.
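The supervised classification step can be sketched as per-pixel labelling of band values. In the example below the band values, class labels and training data are synthetic stand-ins for real training areas digitized from imagery:

```python
# Per-pixel supervised classification of Landsat 5 TM band values with an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pixels, n_bands = 500, 6                    # TM reflective bands 1-5 and 7
X = rng.uniform(0, 255, (n_pixels, n_bands))  # digital numbers per pixel
classes = ["forest", "water", "urban", "agriculture"]
y = rng.choice(classes, n_pixels)             # labels from training areas

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print(clf.predict(X[:3]))                     # predicted land-cover class per pixel
```

Applying the trained classifier to every pixel of a scene yields the land cover map; repeating on the 1986 and 2006 scenes and differencing the maps gives the change analysis.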
66

Large Scale Terrain Modelling for Autonomous Mining

Norberg, Johan January 2010 (has links)
This thesis is concerned with the development of a terrain model using Gaussian Processes to support the automation of open-pit mines. Information can be provided from a variety of sources, including GPS, laser scans and manual surveys. The information is fused into a single representation of the terrain, together with a measure of uncertainty of the estimated model. The model is also used to detect and label specific features in the terrain; in the context of mining, these features are edges known as toes and crests. A combination of clustering and classification using supervised learning detects and labels these regions. Data gathered from production iron ore mines in Western Australia and a farm in Marulan outside Sydney is used to demonstrate and verify the ability of Gaussian Processes to estimate a model of the terrain. The estimated terrain model is then used for detecting features of interest. Results show that the Gaussian Process correctly estimates the terrain and uncertainties, and provides a good representation of the area. Toes and crests are also successfully identified and labelled.
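A minimal sketch of GP terrain estimation follows: noisy elevation samples are fused into a continuous surface with a per-point uncertainty estimate. The kernel choice, length scale and synthetic data are assumptions, not the thesis's configuration:

```python
# Gaussian Process regression over scattered elevation samples.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
XY = rng.uniform(0, 100, (200, 2))                            # (x, y) sample locations
z = np.sin(XY[:, 0] / 15) + 0.05 * rng.standard_normal(200)   # noisy elevations

kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(XY, z)

# Query a regular grid: mean elevation plus standard deviation (uncertainty).
grid = np.array([[x, y] for x in range(0, 100, 10) for y in range(0, 100, 10)])
mean, std = gp.predict(grid, return_std=True)
print(mean[:5], std[:5])
```

The returned standard deviation is the "measure of uncertainty" the abstract mentions; downstream feature detection can weight or ignore regions where it is large.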
67

Machine vision for automating visual inspection of wooden railway sleepers

Sajjad Pasha, Mohammad January 2007 (has links)
No description available.
69

Sparse Modeling in Classification, Compression and Detection

Chen, Jihong 12 July 2004 (has links)
The principal focus of this thesis is the exploration of sparse structures in a variety of statistical modelling problems. While more comprehensive models can be useful for solving a larger number of problems, their calculation may be ill-posed in most practical instances because of the sparsity of informative features in the data. If this sparse structure can be exploited, the models can often be solved very efficiently. The thesis is composed of four projects. First, feature sparsity is incorporated to improve the performance of support vector machines when many noise features are present. The second project is an empirical study of how to construct an optimal cascade structure. The third project involves the design of a progressive, rate-distortion-optimized shape coder that combines the zero-tree algorithm with a beamlet structure. Finally, the longest-run statistic is applied to the detection of a filamentary structure in a two-dimensional rectangular region. The fundamental idea of the above projects is common: extract an efficient summary from a large amount of data. The main contributions of this work are to develop and implement novel techniques for the efficient solution of several difficult problems that arise in statistical signal/image processing.
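One standard way to realize the first project's idea — an SVM that ignores noise features — is an L1-penalized linear SVM, which drives uninformative weights to zero. The thesis's exact sparsity mechanism may differ; the data below have 5 informative and 95 noise features:

```python
# L1-penalized linear SVM as a sparse feature selector (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=100,
                           n_informative=5, n_redundant=0, random_state=0)
clf = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=5000).fit(X, y)
print("non-zero weights:", np.sum(clf.coef_ != 0), "of", clf.coef_.size)
```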
70

A Novelty Detection Approach to Seizure Analysis from Intracranial EEG

Gardner, Andrew Britton 12 April 2004 (has links)
A framework for support vector machine classification of time series events is proposed and applied to analyze physiological signals recorded from epileptic patients. In contrast to previous work, this research formulates seizure analysis as a novelty detection problem, which allows seizure detection and prediction to be treated uniformly, in a way that is capable of accommodating multichannel and/or multimodal measurements. Theoretical properties of the support vector machine algorithm employed provide a straightforward means of controlling the false alarm rate of the detector. The resulting novelty detection system was evaluated both offline and online on a corpus of 1077 hours of intracranial electroencephalogram (IEEG) recordings from 12 patients diagnosed with medically resistant temporal lobe epilepsy during evaluation for epilepsy surgery. These patients collectively had 118 seizures during the recording period. The performance of the novelty detection framework was assessed with an emphasis on four key metrics: (1) sensitivity (probability of correct detection), (2) mean detection latency, (3) early-detection fraction (prediction or detection of seizure prior to electrographic onset), and (4) false positive rate. Both the offline and online novelty detectors achieved state-of-the-art seizure detection performance. In particular, the online detector achieved 97.85% sensitivity, -13.3 second latency, and a 40% early-detection fraction at an average of 1.74 false positive predictions per hour (Fph). These results demonstrate that a novelty detection approach is not only feasible for seizure analysis, but improves upon the state of the art as an effective, robust technique. Additionally, an extension of the basic novelty detection framework demonstrated its use as a simple, effective tool for examining the spread of seizure onsets, which may be useful for automatically identifying seizure focus channels in patients with focal epilepsies. It is anticipated that this research will aid in localizing seizure onsets and provide more efficient algorithms for use in a real device.
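A hedged sketch of SVM-based novelty detection as framed in this abstract: a one-class SVM is trained only on baseline (interictal) feature vectors and flags departures as candidate seizure activity. The features and the use of nu as a coarse false-alarm control are simplifying assumptions:

```python
# One-class SVM novelty detector over per-window signal features.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, (500, 3))   # e.g. per-window energy features, normal EEG
novel = rng.normal(4, 1, (20, 3))       # out-of-distribution windows

# nu upper-bounds the fraction of training points treated as outliers,
# giving a rough handle on the false alarm rate.
detector = OneClassSVM(kernel="rbf", nu=0.05).fit(baseline)
print("flagged novel:", np.sum(detector.predict(novel) == -1), "of", len(novel))
```

Because only baseline data are needed for training, the same detector serves both detection and prediction: any sufficiently anomalous window raises an alarm, whether it occurs before or after electrographic onset.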
