81

Machine Learning and Adaptive Signal Processing Methods for Electrocardiography Applications

Perumalla, Calvin A. 22 June 2017 (has links)
This dissertation is directed towards improving state-of-the-art cardiac monitoring methods and the automatic diagnosis of cardiac anomalies through modern engineering approaches such as adaptive signal processing and machine learning. The dissertation describes the invention and associated methods of a cardiac rhythm monitor dubbed the Integrated Vectorcardiogram (iVCG). In addition, novel machine learning approaches are discussed to improve the diagnosis and prediction accuracy of cardiac diseases. It is estimated that around 17 million people in the world die from cardiac-related events each year. It has also been shown that many such deaths can be averted with long-term continuous monitoring and actuation. Hence, there is a growing need for better cardiac monitoring solutions. Leveraging recent improvements in computational power, communication bandwidth, energy efficiency, and electronic chip size, the Integrated Vectorcardiogram (iVCG) was invented as an answer to this problem. The iVCG is a miniaturized, integrated version of the Vectorcardiogram invented in the 1930s. The Vectorcardiogram provides full diagnostic-quality cardiac information equivalent to that of the gold standard, the 12-lead ECG, which is restricted to in-office use due to its bulky, obtrusive form. With the iVCG, it is possible to provide continuous, long-term, full diagnostic-quality information while remaining portable and unobtrusive to the patient. Moreover, it is possible to leverage this ‘Big Data’ and create machine learning algorithms that deliver better patient outcomes in the form of patient-specific machine diagnosis and timely alerts.
First, we present a proof-of-concept investigation of a miniaturized vectorcardiogram, the iVCG system, for ambulatory on-body applications that continuously monitors the electrical activity of the heart in three dimensions. We investigate the minimum distance between a pair of leads in the X, Y, and Z axes such that the signals are distinguishable from the noise. The target dimensions for our prototype iVCG are 3x3x2 cm, and based on our experimental results we show that it is possible to achieve these dimensions. Following this, we present a solution to the problem of transforming the three VCG component signals into the familiar 12-lead ECG for the convenience of cardiologists. The least squares (LS) method is applied to the VCG signals and the reference (training) 12-lead ECG to obtain a 12x3 transformation matrix that generates real-time ECG signals from the VCG signals. The iVCG is worn on the chest of the patient; although a physician or trained technician will initially install it in the appropriate position, it is prone to subsequent rotation and displacement errors introduced when the patient repositions the device. We characterize these errors and present a software solution to correct their effect on the iVCG signals. We also describe the design of machine learning methods to improve the automatic diagnosis and prediction of various heart conditions. Methods very similar to those described in this dissertation can be applied to the long-term, full diagnostic-quality ‘Big Data’ so that the iVCG can provide further insights into the health of patients. The iVCG system is a potentially breakthrough and disruptive technology allowing long-term, continuous remote monitoring of a patient's electrical heart activity.
The implications are profound and include: 1) a less expensive device compared to the 12-lead ECG system (the “gold standard”); 2) continuous, remote tele-monitoring of patients; 3) replacement of the current Holter short-term monitoring system; 4) improved and more economical ICU cardiac monitoring; and 5) the ability to send patients home from the hospital earlier, since physicians will have continuous remote monitoring of the patients.
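The VCG-to-ECG conversion described above amounts to an ordinary least-squares fit of a 12x3 matrix against time-aligned training recordings. A minimal sketch of that step is given below, assuming synchronized VCG and reference 12-lead ECG sample matrices; the function and variable names are illustrative, not the dissertation's code.

```python
import numpy as np

def fit_vcg_to_ecg_transform(vcg_train, ecg_train):
    """Fit a 12x3 matrix A such that ecg ~= A @ vcg in the least-squares sense.

    vcg_train: (3, N) array of X/Y/Z vectorcardiogram samples.
    ecg_train: (12, N) array of time-aligned reference 12-lead ECG samples.
    """
    # lstsq solves vcg_train.T @ A.T = ecg_train.T for A.T (shape 3 x 12).
    A_t, *_ = np.linalg.lstsq(vcg_train.T, ecg_train.T, rcond=None)
    return A_t.T  # shape (12, 3)

def reconstruct_ecg(A, vcg_stream):
    """Synthesize 12-lead ECG samples from incoming VCG samples."""
    return A @ vcg_stream  # (12, 3) @ (3, N) -> (12, N)
```

Once the matrix is fitted during a supervised calibration recording, only the three VCG channels need to be streamed; the twelve familiar leads are synthesized on the fly by the matrix product.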
82

A comparison of machine learning techniques for hand shape recognition

Foster, Roland January 2015 (has links)
Magister Scientiae - MSc / There are five fundamental parameters that characterize any sign language gesture: hand shape, orientation, motion, location, and facial expression. The SASL group at the University of the Western Cape has created systems to recognize each of these parameters in an input video stream. Most of these systems make use of the Support Vector Machine technique for the classification of data due to its high accuracy. It is, however, unknown how other machine learning techniques compare to Support Vector Machines in the recognition of each of these parameters. This research lays the foundation for determining the optimum machine learning technique for each parameter by comparing Support Vector Machines to Artificial Neural Networks and Random Forests in the context of South African Sign Language hand shape recognition. Li, a previous researcher in the SASL group, created a state-of-the-art hand shape recognition system that uses Support Vector Machines to classify hand shapes. This research re-implements Li's feature extraction procedure but investigates the use of Artificial Neural Networks and Random Forests in place of Support Vector Machines as a comparison. The machine learning techniques are optimized and trained to recognize ten SASL hand shapes and compared in terms of classification accuracy, training time, optimization time, and classification time.
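As an illustration of the kind of comparison described here, the sketch below trains an SVM, an artificial neural network, and a random forest on the same feature matrix and records accuracy, training time, and classification time with scikit-learn. The feature matrix is a random stand-in for the extracted hand-shape features, and the hyperparameters are placeholders rather than the optimized values from the thesis.

```python
import time
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: hand-shape feature vectors, y: labels for the ten SASL hand shapes.
# Random data stands in for the extracted features here.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))
y = rng.integers(0, 10, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "SVM": SVC(kernel="rbf", C=1.0, gamma="scale"),
    "ANN": MLPClassifier(hidden_layer_sizes=(100,), max_iter=500),
    "Random Forest": RandomForestClassifier(n_estimators=200),
}

for name, clf in classifiers.items():
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)                               # training time
    t1 = time.perf_counter()
    acc = accuracy_score(y_te, clf.predict(X_te))     # classification accuracy
    t2 = time.perf_counter()
    print(f"{name}: accuracy={acc:.3f}, train={t1 - t0:.2f}s, classify={t2 - t1:.2f}s")
```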
83

Análisis de datos y búsqueda de patrones en aplicaciones médicas [Data analysis and pattern search in medical applications]

García Ubilla, Arnol David January 2015 (has links)
Ingeniero Civil Matemático / Suicide in Chile has become one of the most pressing public health problems, all the more so considering that the vast majority of people who die by suicide have a psychiatric diagnosis and have consulted a specialist in the months before committing suicide. This motivates the creation of indicators and alerts to detect, effectively and in a timely manner, when a person enters a zone of suicide risk. This work addresses the problem by defining a zone, or spectrum, of suicide risk and building mathematical and statistical models to detect patients in this risk zone. A database of 707 mental health patients from three health centers in the Santiago Metropolitan Region is used. The database contains 343 variables, including sociodemographic information for each patient as well as their responses to seven clinical instruments commonly used in mental health (DEQ, STAXI, OQ, RFL, APGAR, PBI Mother, and PBI Father). The database is first cleaned by removing fields and/or records with a high percentage of null values, while missing values are imputed using traditional techniques and, in some cases, expert criteria, with a subscale-based imputation method applied to the different clinical instruments. Attribute reduction is then performed using statistical and machine learning tools. With this information, five models are built using different data mining and machine learning techniques via supervised learning. The models are built and calibrated in the R statistical language, and their results are compared using four metrics: accuracy, sensitivity, specificity, and their representation in ROC space. The final proposed classifier is a support vector machine model that discriminates when a patient is in a zone of suicide risk. The model was trained with an RBF kernel, uses only 22 predictor variables, and achieves an accuracy of approximately 78%, computed via repeated n-fold cross-validation with k = 100 repetitions and n = 10 folds.
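For context, the evaluation protocol described above (an RBF-kernel SVM scored by repeated 10-fold cross-validation) could be reproduced along the following lines. The thesis works in R, so this Python sketch with placeholder data only illustrates the setup and is not the original code.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

# X: the 22 selected predictor variables; y: 1 if the patient falls in the
# suicide-risk zone, 0 otherwise.  Random data stands in for the real records.
rng = np.random.default_rng(1)
X = rng.normal(size=(707, 22))
y = rng.integers(0, 2, size=707)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")
# n_repeats=100 and n_splits=10 mirror the k=100, n=10 setup in the abstract.
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=100, random_state=1)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"mean accuracy over repeated 10-fold CV: {scores.mean():.3f}")
```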
84

End-to-end single-rate multicast congestion detection using support vector machines

Liu, Xiaoming January 2008 (has links)
Magister Scientiae - MSc / IP multicast is an efficient mechanism for simultaneously transmitting bulk data to multiple receivers. Many applications can benefit from multicast, such as audio and video conferencing, multi-player games, multimedia broadcasting, distance education, and data replication. For either technical or policy reasons, IP multicast has still not been deployed in today's Internet. Congestion is one of the most important issues impeding the development and deployment of IP multicast and multicast applications.
85

Stochastic functional descent for learning Support Vector Machines

He, Kun 22 January 2016 (has links)
We present a novel method for learning Support Vector Machines (SVMs) in the online setting. Our method is generally applicable in that it handles the online learning of binary, multiclass, and structural SVMs in a unified view. The SVM learning problem consists of optimizing a convex objective function composed of two parts: the hinge loss and quadratic regularization. To date, the predominant family of approaches for online SVM learning has been gradient-based methods, such as Stochastic Gradient Descent (SGD). Unfortunately, there are two drawbacks to such approaches: first, gradient-based methods rely on a local linear approximation to the function being optimized, but since the hinge loss is piecewise-linear and nonsmooth, this approximation can be ill-behaved. Second, existing online SVM learning approaches share the same problem formulation with batch SVM learning methods, and they all need to tune a fixed global regularization parameter by cross-validation. On the one hand, global regularization is ineffective in handling local irregularities encountered in the online setting; on the other hand, even though the learning problem for a particular global regularization parameter value may be efficiently solved, repeatedly solving it for a wide range of values can be costly. We tackle these two problems with our approach.
To address the first problem, we propose to perform implicit online update steps to optimize the hinge loss, as opposed to explicit (or gradient-based) updates that use subgradients to perform local linearization. Regarding the second problem, we propose to enforce local regularization that is applied to individual classifier update steps, rather than having a fixed global regularization term. Our theoretical analysis suggests that our classifier update steps progressively optimize the structured hinge loss, with the rate controlled by a sequence of regularization parameters; setting these parameters is analogous to setting the step sizes in gradient-based methods. In addition, we give sufficient conditions for the algorithm's convergence.
Experimentally, our online algorithm matches the classification performance of other state-of-the-art online SVM learning methods, as well as batch learning methods, after only one or two passes over the training data. More importantly, our algorithm attains these results without cross-validation, while all other methods must perform time-consuming cross-validation to determine the optimal choice of the global regularization parameter.
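To make the contrast concrete, the sketch below compares an explicit (subgradient) hinge-loss step with a closed-form implicit (proximal) step of the passive-aggressive type, in which the per-step parameter plays the role of the local regularization discussed above. This is a generic illustration of implicit versus explicit updates, not the dissertation's exact algorithm.

```python
import numpy as np

def explicit_hinge_step(w, x, y, lr):
    """Explicit (SGD-style) update: subgradient of the hinge loss at w."""
    if y * (w @ x) < 1.0:
        w = w + lr * y * x
    return w

def implicit_hinge_step(w, x, y, c):
    """Implicit (proximal) update: exactly minimizes
    0.5 * ||v - w||^2 + c * max(0, 1 - y * (v @ x)) over v.
    The closed form is a passive-aggressive style step; c acts as the
    per-step (local) regularization parameter."""
    loss = max(0.0, 1.0 - y * (w @ x))
    if loss == 0.0:
        return w
    tau = min(c, loss / (x @ x))
    return w + tau * y * x
```

Both functions take a single example (x, y); the implicit step moves only far enough to satisfy the margin (capped by c), so it cannot overshoot the way an ill-chosen step size can in the explicit case.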
86

Discriminant Analysis and Support Vector Regression in High Dimensions: Sharp Performance Analysis and Optimal Designs

Sifaou, Houssem 04 1900 (has links)
Machine learning is emerging as a powerful tool for data science and is being applied in almost all subjects. In many applications, the number of features is comparable to the number of samples, and both grow large. This setting is usually called the high-dimensional regime. In this regime, new challenges arise when it comes to the application of machine learning. In this work, we conduct a high-dimensional performance analysis of some popular classification and regression techniques.
In the first part, discriminant analysis classifiers are considered. A major challenge to the use of these classifiers in practice is that they depend on the inverse of covariance matrices that need to be estimated from training data. Several estimators of the inverse covariance matrices can be used; the most common are based on the regularization approach. In this thesis, we propose new estimators that are shown to yield better performance. The main principle of our approach is the design of an optimized inverse covariance matrix estimator based on the assumption that the covariance matrix is a low-rank perturbation of a scaled identity matrix. We show that the proposed classifiers are not only easier to implement but also outperform the classical regularization-based discriminant analysis classifiers.
In the second part, we carry out a high-dimensional statistical analysis of linear support vector regression. Under some plausible assumptions on the statistical distribution of the data, we characterize the feasibility condition for hard support vector regression and, when feasible, derive an asymptotic approximation of its risk. Similarly, we study the test risk of soft support vector regression as a function of its parameters. The analysis is then extended to kernel support vector regression under a generalized linear model assumption. Based on our analysis, we illustrate that adding more samples may be harmful to the test performance of these regression algorithms, while it is always beneficial when the parameters are optimally selected. Our results pave the way to understanding the effect of the underlying hyperparameters and provide insights on how to optimally choose the kernel function.
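As a point of reference for the classical baseline mentioned above, the sketch below implements a standard regularization-based (shrinkage) discriminant analysis classifier, in which the pooled covariance is shrunk toward a scaled identity before inversion. It illustrates the family of estimators the proposed design is compared against, not the optimized estimator developed in the thesis.

```python
import numpy as np

def regularized_lda_discriminant(X0, X1, alpha=0.5):
    """Binary regularized discriminant analysis.

    The pooled covariance is shrunk toward a scaled identity,
    Sigma_reg = (1 - alpha) * Sigma_hat + alpha * (trace(Sigma_hat) / p) * I,
    and its inverse defines the linear discriminant direction.
    """
    p = X0.shape[1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    pooled = (np.cov(X0, rowvar=False) * (n0 - 1) +
              np.cov(X1, rowvar=False) * (n1 - 1)) / (n0 + n1 - 2)
    sigma_reg = (1 - alpha) * pooled + alpha * (np.trace(pooled) / p) * np.eye(p)
    w = np.linalg.solve(sigma_reg, mu1 - mu0)   # discriminant direction
    b = -0.5 * w @ (mu0 + mu1)                  # threshold under equal priors
    return lambda x: np.sign(x @ w + b)         # +1 -> class 1, -1 -> class 0
```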
87

Machine Learning Driven Model Inversion Methodology To Detect Reniform Nematodes In Cotton

Palacharla, Pavan Kumar 09 December 2011 (has links)
Rotylenchulus reniformis is a nematode species that affects the cotton crop and is quickly spreading throughout the southeastern United States. Effective use of nematicides at a variable rate is the only economical countermeasure. It requires knowledge of the intra-field variation in the nematode population, which in turn depends on collecting soil samples from the field and analyzing them in the laboratory. This process is economically prohibitive. Hence, the motivation for this study is to estimate nematode infestation of the cotton crop using remote sensing and machine learning techniques, which are cost- and time-effective. In the current research, the concept of multi-temporal remote sensing has been implemented in order to design a robust and generalized nematode-detection regression model. Finally, a user-friendly web service is created that gives trustworthy results for the given input data, thereby helping reduce nematode infestation in the crop and expenses on nematicides.
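A heavily simplified sketch of the kind of pipeline implied here, stacking spectral features from several acquisition dates and fitting a regression model to laboratory nematode counts, is shown below with placeholder data. The band/date layout, model choice, and parameters are assumptions, not the study's actual design.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row stacks spectral band values from several acquisition dates for one
# soil-sampled location; y is the nematode count from the laboratory analysis.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4 * 3))               # e.g. 4 bands x 3 dates (illustrative)
y = rng.poisson(lam=50, size=200).astype(float)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.3f}")
```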
88

Evolutionary Optimization Of Support Vector Machines

Gruber, Fred 01 January 2004 (has links)
Support vector machines are a relatively new approach for creating classifiers that has become increasingly popular in the machine learning community. They present several advantages over other methods, such as neural networks, in areas like training speed, convergence, and complexity control of the classifier, as well as a stronger mathematical foundation based on optimization and statistical learning theory. This thesis deals with the problem of model selection for support vector machines, that is, the problem of finding the optimal parameters that will improve the performance of the algorithm. It is shown that genetic algorithms provide an effective way to find the optimal parameters for support vector machines. The proposed algorithm is compared with a backpropagation neural network on a dataset that represents individual models for electronic commerce.
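The model-selection idea can be sketched as a small genetic algorithm searching over the RBF-SVM hyperparameters C and gamma, with cross-validation accuracy as the fitness function. The encoding, operators, and ranges below are illustrative assumptions rather than the thesis's configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def ga_svm_model_selection(X, y, pop_size=20, generations=15, seed=0):
    """Toy genetic algorithm over log2(C) and log2(gamma) for an RBF SVM.
    Fitness is 5-fold cross-validation accuracy; selection keeps the best
    half, and children come from blend crossover plus Gaussian mutation."""
    rng = np.random.default_rng(seed)
    # Chromosome = [log2(C), log2(gamma)], initialized over a broad range.
    pop = rng.uniform(low=[-5.0, -15.0], high=[15.0, 3.0], size=(pop_size, 2))

    def fitness(ind):
        clf = SVC(kernel="rbf", C=2.0 ** ind[0], gamma=2.0 ** ind[1])
        return cross_val_score(clf, X, y, cv=5).mean()

    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-(pop_size // 2):]]   # keep best half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            children.append((a + b) / 2 + rng.normal(scale=0.5, size=2))
        pop = np.vstack([parents, np.array(children)])
    best = max(pop, key=fitness)
    return 2.0 ** best[0], 2.0 ** best[1]

# Example usage with placeholder data:
# from sklearn.datasets import make_classification
# X, y = make_classification(n_samples=200, n_features=10, random_state=0)
# C_best, gamma_best = ga_svm_model_selection(X, y)
```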
89

SV-Means: A Fast One-Class Support Vector Machine-Based Level Set Estimator

Pavy, Anne M. January 2017 (has links)
No description available.
90

Detection and Classification of Sequence Variants for Diagnostic Evaluation of Genetic Disorders

Kothiyal, Prachi 05 August 2010 (has links)
No description available.
