11

Advances in detecting object classes and their semantic parts

Modolo, Davide January 2017 (has links)
Object classes are central to computer vision and have been the focus of substantial research in the last fifteen years. This thesis addresses the tasks of localizing entire objects in images (object class detection) and localizing their semantic parts (part detection). We present four contributions, two for each task. The first two improve existing object class detection techniques by using context and calibration. The other two explore semantic part detection in weakly supervised settings. First, the thesis presents a technique for predicting properties of objects in an image based on its global appearance alone. We demonstrate the method by predicting three properties: aspect of appearance, location in the image, and class membership. Overall, the technique makes multi-component object detectors faster and improves their performance. The second contribution is a method for calibrating the popular Ensemble of Exemplar-SVMs object detector. Unlike the standard approach, which calibrates each Exemplar-SVM independently, our technique optimizes their joint performance as an ensemble. We devise an efficient optimization algorithm to find the globally optimal solution of the calibration problem, which leads to better object detection performance than independent calibration. The third contribution is a technique for training part-based models of object classes using data sourced from the web. We learn rich models incrementally. Our models encompass the appearance of parts and their spatial arrangement on the object, specific to each viewpoint. Importantly, the approach does not require any part location annotation, which is one of the main obstacles to training part detectors. Finally, the last contribution is a study of whether semantic object parts emerge in Convolutional Neural Networks trained for higher-level tasks, such as image classification. While previous efforts studied this matter by visual inspection only, we perform an extensive quantitative analysis based on ground-truth part location annotations, providing a more conclusive answer to the question.
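As a concrete illustration of the second contribution, the sketch below contrasts joint ensemble calibration with the usual per-exemplar approach. It is a toy reconstruction, not the thesis's algorithm: the data, the affine per-exemplar calibration, the max-pooled ensemble score, and the use of a generic optimizer are all assumptions made for illustration.

```python
# A minimal sketch of joint calibration of an Exemplar-SVM ensemble.
# All names and the toy data are hypothetical; the thesis's actual
# objective and optimizer are not reproduced here.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_exemplars, n_windows = 5, 200
scores = rng.normal(size=(n_windows, n_exemplars))   # raw E-SVM scores
labels = (rng.random(n_windows) < 0.3).astype(float)  # 1 = object window

def ensemble_nll(params):
    """Negative log-likelihood of labels under the calibrated ensemble.

    Each exemplar gets an affine map a_i * s + b_i; the ensemble score
    is the max over calibrated exemplars, squashed through a sigmoid.
    Fitting all (a_i, b_i) at once optimizes *joint* ensemble
    performance, unlike per-exemplar Platt scaling.
    """
    a, b = params[:n_exemplars], params[n_exemplars:]
    calibrated = scores * a + b        # broadcast over windows
    s = calibrated.max(axis=1)         # ensemble = best exemplar
    p = 1.0 / (1.0 + np.exp(-s))
    eps = 1e-9
    return -np.mean(labels * np.log(p + eps)
                    + (1 - labels) * np.log(1 - p + eps))

x0 = np.concatenate([np.ones(n_exemplars), np.zeros(n_exemplars)])
result = minimize(ensemble_nll, x0, method="Nelder-Mead")
print("joint NLL:", result.fun)
```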
12

A Novel Road Marking Detection and Recognition Technique Using a Camera-based Advanced Driver Assistance System

Tang, Zongzhi January 2017 (has links)
Advanced Driver Assistance Systems (ADAS) are widely studied today. As crucial components of ADAS, lane marking detection and other object detection tasks have become more popular than ever. However, most methods in this area struggle to balance accuracy against efficiency, and the mainstream approaches (e.g., machine learning) suffer from limitations that make it hard to bridge the gap between partial and fully autonomous driving. This thesis proposes a real-time lane marking detection framework for ADAS, comprising a 4-extreme-point set descriptor and a rule-based cascade classifier. By analyzing the behavior of lane markings on the road surface, we observe a useful property: standard markings keep their shape in the plane perpendicular to the driving direction. Exploiting this property, the 4-extreme-point set descriptor first describes the shape of each candidate marking. Specifically, after applying Maximally Stable Extremal Region (MSER) detection and the Hough transform to a 2-D image, several contours of interest are obtained. A bounding box with borders parallel to the image axes intersects each contour at four edge points, which we call the 4-extreme point set. Afterward, to check the consistency of each contour with a standard marking, rules abstracted from road construction manuals are applied, such as an area filter, a colour filter, a relative location filter, and a convexity filter. To reduce errors caused by changes in driving direction, an enhancement module is then introduced: by tracking the vanishing point and other key points of the road network, a 3-D reconstruction with respect to the optical axis between the vanishing point and the camera center becomes possible. The principle of this algorithm is presented, together with a description of how depth information is obtained from the model. Among these components, the key-point-based classification method is the main contribution of this thesis, because it eliminates object deformation caused by inverse perspective mapping. Experiments were conducted on highways and urban roads in Ottawa. The proposed algorithm reached an average marking detection accuracy of 96.77%, with an F1 score (the harmonic mean of precision and recall) of 90.57%. In summary, the proposed method exhibits state-of-the-art performance.
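The 4-extreme-point set is straightforward to compute for any contour. The sketch below shows one plausible reading of the descriptor, plus a single rule of the cascade (an area filter); the contour, thresholds, and function names are hypothetical, not taken from the thesis.

```python
# A minimal sketch of the "4-extreme points set" descriptor: the points
# where a contour touches its axis-aligned bounding box. The contour is
# a hypothetical Nx2 array of (x, y) pixels, e.g. one region returned
# by MSER; the filter thresholds are invented.
import numpy as np

def four_extreme_points(contour: np.ndarray) -> np.ndarray:
    """Return the left-, right-, top-, and bottom-most contour points."""
    left   = contour[np.argmin(contour[:, 0])]
    right  = contour[np.argmax(contour[:, 0])]
    top    = contour[np.argmin(contour[:, 1])]   # image y grows downward
    bottom = contour[np.argmax(contour[:, 1])]
    return np.stack([left, right, top, bottom])

def plausible_marking(contour: np.ndarray,
                      min_area: int = 50, max_area: int = 5000) -> bool:
    """One rule of a rule-based cascade: an area filter on the box."""
    pts = four_extreme_points(contour)
    w = pts[1, 0] - pts[0, 0]
    h = pts[3, 1] - pts[2, 1]
    return min_area <= w * h <= max_area

contour = np.array([[10, 40], [30, 20], [50, 42], [28, 60]])
print(four_extreme_points(contour))
print(plausible_marking(contour))
```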
13

Radar detection and identification of human signatures using moving platforms

Gürbüz, Sevgi Zübeyde 17 August 2009 (has links)
Radar offers unique advantages over other sensors for the detection of humans, such as remote operation during virtually all weather and lighting conditions, increased range, and better coverage. Many current radar-based human detection systems employ some type of Fourier analysis, such as Doppler processing. However, in many environments, the signal-to-noise ratio (SNR) of human returns is quite low. Furthermore, Fourier-based techniques assume a linear variation in target phase over the aperture, whereas human targets have a highly nonlinear phase history. The resulting phase mismatch causes significant SNR loss in the detector itself. In this work, human target modeling is used to derive a more accurate non-linear approximation to the true target phase history. Two algorithms are proposed: a parameter estimation-based optimized non-linear phase (ONLP) detector, and a dictionary search-based enhanced optimized non-linear phase (EnONLP) detector. The ONLP algorithm optimizes the likelihood ratio over the unknown model parameters to derive a more accurate approximation to the expected human return. The EnONLP algorithm stores expected target signatures generated for each possible combination of model parameters in a dictionary, and then applies Orthogonal Matching Pursuit (OMP) to determine the optimal linear combination of dictionary entries that best represents the measured radar data. Thus, unlike the ONLP, the EnONLP algorithm also has the capability of detecting the presence of multiple human targets. Cramér-Rao bounds (CRB) on parameter estimates and receiver operating characteristic (ROC) curves are used to analytically compare the performance of both proposed methods with that of conventional, fully adaptive space-time adaptive processing (STAP). Finally, the application of EnONLP to target characterization is illustrated.
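The dictionary search at the heart of EnONLP is standard Orthogonal Matching Pursuit. The sketch below shows the OMP mechanics on a random dictionary; the actual dictionaries of expected human signatures, and any radar-specific processing, are not reproduced here.

```python
# A minimal numpy sketch of Orthogonal Matching Pursuit. The dictionary
# of expected signatures is faked with random unit-norm columns; only
# the greedy selection and re-fitting steps are illustrated.
import numpy as np

def omp(D, y, n_atoms):
    """Greedily pick dictionary atoms; re-fit coefficients each round."""
    residual, support = y.copy(), []
    for _ in range(n_atoms):
        # Atom most correlated with what is still unexplained.
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        # Least-squares fit on the selected atoms (the "orthogonal" step).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    return support, coef

rng = np.random.default_rng(1)
D = rng.normal(size=(128, 40))
D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
y = 2.0 * D[:, 7] - 1.5 * D[:, 21]        # two superimposed "targets"
support, coef = omp(D, y, n_atoms=2)
print(support, np.round(coef, 2))         # expect atoms 7 and 21
```

Because OMP re-fits all selected atoms at each round, the residual stays orthogonal to the chosen signatures, which is what lets the detector pull apart multiple overlapping human returns.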
14

Topics in underwater detection

Lourey, Simon J. Unknown Date (has links) (PDF)
This thesis presents methods for improving the detection processing of active sonar systems. Measures that compensate for, or even exploit, particular characteristics of the detection problem for these systems are considered. Reverberation is the result of scattering of the transmitted signal from non-target features. Multipath and variability are particularly pronounced for underwater sound because propagation is very sensitive to spatial and temporal temperature variations. Another problem is the low pulse repetition rate due to the relatively low speed of sound; this low data rate reduces tracking and detection performance.

Reverberation often arises as the sum of many small contributions, so the received data has a multivariate Gaussian distribution. Estimating the large number of parameters in the distribution requires a lot of data, which is not available because of the low data rate. Representing the scattering as an autoregressive process reduces the data requirement, at some cost in modelling accuracy. A coupled estimator algorithm is developed to estimate the parameters, and detection performance is compared to other models and estimators that assume Gaussian statistics.

To counter multipath distortion, the delays and strengths of the paths are estimated using a version of the expectation-maximisation (EM) algorithm. The magnitudes of the path amplitudes are then used to decide whether a target is present. The EM algorithm is also suggested as a way to find the likely amplitude of reverberation from a few large scatterers that form non-Gaussian reverberation.

Non-parametric methods are considered for the detection of short-duration incoherent signals in a duct. These detectors compare the ranks of the data in a region being tested for target presence against another region assumed to contain no target. Simulations are used to explore performance and what happens when the independent-samples assumption is violated by the presence of reverberation.

More data can improve detection, but exploiting data from multiple transmissions is difficult because the slow speed of sound allows targets to move out of detection cells between transmissions. Tracking the movements of potential targets can counter this problem. The usefulness of Integrated Probabilistic Data Association (IPDA), which calculates a probability of true track as well as track properties, is considered as a detection algorithm. Improvements when multiple receivers are used, as well as limitations when sensor positions are uncertain, are investigated.
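The autoregressive representation of reverberation can be sketched with classical Yule-Walker estimation. The AR order, the coefficients, the synthetic data, and the choice of Yule-Walker itself are illustrative assumptions; the thesis's coupled estimator is not reproduced here.

```python
# A minimal sketch of modelling reverberation as an autoregressive (AR)
# process: a few AR coefficients replace a large covariance matrix,
# which is the data-reduction idea described above. A synthetic AR(2)
# series stands in for sonar returns.
import numpy as np

def yule_walker(x, order):
    """Estimate AR coefficients from sample autocovariances."""
    x = x - x.mean()
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x)
                  for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)]
                  for i in range(order)])       # Toeplitz autocovariance
    return np.linalg.solve(R, r[1:order + 1])

rng = np.random.default_rng(2)
true_a = [0.75, -0.5]
x = np.zeros(5000)
for t in range(2, len(x)):
    x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + rng.normal()

print(np.round(yule_walker(x, order=2), 2))     # close to [0.75, -0.5]
```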
15

Multispectral satellite image understanding

Unsalan, Cem January 2003 (has links)
No description available.
16

Detection of enteric viruses in food matrices

Collette, Catherine 04 October 2017 (has links)
The main enteric viruses causing foodborne outbreaks are noroviruses (NoV) and the hepatitis A and E viruses (HAV and HEV), responsible for gastroenteritis and hepatitis respectively. These enteric viruses are transmitted mainly by the direct faecal-oral route or by ingestion of water or of food eaten raw or undercooked, such as shellfish and vegetables. The CEN/ISO 15216 standard proposes methods for extraction and RT-qPCR detection of NoV and HAV in vegetables, water, and bivalve shellfish. In an outbreak, however, a variety of foods may be suspected, and in recent years HEV, whose zoonotic potential has been demonstrated, has become a virus of interest in food hygiene. This thesis aims to develop methods of viral diagnosis for food.
This work developed methods for detecting NoV in dairy products and for extracting HAV from semi-dried tomatoes, foods incriminated in hepatitis A outbreaks in Europe. Furthermore, the development of an extraction and quantitative RT-PCR detection method for HEV allowed its prevalence in figatelli and dried liver sausages to be estimated at approximately 30%, supporting the possibility of HEV transmission to humans via these foods. One of the general requirements for detecting these viruses in food, as described in the CEN/ISO standard, is the use of a process control virus to monitor the quality of the entire viral extraction procedure; this work showed that the choice of control virus depends on the food matrix analysed and on the pathogenic virus screened for. A one-step multiplex RT-qPCR assay was developed for bottled and tap water for the simultaneous detection of NoV GI, NoV GII, and the process control virus, and showed sensitivity similar to the simplex RT-qPCR. Recently, digital PCR (RT-dPCR) has been described as a novel approach to absolute genome quantification that requires no standard curve. The performance of microfluidic RT-dPCR was compared with that of RT-qPCR for detecting HEV in naturally contaminated pig liver products; the sensitivity of RT-dPCR was similar to that of RT-qPCR. In summary, this work contributes to improving viral diagnosis in food microbiology.
17

Featured anomaly detection methods and applications

Huang, Chengqiang January 2018 (has links)
Anomaly detection is a fundamental research topic that has been widely investigated. From critical industrial systems, e.g., network intrusion detection systems, to people's daily activities, e.g., mobile fraud detection, anomaly detection has become the first vital resort for protecting and securing public and personal property. Although anomaly detection methods have been under consistent development over the years, the explosive growth of data volume and the continued dramatic variation of data patterns pose great challenges to anomaly detection systems and fuel the demand for more intelligent anomaly detection methods with distinct characteristics to cope with various needs. To this end, this thesis starts by presenting a thorough review of existing anomaly detection strategies and methods, elaborating their advantages and disadvantages. Afterward, four distinctive anomaly detection methods, especially for time series, are proposed, each aimed at a specific need arising in different scenarios, e.g., enhanced accuracy, interpretable results, and self-evolving models. Experiments are presented and analysed to offer a better understanding of the performance of the methods and their distinct features. The key contributions are as follows: 1) Support Vector Data Description (SVDD) is investigated as a primary method for accurate anomaly detection. The applicability of SVDD to noisy time series datasets is carefully examined, and it is demonstrated that relaxing the decision boundary of SVDD consistently yields better accuracy in network time series anomaly detection. A theoretical analysis of the model parameter is also presented to justify the relaxation of the decision boundary. 2) To support a clear explanation of detected time series anomalies, i.e., anomaly interpretation, the periodic pattern of the time series is used as contextual information and integrated into SVDD. The resulting formulation maintains multiple discriminants, which help distinguish the root causes of anomalies. 3) To further analyse a dataset for anomaly detection and interpretation, Convex Hull Data Description (CHDD) is developed, realising one-class classification together with data clustering. CHDD approximates the convex hull of a dataset by its extreme points, which constitute a dictionary of data representatives. Using this dictionary, CHDD can represent and cluster all normal data instances, so that anomaly detection comes with a degree of interpretation. 4) Beyond accuracy and interpretability, better solutions for anomaly detection over streaming data with evolving patterns are also researched. Within a Reinforcement Learning (RL) framework, a time series anomaly detector is designed that is continually trained to cope with evolving patterns. Because the detector is trained on labeled time series, it avoids the cumbersome work of threshold setting and the uncertainty of anomaly definitions in time series anomaly detection tasks.
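For a rough feel of the boundary-relaxation idea in contribution 1), the sketch below uses scikit-learn's OneClassSVM, a close relative of SVDD with an RBF kernel, and relaxes its decision threshold by a margin. The data, the relaxation constant, and the substitution of OneClassSVM for SVDD are all assumptions for illustration.

```python
# A minimal sketch of boundary relaxation for one-class detection:
# train on normal data, then shift the decision threshold outward so
# noisy-but-normal points are not flagged. Not the thesis's exact model.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
normal = rng.normal(0.0, 1.0, size=(500, 2))      # noisy normal traffic
test = np.vstack([rng.normal(0, 1, (50, 2)),      # more normal points
                  rng.normal(6, 1, (5, 2))])      # clear anomalies

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)
scores = model.decision_function(test)            # > 0 means "inside"

relax = 0.1                                       # relaxed boundary margin
strict_flags = scores < 0
relaxed_flags = scores < -relax
print(strict_flags.sum(), relaxed_flags.sum())    # relaxed flags fewer
```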
18

Lightweight Network Intrusion Detection

Chen, Ya-lin 26 July 2005 (has links)
Exploit code targeting system vulnerabilities is often used by attackers to attack target computers or services. Such exploit programs typically send attack packets within the first few packets after a connection is established with the target machine or service, and such attacks are often launched over the Telnet service. A lightweight network-based intrusion detection system is proposed for detecting such attacks in Telnet traffic. The proposed system examines only the first few packets after each Telnet connection is established, and uses only part of each packet rather than the whole of it; this design greatly reduces system load. The approach is anomaly detection: the system characterizes normal traffic behavior and constructs a normal model from the filtered normal traffic. In the detection phase, the system measures the deviation of each filtered packet from the normal model via an anomaly score function, so a more deviant packet receives a higher anomaly score. Finally, we evaluate the proposed system on the 1999 DARPA Intrusion Detection Evaluation data set, which contains 5 days of training data, 10 days of testing data, and 44 attack instances of 16 attack types. The proposed system achieves a detection rate of 73% at a low false alarm rate of 2 false alarms per day, and 80% on the hard-to-detect attacks that were poorly detected in the 1999 DARPA evaluation.
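One minimal reading of "partial packet data scored against a normal model" is a byte-frequency model with a simple deviation score. The sketch below is that reading only; the payloads, features, and score function are hypothetical stand-ins for the system described above.

```python
# A minimal sketch of scoring the first packets of a connection against
# a byte-frequency model of normal traffic. Higher score = stranger.
import numpy as np

def byte_histogram(payload: bytes) -> np.ndarray:
    """Normalized frequency of each of the 256 byte values."""
    h = np.bincount(np.frombuffer(payload, dtype=np.uint8), minlength=256)
    return h / max(len(payload), 1)

def train_normal_model(payloads) -> np.ndarray:
    """Average byte distribution over filtered normal packets."""
    return np.mean([byte_histogram(p) for p in payloads], axis=0)

def anomaly_score(payload: bytes, model: np.ndarray) -> float:
    """L1 deviation from the normal byte distribution."""
    return float(np.abs(byte_histogram(payload) - model).sum())

normal = [b"login: guest\r\n", b"password: guest\r\n", b"ls -la\r\n"]
model = train_normal_model(normal)
print(anomaly_score(b"cat README\r\n", model))              # modest
print(anomaly_score(b"\x90" * 64 + b"\xcc\xcc\xcc", model))  # NOP sled: high
```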
19

Anomaly Detection Through Statistics-Based Machine Learning For Computer Networks

Zhu, Xuejun January 2006 (has links)
Intrusion detection in computer networks is a complex research problem that requires understanding of computer networks and the mechanisms of intrusion, the configuration of sensors and the collected data, the selection of relevant attributes, and monitoring algorithms for online detection. It is critical to develop general methods for data dimension reduction, effective monitoring algorithms for intrusion detection, and means for their performance improvement. This dissertation is motivated by the timely need to develop statistics-based machine learning methods for effective detection of computer network anomalies. Three fundamental research issues related to data dimension reduction, control chart design, and performance improvement are addressed. The major research activities and corresponding contributions are summarized as follows: (1) Filter and Wrapper models are integrated to extract a small number of informative attributes for computer network intrusion detection. A two-phase analysis method is proposed for the integration of the Filter and Wrapper models. The proposed method successfully reduced the original 41 attributes to 12 informative attributes while increasing the accuracy of the model; the comparison of results in each phase shows the effectiveness of the method. (2) Supervised kernel-based control charts for anomaly intrusion detection. We propose constructing control charts in a feature space. The first contribution is the use of a multi-objective Genetic Algorithm for parameter pre-selection in SVM-based control charts; the second is the performance evaluation of supervised kernel-based control charts. (3) Unsupervised kernel-based control charts for anomaly intrusion detection. Two types are investigated: Kernel PCA control charts and Support Vector Clustering (SVC) based control charts. Applications of SVC-based control charts to computer network audit data are also discussed to demonstrate the effectiveness of the proposed method. Although the methodologies developed in this dissertation are demonstrated on computer network intrusion detection, they are also expected to apply to the monitoring of other complex systems whose databases consist of high-dimensional data with non-Gaussian distributions.
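Contribution (3) can be illustrated with a kernel PCA "control chart" that monitors reconstruction error against a control limit estimated from normal data. The sketch below is one plausible construction; the data, the component count, and the percentile-based limit are assumptions, not the dissertation's exact design.

```python
# A minimal sketch of an unsupervised kernel PCA control chart:
# reconstruction error in feature space is monitored, and points beyond
# a control limit fitted on normal data are flagged as anomalous.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(4)
train = rng.normal(size=(300, 12))                   # 12 informative attributes
test = np.vstack([rng.normal(size=(40, 12)),         # normal traffic
                  rng.normal(4, 1, size=(5, 12))])   # intrusions

kpca = KernelPCA(n_components=5, kernel="rbf", fit_inverse_transform=True)
kpca.fit(train)

def recon_error(X: np.ndarray) -> np.ndarray:
    """Distance between each point and its kernel PCA reconstruction."""
    return np.linalg.norm(X - kpca.inverse_transform(kpca.transform(X)),
                          axis=1)

limit = np.percentile(recon_error(train), 99)   # control limit from normal data
print((recon_error(test) > limit).nonzero()[0]) # indices flagged as anomalous
```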
20

An investigation of the performance limits of detectors for use in radiometry in the frequency range 60-300 GHz

Gong, Y. Q. January 1983 (has links)
No description available.
