71

Unsupervised Anomaly Detection and Explainability for Ladok Logs

Edholm, Mimmi January 2023 (has links)
Anomaly detection is the process of finding outliers in data. This report explores the use of unsupervised machine learning for anomaly detection, as well as the importance of explaining the model's decision making. The project focuses on identifying anomalous behaviour in Ladok frontend access logs, with emphasis on security issues, specifically attempted intrusion. This is done by implementing an anomaly detection model that consists of a stacked autoencoder and k-means clustering, and by examining the data using k-means alone. To explain the decision-making process, SHAP, an explainability method that measures feature importance, is used. The report includes an overview of the necessary theory of machine learning, anomaly detection and explainability, describes the implementation of the model, and examines how to explain the decision making of a black-box model. Further, the results are presented and the models' performance on the data is discussed. Lastly, the report assesses whether the chosen approach was appropriate and proposes how the work could be improved in future work. The study concludes that this approach did not produce the desired outcome and may therefore not be the most suitable.
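The k-means side of such an approach can be illustrated with a minimal sketch (the data, cluster count, and threshold below are illustrative assumptions, not the thesis's actual features or settings): fit k-means on presumed-normal log feature vectors, then score every point by its distance to the nearest centroid and flag the largest distances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for access-log feature vectors: a dense "normal" cluster
# plus a few injected outliers (hypothetical data, not the Ladok logs).
normal = rng.normal(0.0, 1.0, size=(200, 4))
outliers = rng.normal(8.0, 1.0, size=(5, 4))
X = np.vstack([normal, outliers])

def kmeans(data, k, iters=50, seed=0):
    """Plain k-means; returns the learned centroids."""
    r = np.random.default_rng(seed)
    centroids = data[r.choice(len(data), k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(data[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = data[labels == j].mean(axis=0)
    return centroids

# Fit on presumed-normal data, then score every point by its distance
# to the nearest centroid; large distances suggest anomalies.
centroids = kmeans(normal, k=3)
score = np.linalg.norm(X[:, None] - centroids[None], axis=2).min(axis=1)
threshold = np.quantile(score, 0.975)  # flag roughly the top 2.5%
flagged = np.where(score > threshold)[0]
```

With a distance score like this, SHAP (or a simpler per-feature decomposition of the distance) can then be applied to ask which features pushed a flagged point far from its nearest centroid.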
72

Combining Static Analysis and Dynamic Learning to Build Context Sensitive Models of Program Behavior

Liu, Zhen 10 December 2005 (has links)
This dissertation describes a family of models of program behavior, the Hybrid Push Down Automata (HPDA) that can be acquired using a combination of static analysis and dynamic learning in order to take advantage of the strengths of both. Static analysis is used to acquire a base model of all behavior defined in the binary source code. Dynamic learning from audit data is used to supplement the base model to provide a model that exactly follows the definition in the executable but that includes legal behavior determined at runtime. Our model is similar to the VPStatic model proposed by Feng, Giffin, et al., but with different assumptions and organization. Return address information extracted from the program call stack and system call information are used to build the model. Dynamic learning alone or a combination of static analysis and dynamic learning can be used to acquire the model. We have shown that a new dynamic learning algorithm based on the assumption of a single entry point and exit point for each function can yield models of increased generality and can help reduce the false positive rate. Previous approaches based on static analysis typically work only with statically linked programs. We have developed a new component-based model and learning algorithm that builds separate models for dynamic libraries used in a program allowing the models to be shared by different program models. Sharing of models reduces memory usage when several programs are monitored, promotes reuse of library models, and simplifies model maintenance when the system updates dynamic libraries. Experiments demonstrate that the prototype detection system built with the HPDA approach has a performance overhead of less than 6% and can be used with complex real-world applications. 
When compared to other detection systems based on analysis of operating system calls, the HPDA approach is shown to converge faster during learning, to detect attacks that escape other detection systems, and to have a lower false positive rate.
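The core detection idea, flagging system calls that never occurred in the learned model for the current call-stack context, can be caricatured in a few lines (a flat lookup table, not the actual pushdown automaton of the HPDA; all event names below are hypothetical):

```python
# Hypothetical training events: (call-stack context, system call).
# A real HPDA tracks pushdown state built from return addresses;
# this flat set of pairs is only a toy analogue of the learned model.
train = [
    (("main", "read_config"), "open"),
    (("main", "read_config"), "read"),
    (("main", "serve"), "accept"),
    (("main", "serve", "handle"), "write"),
]
model = set(train)  # "legal behaviour" learned from clean traces

def is_anomalous(event):
    # Flag any (stack, syscall) pair never seen during learning.
    return event not in model

print(is_anomalous((("main", "serve", "handle"), "write")))  # learned pair
print(is_anomalous((("main", "serve"), "execve")))           # novel pair
```

The stack context is what makes the model sensitive to attacks such as return-into-libc, where the system call itself is ordinary but the calling context is not.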
73

Unsupervised Anomaly Detection in Numerical Datasets

Joshi, Vineet 05 June 2015 (has links)
No description available.
74

DCLAD: DISTRIBUTED CLUSTER BASED LOCALIZATION ANOMALY DETECTION IN WIRELESS SENSOR NETWORKS USING SINGLE MOBILE BEACON

PALADUGU, KARTHIKA January 2007 (has links)
No description available.
75

Two new approaches in anomaly detection with field data from bridges both in construction and service stages

Zhang, Fan 12 October 2015 (has links)
No description available.
76

Probabilistic Model for Detecting Network Traffic Anomalies

Yellapragada, Ramani 30 June 2004 (has links)
No description available.
77

Time-based Approach to Intrusion Detection using Multiple Self-Organizing Maps

Sawant, Ankush 21 April 2005 (has links)
No description available.
78

Robust Bayesian Anomaly Detection Methods for Large Scale Sensor Systems

Merkes, Sierra Nicole 12 September 2022 (has links)
Sensor systems, such as modern wind tunnels, require continual monitoring to validate their quality, as corrupted data will increase both experimental downtime and budget and lead to inconclusive scientific and engineering results. One approach to validating sensor quality is monitoring the distribution of individual sensor measurements. However, in general settings, we do not know how correct measurements should be distributed for each sensor system. Instead of monitoring sensors individually, our approach relies on monitoring the covariation of the entire network of sensor measurements, both within and across sensor systems. That is, by monitoring how sensors behave relative to each other, we can detect anomalies expeditiously. Previous monitoring methodologies, such as those based on Principal Component Analysis, can be heavily influenced by extremely outlying sensor anomalies. We propose two Bayesian mixture model approaches that utilize heavy-tailed Cauchy assumptions. First, we propose a Robust Bayesian Regression, which utilizes a scale-mixture model to induce a Cauchy regression. Second, we extend elements of the Robust Bayesian Regression methodology using additive mixture models that decompose the anomalous and non-anomalous sensor readings into two parametric components. Specifically, we use a non-local, heavy-tailed Cauchy component for isolating the anomalous sensor readings, which we refer to as the Modified Cauchy Net. / Doctor of Philosophy / Sensor systems, such as modern wind tunnels, require continual monitoring to validate their quality, as corrupted data will increase both experimental downtime and budget and lead to inconclusive scientific and engineering results. One approach to validating sensor quality is monitoring the distribution of individual sensor measurements. However, in general settings, we do not know how correct measurements should be distributed for each sensor system.
Instead of monitoring sensors individually, our approach relies on monitoring the covariation of the entire network of sensor measurements, both within and across sensor systems. That is, by monitoring how sensors behave relative to each other, we can detect anomalies expeditiously. We propose two Bayesian monitoring approaches, the Robust Bayesian Regression and the Modified Cauchy Net, which provide flexible, tunable models for detecting anomalous sensors even when the historical data contain anomalous observations.
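A minimal sketch of why a Cauchy likelihood resists outliers (plain iteratively reweighted least squares with a fixed, assumed scale; the dissertation's actual method is a Bayesian scale-mixture model, not this point estimate): the Cauchy weight 1/(1 + (r/scale)^2) shrinks the influence of grossly corrupted readings toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.1, n)  # true intercept 2, slope 3
y[:5] += 50.0  # a few grossly corrupted sensor readings (hypothetical)

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start, pulled by outliers
scale = 0.1  # assumed known Cauchy scale

# IRLS for the Cauchy likelihood: each step is a weighted least squares
# solve where w_i = 1 / (1 + (r_i / scale)^2) downweights large residuals.
for _ in range(100):
    r = y - X @ beta
    w = 1.0 / (1.0 + (r / scale) ** 2)
    beta = np.linalg.solve((X.T * w) @ X, X.T @ (w * y))
```

Under a Gaussian likelihood the five corrupted points would drag the fitted line several units off; under the heavy-tailed Cauchy weighting the fit recovers the clean intercept and slope.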
79

The Cauchy-Net Mixture Model for Clustering with Anomalous Data

Slifko, Matthew D. 11 September 2019 (has links)
We live in the data explosion era. The unprecedented amount of data offers a potential wealth of knowledge but also brings about concerns regarding ethical collection and usage. Mistakes stemming from anomalous data have the potential for severe, real-world consequences, such as when building prediction models for housing prices. To combat anomalies, we develop the Cauchy-Net Mixture Model (CNMM). The CNMM is a flexible Bayesian nonparametric tool that employs a mixture between a Dirichlet Process Mixture Model (DPMM) and a Cauchy distributed component, which we call the Cauchy-Net (CN). Each portion of the model offers benefits, as the DPMM eliminates the limitation of requiring a fixed number of components and the CN captures observations that do not belong to the well-defined components by leveraging its heavy tails. By isolating the anomalous observations in a single component, we simultaneously identify the observations in the net as warranting further inspection and prevent them from interfering with the formation of the remaining components. The result is a framework that allows for simultaneously clustering observations and making predictions in the face of anomalous data. We demonstrate the usefulness of the CNMM in a variety of experimental situations and apply the model for predicting housing prices in Fairfax County, Virginia. / Doctor of Philosophy / We live in the data explosion era. The unprecedented amount of data offers a potential wealth of knowledge but also brings about concerns regarding ethical collection and usage. Mistakes stemming from anomalous data have the potential for severe, real-world consequences, such as when building prediction models for housing prices. To combat anomalies, we develop the Cauchy-Net Mixture Model (CNMM). The CNMM is a flexible tool for identifying and isolating the anomalies, while simultaneously discovering cluster structure and making predictions among the nonanomalous observations.
The result is a framework that allows for simultaneously clustering and predicting in the face of anomalous data. We demonstrate the usefulness of the CNMM in a variety of experimental situations and apply the model for predicting housing prices in Fairfax County, Virginia.
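A two-component caricature of the "net" idea (the real CNMM uses a Dirichlet Process mixture with inferred parameters; the mixture weight and scales below are assumed): each observation is assigned to whichever weighted density is higher, a Gaussian cluster or a heavy-tailed Cauchy net, so gross anomalies land in the net where they cannot distort the cluster.

```python
import numpy as np

rng = np.random.default_rng(2)
# 95 well-behaved observations plus 5 gross anomalies (hypothetical data).
data = np.concatenate([rng.normal(0.0, 1.0, 95),
                       np.array([25.0, -30.0, 40.0, 18.0, -22.0])])

def normal_pdf(v, mu=0.0, sd=1.0):
    return np.exp(-0.5 * ((v - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def cauchy_pdf(v, loc=0.0, scale=5.0):
    return 1.0 / (np.pi * scale * (1.0 + ((v - loc) / scale) ** 2))

pi_net = 0.05  # assumed prior weight on the heavy-tailed "net"
p_net = pi_net * cauchy_pdf(data)
p_main = (1.0 - pi_net) * normal_pdf(data)
in_net = p_net > p_main  # each point goes to its higher-density component
```

Because the Cauchy tails decay polynomially while the Gaussian tails decay exponentially, every sufficiently extreme observation is captured by the net even though the net's mixture weight is small.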
80

Characterization of Laminated Magnetoelectric Vector Magnetometers to Assess Feasibility for Multi-Axis Gradiometer Configurations

Berry, David 29 December 2010 (has links)
A wide array of applications exists for sensing systems capable of magnetic field detection. A broad range of sensors is already used in this capacity, but future sensors need to increase sensitivity while remaining economical. A promising sensor system to meet these requirements is that of magnetoelectric (ME) laminates. ME sensors produce an electric field when a magnetic field is applied. While this ME effect exists to a limited degree in single-phase materials, it is more easily achieved by laminating a magnetostrictive material, which deforms when exposed to a magnetic field, to a piezoelectric material. The transfer of strain from the magnetostrictive material to the piezoelectric material results in an electric field proportional to the applied magnetic field. Other fabrication techniques may impart the directionality needed to classify the ME sensor as a vector magnetometer. ME laminate sensors are more affordable to fabricate than competing vector magnetometers and, with recent increases in sensitivity, have potential for use in arrays and gradiometer configurations. However, little is known about their total field detection, the effects of multiple sensors in close proximity, and the signal processing needed for target localization. The goal of this project is to closely examine the single-axis ME sensor response in different orientations with a moving magnetic dipole to assess its field detection capabilities. Multiple sensors were tested together to determine whether the response characteristics are altered by the DC magnetic bias of ME sensors in close proximity. Finally, the ME sensor characteristics were compared to those of alternative vector magnetometers. / Master of Science
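The proportionality described above is commonly summarized by a linear ME voltage coefficient, and can be sketched with hypothetical numbers (the coefficient and geometry below are illustrative assumptions, not values from the thesis):

```python
# Back-of-the-envelope ME laminate response: the output voltage scales
# linearly with the applied field, V = alpha_ME * t * H, where alpha_ME
# is the ME voltage coefficient and t the piezoelectric layer thickness.
alpha_me = 1.2    # assumed ME voltage coefficient, V/(cm*Oe)
thickness = 0.1   # assumed piezoelectric layer thickness, cm
field = 1e-3      # applied AC magnetic field, Oe
v_out = alpha_me * thickness * field  # ~1.2e-4 V
print(v_out)
```

In a gradiometer configuration, two such sensors are differenced, so common-mode background fields cancel while a nearby dipole's spatially varying field does not.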
