1

Advancing the Utility of Manufacturing Data for Modeling, Monitoring, and Securing Machining Processes

Shafae, Mohammed Saeed Abuelmakarm 23 August 2018 (has links)
The growing adoption of smart manufacturing systems and their related technologies (e.g., embedded sensing, internet-of-things, cyber-physical systems, big data analytics, and cloud computing) is promising a paradigm shift in the manufacturing industry. Such systems enable extracting and exchanging actionable knowledge across the different entities of the manufacturing cyber-physical system and beyond. From a quality control perspective, this allows for more opportunities to realize proactive product design; real-time process monitoring, diagnosis, prognosis, and control; and better product quality characterization. However, a multitude of challenges is arising with the growing adoption of smart manufacturing, including industrial data characterized by increasing volume, velocity, variety, and veracity, as well as the security of the manufacturing system in the presence of growing connectivity. Taking advantage of these emerging opportunities and tackling the upcoming challenges require creating novel quality control and data analytics methods, which not only push the boundaries of current state-of-the-art research but also discover new ways to analyze and utilize the data. One of the key pillars of smart manufacturing systems is real-time automated process monitoring, diagnosis, and control methods for process/product anomalies. In machining applications, deterioration in quality measures has traditionally occurred due to a variety of assignable causes of variation, such as poor cutting-tool replacement decisions and inappropriate choices of cutting parameters. Additionally, due to increased connectivity in modern manufacturing systems, process/product anomalies intentionally induced through malicious cyber-attacks, aimed at degrading process performance and/or part quality, are becoming a growing concern in the manufacturing industry. Current methods for detecting and diagnosing traditional causes of anomalies are primarily lab-based and require experts to perform initial set-ups and continual fine-tuning, reducing their applicability on industrial shop floors. Efforts accounting for process/product anomalies due to cyber-attacks, meanwhile, are still in their early stages. Therefore, more foundational research is needed to develop a clear understanding of this new type of cyber-attack and its effects on machining processes, to ensure smart manufacturing security at both the cyber and physical levels. With a primary focus on machining processes, the overarching goal of this dissertation is to explore new ways to expand the use and value of methods driven by manufacturing data, for better applicability on industrial shop floors and increased security of smart manufacturing systems. As a first step toward achieving this goal, the work in this dissertation pursues it in three distinct areas of interest: (1) Statistical Process Monitoring of Time-Between-Events Data (e.g., failure-time data); (2) Defending against Product-Oriented Cyber-Physical Attacks on Intelligent Machining Systems; and (3) Modeling Machining Process Data: Time Series vs. Spatial Point Cloud Data Structures. / PHD / Recent advancements in embedded sensing, internet-of-things, big data analytics, cloud computing, and communication technologies are shifting the modern manufacturing industry toward a novel operational paradigm.
Several terms have been coined to refer to this new paradigm, such as cybermanufacturing, Industry 4.0, the industrial internet of things, the industrial internet, or, more generically, smart manufacturing (the term used henceforth). The overarching goal of smart manufacturing is to transform modern manufacturing systems into knowledge-enabled Cyber-Physical Systems (CPS), in which humans, machines, equipment, and products communicate and cooperate in real time to make decentralized decisions, resulting in profound improvements in the entire manufacturing ecosystem. From a quality control perspective, this allows for more opportunities to utilize manufacturing process data to realize proactive product design; real-time process monitoring, diagnosis, prognosis, and control; and better product quality characterization. With a primary focus on machining processes, the overarching goal of this work is to explore new ways to expand the use and value of methods driven by manufacturing data, for better applicability on industrial shop floors and increased security of smart manufacturing systems. As a first step toward achieving this goal, the work in this dissertation focuses on three distinct areas of interest: (1) monitoring time-between-events data for mechanical component replacements (e.g., failure-time data); (2) defending against cyber-physical attacks on intelligent machining systems that aim to degrade machined part quality; and (3) modeling machining process data using two distinct data structures, namely time series and spatial point cloud data.
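The first of these areas, monitoring time-between-events (TBE) data, is commonly approached with control charts built on a failure-time distribution. Below is a minimal sketch of the general idea, assuming exponentially distributed times between failures with a known in-control mean; the method shown and the parameter values are illustrative, not the dissertation's own.

import math

# Illustrative one-sided TBE chart for exponentially distributed
# failure times. The in-control mean time between failures (mtbf_0)
# and false-alarm rate (alpha) are hypothetical values.

def tbe_lower_limit(mtbf_0: float, alpha: float = 0.0027) -> float:
    """Alpha-quantile of an Exponential(mean=mtbf_0) distribution.

    A time between failures below this limit suggests the failure
    rate has increased, i.e., events are arriving too quickly.
    """
    return -mtbf_0 * math.log(1.0 - alpha)

def monitor(times_between_events, mtbf_0: float, alpha: float = 0.0027):
    """Yield (index, value) for every observation that signals."""
    lcl = tbe_lower_limit(mtbf_0, alpha)
    for i, t in enumerate(times_between_events):
        if t < lcl:
            yield i, t

# Example: in-control MTBF of 100 hours; the fourth observation signals.
print(list(monitor([120.5, 95.0, 210.3, 0.2, 88.7], mtbf_0=100.0)))
# -> [(3, 0.2)]

A one-sided lower limit is used because unusually short times between failures indicate an increased failure rate, which is typically the change of interest in failure-time monitoring.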
2

Quality Control Tools for Cyber-Physical Security of Production Systems

Elhabashy, Ahmed Essam 15 January 2019 (has links)
With recent advancements in computer and network technologies, cyber-physical systems have become more susceptible to cyber-attacks, and production systems are no exception. Unlike traditional Information Technology (IT) systems, cyber-physical systems are not limited to attacks aimed at Intellectual Property (IP) theft, but also face attacks that maliciously affect the physical world. In manufacturing, such cyber-physical attacks can destroy equipment, force dimensional product changes, alter a product's mechanical characteristics, or endanger human lives. The manufacturing industry often relies on modern Quality Control (QC) tools to protect against quality losses, such as those that can occur from an attack. However, cyber-physical attacks can still be designed to avoid detection by traditional QC methods, which suggests a strong need for new and more robust QC tools. Such new tools should be able to prevent, or at least minimize, the effects of cyber-physical attacks on production systems. Unfortunately, little to no research has been done on using QC tools for the cyber-physical security of production systems. Hence, the overarching goal of this work is to allow QC systems to be designed and used effectively as a second line of defense, when traditional cyber-security techniques fail and the production system is already breached. To this end, this work focuses on: 1) understanding the role of QC systems in cyber-physical attacks within manufacturing by developing a taxonomy encompassing the different layers involved; 2) identifying existing weaknesses in QC tools and exploring the effects of their exploitation by cyber-physical attacks; and 3) proposing more effective QC tools that overcome existing weaknesses by introducing randomness into the tools, for better security against cyber-physical attacks in manufacturing. / Ph. D. / Recent technological developments in computers and networking have made systems such as production systems more vulnerable to attacks having both cyber and physical components, i.e., to cyber-physical attacks. In manufacturing, such attacks are not only capable of stealing valuable information, but can also destroy equipment, force physical product changes, alter a product's mechanical characteristics, or endanger human lives. Typically, the manufacturing industry has relied on various Quality Control (QC) tools, such as product inspection, to detect the effects caused by these attacks. However, these attacks could still be designed to avoid detection by traditional QC methods, which suggests a need for new and more effective QC tools. Such new tools should be able to prevent, or at least minimize, the effects of these attacks in manufacturing. Unfortunately, almost no research has been done on using QC tools for securing production systems against these malicious attacks. Hence, the overarching goal of this work is to allow QC systems to be designed in a more effective manner to act as a second line of defense, when traditional cyber-security measures have failed and attackers have already accessed the production system. To this end, this work focuses on: 1) understanding the role of QC systems during such attacks; 2) identifying existing weaknesses in QC tools and determining the effects of their exploitation; and 3) proposing more effective QC tools, for better protection against these types of cyber-physical attacks in manufacturing.
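The third focus area, introducing randomness into QC tools, can be illustrated with a short sketch: if the parts to inspect and the quality characteristics to measure are drawn at random rather than following a fixed schedule, an attacker who has breached the system cannot reliably design an attack to evade inspection. The plan below is a hypothetical illustration of that idea, not the dissertation's proposed tool; the feature names and sampling rates are made up.

import random

# Hypothetical randomized inspection plan: which parts are inspected,
# and which quality characteristics are measured on each, are chosen
# at random so an attacker cannot predict them.

FEATURES = ["diameter", "length", "surface_roughness", "concentricity"]

def random_inspection_plan(part_ids, sample_rate=0.2,
                           features_per_part=2, seed=None):
    """Return {part_id: [features to measure]} for a random sample."""
    rng = random.Random(seed)  # seeded here only for reproducibility
    plan = {}
    for pid in part_ids:
        if rng.random() < sample_rate:
            plan[pid] = rng.sample(FEATURES, features_per_part)
    return plan

# Roughly 20% of a 50-part batch is inspected, on 2 random features each.
print(random_inspection_plan(range(50), seed=7))

In practice, the random draws would need to come from a source the attacker cannot observe or predict, and the detection power gained from unpredictability would have to be balanced against inspection cost.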
3

Data Analytics for Statistical Learning

Komolafe, Tomilayo A. 05 February 2019 (has links)
The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. Big data is a widely used term without a clear definition. The difference between big data and traditional data can be characterized by four Vs: velocity (the speed at which data is generated), volume (the amount of data generated), variety (the data can take on different forms), and veracity (the data may be of poor or unknown quality). As many industries begin to recognize the value of big data, organizations try to capture it through means such as side-channel data in a manufacturing operation, unstructured text data reported by healthcare personnel, demographic information of households from census surveys, and the range of communication data that defines communities and social networks. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; and the pre-processed data is then analyzed using statistical tools. In this stage, called statistical learning of the data, analysts have two main objectives: (1) to develop a statistical model that captures the behavior of the process from a sample of the data, and (2) to identify anomalies in the process. However, several open challenges still exist in this framework for big data analytics. Recently, data types such as free-text data are also being captured. Although many established processing techniques exist for other data types, free-text data comes from a wide range of individuals and is subject to syntax, grammar, language, and colloquialisms that require substantially different processing approaches. Once the data is processed, open challenges still exist in the statistical learning step of understanding the data. Statistical learning aims to satisfy two objectives: (1) to develop a model that highlights general patterns in the data, and (2) to create a signaling mechanism that identifies whether outliers are present in the data. Statistical modeling is widely utilized, as researchers have created a variety of statistical models to explain everyday phenomena such as energy usage behavior, traffic patterns, and stock market behavior, among others. However, new applications of big data with increasingly varied designs present interesting challenges. Consider the example of free-text analysis posed above. There is renewed interest in modeling free-text narratives from sources such as online reviews, customer complaints, or patient safety event reports into intuitive themes or topics. As previously mentioned, documents describing the same phenomena can vary widely in their word usage and structure. Another recent area of interest in statistical learning is using the environmental conditions in which people live, work, and grow to infer their quality of life. It is well established that social factors play a role in overall health outcomes; however, the clinical application of these social determinants of health is a recent and open problem. These are just a few of many examples wherein new applications of big data pose complex challenges requiring thoughtful and inventive approaches to processing, analyzing, and modeling data. Although a large body of research exists in the area of anomaly detection, increasingly complicated data sources (such as side-channel data or network-based data) present equally convoluted challenges.
For effective anomaly detection, analysts define parameters and rules so that, when large collections of raw data are aggregated, pieces of data that do not conform are easily noticed and flagged. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare, manufacturing, and social-networking industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:
• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
  o I address the research area of statistical modeling in two ways:
    - There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
    - In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing the relationships between the social determinants of health of a cohort and general medical risk factors.
  o I address the research area of anomaly detection in two ways:
    - A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for practitioners. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
    - Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks. I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process.
/ PHD / The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. The fields of manufacturing and healthcare are two examples of industries currently undergoing significant transformations due to the rise of big data. The addition of large sensory systems is changing how parts are manufactured and inspected, and the prevalence of Health Information Technology (HIT) systems is changing the way healthcare services are delivered. These industries are turning to big data analytics in the hope of acquiring many of the benefits other sectors are experiencing, including reduced cost, improved safety, and boosted productivity. However, many challenges exist along the big data analytics framework, from pre-processing raw data, to statistical modeling of the data, to identifying anomalies present in the data or process. This work offers significant contributions in each of the aforementioned areas and includes practical real-world applications. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; and the pre-processed data is then analyzed using statistical tools.
In this stage, called 'statistical learning of the data', analysts have two main objectives: (1) to develop a statistical model that captures the behavior of the process from a sample of the data, and (2) to identify anomalies or outliers in the process. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare and manufacturing industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:
• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
  o I address the research area of statistical modeling in two ways:
    - There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
    - In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing the relationships between the social determinants of health of a cohort and general medical risk factors.
  o I address the research area of anomaly detection in two ways:
    - A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for practitioners. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
    - Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks. I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process.
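The rule-based anomaly flagging described above, in which analysts define parameters so that non-conforming data points stand out once the data is aggregated, can be made concrete with a short sketch. The three-sigma rule below is a generic illustration, not the network-based or side-channel method developed in the dissertation, and the sensor readings are fabricated.

from statistics import mean, stdev

# Generic rule-based anomaly flagging: estimate a baseline from known
# in-control data, then flag observations outside mean +/- k*sigma.

def fit_rule(baseline, k=3.0):
    """Return (lower, upper) limits from in-control baseline data."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def flag_anomalies(stream, lo, hi):
    """Return (index, value) pairs for observations outside the limits."""
    return [(i, x) for i, x in enumerate(stream) if not lo <= x <= hi]

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
lo, hi = fit_rule(baseline)
print(flag_anomalies([10.0, 10.2, 13.5, 9.9], lo, hi))  # [(2, 13.5)]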
