71

Profile Monitoring for Mixed Model Data

Jensen, Willis Aaron 26 April 2006 (has links)
The initial portion of this research focuses on appropriate parameter estimators within a general context of multivariate quality control. The goal of Phase I analysis of multivariate quality control data is to identify multivariate outliers and step changes so that the estimated control limits are sufficiently accurate for Phase II monitoring. High breakdown estimation methods based on the minimum volume ellipsoid (MVE) or the minimum covariance determinant (MCD) are well suited to detecting multivariate outliers in data. Because of the inherent computational difficulties, many algorithms have been proposed to obtain these estimators. We consider the subsampling algorithm to obtain the MVE estimators and the FAST-MCD algorithm to obtain the MCD estimators. Previous studies have not clearly determined which of these two estimation methods is best for control chart applications. The comprehensive simulation study here gives guidance on when to use each estimator. Control limits are provided. High breakdown estimation methods such as MCD and MVE can be applied to a wide variety of multivariate quality control data. The final, lengthier portion of this research considers profile monitoring. Profile monitoring is a relatively new technique in quality control used when the product or process quality is best represented by a profile (or a curve) at each time period. The essential idea is often to model the profile via some parametric method and then monitor the estimated parameters over time to determine if there have been changes in the profiles. Because the estimated parameters may be correlated, it is convenient to monitor them using a multivariate control method such as the T-squared statistic. Previous modeling methods have not incorporated the correlation structure within the profiles. We propose the use of mixed models (both linear and nonlinear) to monitor linear and nonlinear profiles in order to account for the correlation structure within a profile. We consider various data scenarios and show, using simulation, when the mixed model approach is preferable to an approach that ignores the correlation structure. Our focus is on Phase I control chart applications. / Ph. D.
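To make the Phase I idea above concrete, the sketch below uses scikit-learn's MinCovDet (a FAST-MCD implementation) to robustly estimate the mean and covariance of per-profile parameter estimates and flag outlying profiles with a T²-like robust distance. The data, the number of profiles, and the chi-square control limit are illustrative assumptions; they are not the estimators, limits, or simulation design from the dissertation.

```python
# Generic Phase I outlier screening with the FAST-MCD estimator (scikit-learn's
# MinCovDet); illustrative only, not the dissertation's exact simulation design.
import numpy as np
from sklearn.covariance import MinCovDet
from scipy.stats import chi2

rng = np.random.default_rng(1)

# Suppose each row holds the estimated parameters of one profile
# (e.g., intercept and slope from a per-profile regression fit).
m, p = 50, 2                                   # 50 historical profiles, 2 parameters
params = rng.multivariate_normal([10.0, 2.0], [[1.0, 0.3], [0.3, 0.5]], size=m)
params[:3] += 4.0                              # a few outlying profiles (step change)

mcd = MinCovDet(random_state=0).fit(params)    # robust location and scatter
t2 = mcd.mahalanobis(params)                   # squared robust distances (T2-like)

ucl = chi2.ppf(0.999, df=p)                    # crude asymptotic limit; in practice
                                               # limits are calibrated by simulation
flagged = np.where(t2 > ucl)[0]
print("Flagged profiles:", flagged)
```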
72

The Monitoring of Linear Profiles and the Inertial Properties of Control Charts

Mahmoud, Mahmoud A. 17 November 2004 (has links)
The Phase I analysis of data when the quality of a process or product is characterized by a linear function is studied in this dissertation. It is assumed that each sample collected over time in the historical data set consists of several bivariate observations for which a simple linear regression model is appropriate, a situation common in calibration applications. Using a simulation study, the researcher compares the performance of some of the recommended approaches used to assess the stability of the process. Also in this dissertation, a method based on using indicator variables in a multiple regression model is proposed. This dissertation also proposes a change point approach based on the segmented regression technique for testing the constancy of the regression parameters in a linear profile data set. The performance of the proposed change point method is compared to that of the most effective Phase I linear profile control chart approaches using a simulation study. The advantage of the proposed change point method over the existing methods is greatly improved detection of sustained step changes in the process parameters. Any control chart that combines sample information over time, e.g., the cumulative sum (CUSUM) chart and the exponentially weighted moving average (EWMA) chart, has an ability to detect process changes that varies over time depending on the past data observed. The chart statistics can take values such that some shifts in the parameters of the underlying probability distribution of the quality characteristic are more difficult to detect. This is referred to as the "inertia problem" in the literature. This dissertation shows under realistic assumptions that the worst-case run length performance of control charts becomes as informative as the steady-state performance. Also this study proposes a simple new measure of the inertial properties of control charts, namely the signal resistance. The conclusions of this study support the recommendation that Shewhart limits should be used with EWMA charts, especially when the smoothing parameter is small. This study also shows that some charts proposed by Pignatiello and Runger (1990) and Domangue and Patch (1991) have serious disadvantages with respect to inertial properties. / Ph. D.
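As an illustration of the recommendation that Shewhart limits be paired with EWMA charts to counter the inertia problem, here is a minimal combined scheme: a small-lambda EWMA is supplemented with wide Shewhart limits on the individual observations so that a large shift cannot be masked by an EWMA statistic stranded on the wrong side of the target. The smoothing constant and limit widths are illustrative, not values taken from the dissertation.

```python
# Minimal EWMA chart with supplementary Shewhart limits, illustrating the kind of
# combined scheme recommended to guard against the inertia problem. All constants
# (lambda, limit widths) are illustrative, not values from the dissertation.
import numpy as np

def ewma_with_shewhart(x, mu0=0.0, sigma=1.0, lam=0.1, L_ewma=2.7, L_shewhart=3.5):
    """Return (index, signal type) for the first signal, or None."""
    z = mu0
    for i, xi in enumerate(x):
        z = lam * xi + (1 - lam) * z
        # asymptotic EWMA control limits
        se = sigma * np.sqrt(lam / (2 - lam))
        if abs(z - mu0) > L_ewma * se:
            return i, "EWMA signal"
        # supplementary Shewhart limits on the individual observation
        if abs(xi - mu0) > L_shewhart * sigma:
            return i, "Shewhart signal"
    return None

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 100), rng.normal(4, 1, 5)])  # large late shift
print(ewma_with_shewhart(data))
```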
73

Advancing Manufacturing Quality Control Capabilities Through The Use Of In-Line High-Density Dimensional Data

Wells, Lee Jay 15 January 2014 (has links)
Through recent advancements in high-density dimensional (HDD) measurement technologies, such as 3D laser scanners, data-sets consisting of an almost complete representation of a manufactured part's geometry can now be obtained. While HDD measurement devices have traditionally been used in reverse engineering applications, they are beginning to be applied as in-line measurement devices. Unfortunately, appropriate quality control (QC) techniques have yet to be developed to take full advantage of this new data-rich environment and for the most part rely on extracting discrete key product characteristics (KPCs) for analysis. Maximizing the potential of HDD measurement technologies requires a new quality paradigm. Specifically, when presented with HDD data, quality should not only be assessed by discrete KPCs but should consider the entire part being produced; anything less results in valuable data being wasted. This dissertation addresses the need for adapting current techniques and developing new approaches for the use of HDD data in manufacturing systems to increase overall QC capabilities. Specifically, this research effort focuses on the use of HDD data for 1) developing a framework for self-correcting compliant assembly systems, 2) using statistical process control to detect process shifts through part surfaces, and 3) performing automated part inspection for non-feature-based faults. The overarching goal of this research is to identify how HDD data can be used within these three research focus areas to increase QC capabilities while following the principles of the aforementioned new quality paradigm. / Ph. D.
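A toy sketch of the general idea of charting whole-surface information rather than a few discrete KPCs follows: each simulated scan is reduced to surface-level deviation summaries that are then monitored with ordinary 3-sigma limits. The simulated geometry, summaries, and limits are stand-in assumptions and do not reproduce the dissertation's methods.

```python
# Toy illustration of monitoring high-density dimensional data: each scanned part is
# reduced to whole-surface deviation summaries and charted. This is a simplified
# stand-in for the dissertation's methods, with made-up nominal geometry and limits.
import numpy as np

rng = np.random.default_rng(2)

def scan_part(bias=0.0, n_points=5000):
    """Simulate point-wise deviations (mm) of a scanned surface from its CAD nominal."""
    return rng.normal(bias, 0.02, n_points)

# Phase I: establish limits from in-control parts using whole-surface summaries
phase1 = np.array([[np.mean(d), np.std(d)] for d in (scan_part() for _ in range(30))])
center = phase1.mean(axis=0)
spread = phase1.std(axis=0, ddof=1)

# Phase II: flag parts whose surface-level summaries drift outside 3-sigma limits
for k in range(10):
    d = scan_part(bias=0.01 if k >= 7 else 0.0)   # small shift in the last parts
    stats = np.array([np.mean(d), np.std(d)])
    if np.any(np.abs(stats - center) > 3 * spread):
        print(f"Part {k}: out-of-control signal", stats)
```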
74

Surveillance of Poisson and Multinomial Processes

Ryan, Anne Garrett 18 April 2011 (has links)
As time passes, change occurs. With this change comes the need for surveillance. One may be a technician on an assembly line in need of a surveillance technique to monitor the number of defective components produced. On the other hand, one may be an administrator of a hospital in need of surveillance measures to monitor the number of patient falls in the hospital or to monitor surgical outcomes to detect changes in surgical failure rates. A natural choice for ongoing surveillance is the control chart; however, the chart must be constructed in a way that accommodates the situation at hand. Two scenarios involving attribute control charting are investigated here. The first scenario involves Poisson count data where the area of opportunity changes. A modified exponentially weighted moving average (EWMA) chart is proposed to accommodate the varying sample sizes. The performance of this method is compared with that of several competing control chart techniques, and recommendations are made regarding the best-performing control chart method. This research is a result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). The second scenario involves monitoring a process where items are classified into more than two categories and the results for these classifications are readily available. A multinomial cumulative sum (CUSUM) chart is proposed to monitor these types of situations. The multinomial CUSUM chart is evaluated through comparisons of performance with competing control chart methods. This research is a result of joint work with Mr. Lee J. Wells (Grado Department of Industrial and Systems Engineering, Virginia Tech) and Dr. William H. Woodall (Department of Statistics, Virginia Tech). / Ph. D.
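For the second scenario, a generic log-likelihood-ratio CUSUM for multinomial classifications can be sketched as below. The in-control and shifted category probabilities and the decision interval h are illustrative assumptions, not the designs evaluated in the dissertation.

```python
# Generic log-likelihood-ratio CUSUM for multinomial classifications, in the spirit of
# the multinomial CUSUM described above; the category probabilities p0 (in-control)
# and p1 (shifted) are illustrative assumptions, not values from the dissertation.
import numpy as np

p0 = np.array([0.90, 0.07, 0.03])      # e.g., good / minor defect / major defect
p1 = np.array([0.80, 0.12, 0.08])      # out-of-control distribution to detect
llr = np.log(p1 / p0)                  # per-category CUSUM increments

def multinomial_cusum(categories, h=4.0):
    """categories: iterable of category indices (0, 1, 2, ...). Signals when S > h."""
    s = 0.0
    for t, c in enumerate(categories):
        s = max(0.0, s + llr[c])
        if s > h:
            return t
    return None

rng = np.random.default_rng(3)
in_control = rng.choice(3, size=200, p=p0)
shifted = rng.choice(3, size=100, p=p1)
print("signal at observation:", multinomial_cusum(np.concatenate([in_control, shifted])))
```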
75

Improving Post-Disaster Recovery: Decision Support for Debris Disposal Operations

Fetter, Gary 07 May 2010 (has links)
Disaster debris cleanup operations are commonly organized into two phases. During the first phase, the objective is to clear debris from evacuation and other important pathways to ensure access to the disaster-affected area. Practically, Phase 1 activities largely consist of pushing fallen trees, vehicles, and other debris blocking streets and highways to the curb. These activities begin immediately once the disaster has passed, with the goal of completion usually within 24 to 72 hours. In Phase 2 of debris removal, which is the focus of this study, completion can take months or years. Activities in this phase include organizing and managing curbside debris collection, reduction, recycling, and disposal operations (FEMA 2007). This dissertation research investigates methods for improving post-disaster debris cleanup operations—one of the most important and costly aspects of the least researched area of disaster operations management (Altay and Green 2006). The first objective is to identify the unique nature of the disaster debris cleanup problem and the important decisions faced by disaster debris coordinators. The second goal is to present three research projects that develop methods for assisting disaster management coordinators with debris cleanup operations. In the first project, which is the topic of Chapter 3, a facility location model is developed for addressing the problem of opening temporary disposal and storage reduction facilities, which are needed to ensure efficient and effective cleanup operations. In the second project, which is the topic of Chapter 4, a multiple objective mixed-integer linear programming model is developed to address the problem of assigning debris cleanup resources across the disaster-affected area at the onset of debris cleanup operations. The third project and the focus of Chapter 5 addresses the problem of equitably controlling ongoing cleanup operations in real-time. A self-balancing CUSUM statistical process control chart is developed to assist disaster management coordinators with equitably allocating cleanup resources as information becomes available in real-time. All of the models in this dissertation are evaluated using data from debris cleanup operations in Chesapeake, Virginia, completed after Hurricane Isabel in 2003. / Ph. D.
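The flavor of the Chapter 3 facility-location decision can be sketched as a small mixed-integer program (here with the PuLP library): binary variables open temporary disposal sites and continuous variables assign debris tonnage to them. The sites, zones, costs, and capacities are invented for illustration; the dissertation's formulation is considerably richer.

```python
# Toy facility-location model for opening temporary debris storage/reduction sites,
# sketched with PuLP. Sites, zones, costs, and capacities are made up; the
# dissertation's actual formulation is richer (multiple objectives, real data).
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

sites = ["S1", "S2", "S3"]                 # candidate temporary disposal sites
zones = ["Z1", "Z2", "Z3", "Z4"]           # debris-generating zones
open_cost = {"S1": 100, "S2": 80, "S3": 120}
haul = {("Z1", "S1"): 4, ("Z1", "S2"): 6, ("Z1", "S3"): 9,   # $/ton hauled
        ("Z2", "S1"): 5, ("Z2", "S2"): 4, ("Z2", "S3"): 8,
        ("Z3", "S1"): 7, ("Z3", "S2"): 3, ("Z3", "S3"): 6,
        ("Z4", "S1"): 9, ("Z4", "S2"): 7, ("Z4", "S3"): 2}
debris = {"Z1": 50, "Z2": 70, "Z3": 40, "Z4": 60}            # tons per zone
capacity = {"S1": 120, "S2": 90, "S3": 150}

prob = LpProblem("debris_site_location", LpMinimize)
y = {s: LpVariable(f"open_{s}", cat=LpBinary) for s in sites}
x = {(z, s): LpVariable(f"x_{z}_{s}", lowBound=0) for z in zones for s in sites}

prob += lpSum(open_cost[s] * y[s] for s in sites) + \
        lpSum(haul[z, s] * x[z, s] for z in zones for s in sites)
for z in zones:                                        # all debris must be hauled
    prob += lpSum(x[z, s] for s in sites) == debris[z]
for s in sites:                                        # only open sites receive debris
    prob += lpSum(x[z, s] for z in zones) <= capacity[s] * y[s]

prob.solve(PULP_CBC_CMD(msg=False))
print({s: int(y[s].value()) for s in sites})
```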
76

Dynamic Probability Control Limits for Risk-Adjusted Bernoulli Cumulative Sum Charts

Zhang, Xiang 12 December 2015 (has links)
The risk-adjusted Bernoulli cumulative sum (CUSUM) chart developed by Steiner et al. (2000) is an increasingly popular tool for monitoring clinical and surgical performance. In practice, however, use of a fixed control limit for the chart leads to quite variable in-control average run length (ARL) performance for patient populations with different risk score distributions. To overcome this problem, simulation-based dynamic probability control limits (DPCLs) are determined patient-by-patient for the risk-adjusted Bernoulli CUSUM chart in this study. By maintaining the probability of a false alarm at a constant level conditional on no false alarm for previous observations, the risk-adjusted CUSUM charts with DPCLs have consistent in-control performance at the desired level, with approximately geometrically distributed run lengths. Simulation results demonstrate that the proposed method does not rely on any information or assumptions about the patients' risk distributions. The use of DPCLs for risk-adjusted Bernoulli CUSUM charts allows each chart to be designed for the particular sequence of patients for a surgeon or hospital. The effect of estimation error on the performance of the risk-adjusted Bernoulli CUSUM chart with DPCLs is also examined. Our simulation results show that the in-control performance of the risk-adjusted Bernoulli CUSUM chart with DPCLs is affected by estimation error. The most influential factors are the specified desired in-control average run length, the Phase I sample size, and the overall adverse event rate. However, the effect of estimation error is uniformly smaller for the risk-adjusted Bernoulli CUSUM chart with DPCLs than for the corresponding chart with a constant control limit under various realistic scenarios. In addition, there is a substantial reduction in the standard deviation of the in-control run length when DPCLs are used. Therefore, use of DPCLs has yet another advantage when designing a risk-adjusted Bernoulli CUSUM chart. This research is the result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech). Moreover, DPCLs are adapted to design the risk-adjusted CUSUM charts for multiresponses developed by Tang et al. (2015). It is shown that the in-control performance of the charts with DPCLs can be controlled for different patient populations because these limits are determined for each specific sequence of patients. Thus, the risk-adjusted CUSUM chart for multiresponses with DPCLs is more practical and should be applied to effectively monitor surgical performance by hospitals and healthcare practitioners. This research is the result of joint work with Dr. William H. Woodall (Department of Statistics, Virginia Tech) and Mr. Justin Loda (Department of Statistics, Virginia Tech). / Ph. D.
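The risk-adjusted Bernoulli CUSUM score of Steiner et al. (2000) can be sketched as follows; the control limit here is a crude fixed value calibrated by simulation for one patient sequence, which only hints at the DPCL idea of holding the conditional false-alarm probability constant patient by patient. The odds ratio, risk distribution, and simulation settings are illustrative assumptions.

```python
# Risk-adjusted Bernoulli CUSUM score of the Steiner et al. (2000) form, with a
# rough simulation-based control limit for one fixed patient sequence. The dynamic,
# patient-by-patient DPCL algorithm is more involved; this only sketches the idea.
import numpy as np

def ra_cusum(outcomes, risks, odds_ratio=2.0):
    """Upper CUSUM for detecting an odds-ratio increase in adverse outcomes."""
    s, path = 0.0, []
    for y, p in zip(outcomes, risks):
        if y == 1:
            w = np.log(odds_ratio / (1 - p + odds_ratio * p))
        else:
            w = np.log(1.0 / (1 - p + odds_ratio * p))
        s = max(0.0, s + w)
        path.append(s)
    return np.array(path)

rng = np.random.default_rng(4)
risks = rng.beta(2, 8, size=300)                  # hypothetical pre-operative risks

# crude fixed limit calibrated by simulation under in-control behaviour;
# DPCLs instead fix the conditional false-alarm probability at every patient
sims = [ra_cusum(rng.binomial(1, risks), risks).max() for _ in range(500)]
h = np.quantile(sims, 0.95)

observed = rng.binomial(1, np.minimum(1.0, 2 * risks))   # degraded performance
path = ra_cusum(observed, risks)
# note: argmax returns 0 if the limit is never crossed
print("limit:", round(h, 2), "first signal at patient:", int(np.argmax(path > h)))
```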
77

Advancing the Utility of Manufacturing Data for Modeling, Monitoring, and Securing Machining Processes

Shafae, Mohammed Saeed Abuelmakarm 23 August 2018 (has links)
The growing adoption of smart manufacturing systems and their related technologies (e.g., embedded sensing, internet-of-things, cyber-physical systems, big data analytics, and cloud computing) is promising a paradigm shift in the manufacturing industry. Such systems enable extracting and exchanging actionable knowledge across the different entities of the manufacturing cyber-physical system and beyond. From a quality control perspective, this allows for more opportunities to realize proactive product design; real-time process monitoring, diagnosis, prognosis, and control; and better product quality characterization. However, a multitude of challenges arise with the growing adoption of smart manufacturing, including industrial data characterized by increasing volume, velocity, variety, and veracity, as well as the security of the manufacturing system in the presence of growing connectivity. Taking advantage of these emerging opportunities and tackling the upcoming challenges require creating novel quality control and data analytics methods, which not only push the boundaries of the current state-of-the-art research but also discover new ways to analyze and utilize the data. One of the key pillars of smart manufacturing systems is real-time automated process monitoring, diagnosis, and control methods for process/product anomalies. For machining applications, deterioration in quality measures traditionally occurs due to a variety of assignable causes of variation, such as poor cutting tool replacement decisions and inappropriate choice of cutting parameters. Additionally, due to increased connectivity in modern manufacturing systems, process/product anomalies intentionally induced through malicious cyber-attacks -- aiming at degrading the process performance and/or the part quality -- are becoming a growing concern in the manufacturing industry. Current methods for detecting and diagnosing traditional causes of anomalies are primarily lab-based and require experts to perform initial set-ups and continual fine-tuning, reducing their applicability in industrial shop-floor settings. As for efforts accounting for process/product anomalies due to cyber-attacks, these efforts are in their early stages. Therefore, more foundational research is needed to develop a clear understanding of this new type of cyber-attack and its effects on machining processes, to ensure smart manufacturing security on both the cyber and the physical levels. With a primary focus on machining processes, the overarching goal of this dissertation work is to explore new ways to expand the use and value of manufacturing data-driven methods for better applicability in industrial shop-floors and increased security of smart manufacturing systems. As a first step toward achieving this goal, the work in this dissertation focuses on three distinct areas of interest: (1) Statistical Process Monitoring of Time-Between-Events Data (e.g., failure-time data); (2) Defending against Product-Oriented Cyber-Physical Attacks on Intelligent Machining Systems; and (3) Modeling Machining Process Data: Time Series vs. Spatial Point Cloud Data Structures. / PHD / Recent advancements in embedded sensing, internet-of-things, big data analytics, cloud computing, and communication technologies and methodologies are shifting the modern manufacturing industry toward a novel operational paradigm.
Several terms have been coined to refer to this new paradigm, such as cybermanufacturing, industry 4.0, industrial internet of things, industrial internet, or, more generically, smart manufacturing (the term used henceforth). The overarching goal of smart manufacturing is to transform modern manufacturing systems into knowledge-enabled Cyber-Physical Systems (CPS), in which humans, machines, equipment, and products communicate and cooperate in real-time to make decentralized decisions, resulting in profound improvements across the entire manufacturing ecosystem. From a quality control perspective, this allows for more opportunities to utilize manufacturing process data to realize proactive product design; real-time process monitoring, diagnosis, prognosis, and control; and better product quality characterization. With a primary focus on machining processes, the overarching goal of this work is to explore new ways to expand the use and value of manufacturing data-driven methods for better applicability in industrial shop-floors and increased security of smart manufacturing systems. As a first step toward achieving this goal, the work in this dissertation focuses on three distinct areas of interest: (1) monitoring of time-between-events data for mechanical component replacements (e.g., failure-time data); (2) defending against cyber-physical attacks on intelligent machining systems that aim at degrading machined part quality; and (3) modeling machining process data using two distinct data structures, namely, time series and spatial point cloud data.
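For the first area (time-between-events monitoring), a generic exponential CUSUM illustrates the kind of chart involved: it accumulates log-likelihood-ratio evidence that the mean time between events (e.g., between unplanned tool replacements) has dropped. The in-control and shifted means and the decision interval are illustrative assumptions, not designs from the dissertation.

```python
# Generic exponential CUSUM for time-between-events data (e.g., times between
# unplanned tool replacements), tuned to detect a drop in the mean time between
# events. Parameter values are illustrative, not those studied in the dissertation.
import numpy as np

def tbe_cusum(times, theta0=100.0, theta1=50.0, h=5.0):
    """Signal when the mean time between events appears to drop from theta0 to theta1."""
    s = 0.0
    for i, t in enumerate(times):
        # log-likelihood ratio of Exp(theta1) vs. Exp(theta0) for one gap t
        llr = np.log(theta0 / theta1) + t * (1 / theta0 - 1 / theta1)
        s = max(0.0, s + llr)
        if s > h:
            return i
    return None

rng = np.random.default_rng(5)
times = np.concatenate([rng.exponential(100.0, 40),    # in control
                        rng.exponential(40.0, 20)])    # deteriorated process
print("signal at event:", tbe_cusum(times))
```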
78

Quality Control Tools for Cyber-Physical Security of Production Systems

Elhabashy, Ahmed Essam 15 January 2019 (has links)
With recent advancements in computer and network technologies, cyber-physical systems have become more susceptible to cyber-attacks, and production systems are no exception. Unlike attacks on traditional Information Technology (IT) systems, attacks on cyber-physical systems are not limited to Intellectual Property (IP) theft; they also include attacks that maliciously affect the physical world. In manufacturing, such cyber-physical attacks can destroy equipment, force dimensional product changes, alter a product's mechanical characteristics, or endanger human lives. The manufacturing industry often relies on modern Quality Control (QC) tools to protect against quality losses, such as those that can occur from an attack. However, cyber-physical attacks can still be designed to avoid detection by traditional QC methods, which suggests a strong need for new and more robust QC tools. Such new tools should be able to prevent, or at least minimize, the effects of cyber-physical attacks on production systems. Unfortunately, little to no research has been done on using QC tools for cyber-physical security of production systems. Hence, the overarching goal of this work is to allow QC systems to be designed and used effectively as a second line of defense when traditional cyber-security techniques fail and the production system is already breached. To this end, this work focuses on: 1) understanding the role of QC systems in cyber-physical attacks within manufacturing by developing a taxonomy encompassing the different layers involved; 2) identifying existing weaknesses in QC tools and exploring the effects of exploiting them through cyber-physical attacks; and 3) proposing more effective QC tools that overcome existing weaknesses by introducing randomness into the tools, for better security against cyber-physical attacks in manufacturing. / Ph. D. / The recent technological developments in computers and networking have made systems, such as production systems, more vulnerable to attacks having both cyber and physical components, i.e., to cyber-physical attacks. In manufacturing, such attacks are not only capable of stealing valuable information but can also destroy equipment, force physical product changes, alter a product's mechanical characteristics, or endanger human lives. Typically, the manufacturing industry has relied on various Quality Control (QC) tools, such as product inspection, to detect the effects caused by these attacks. However, these attacks could still be designed to avoid detection by traditional QC methods, which suggests a need for new and more effective QC tools. Such new tools should be able to prevent, or at least minimize, the effects of these attacks in manufacturing. Unfortunately, almost no research has been done on using QC tools for securing production systems against these malicious attacks. Hence, the overarching goal of this work is to allow QC systems to be designed in a more effective manner to act as a second line of defense when traditional cyber-security measures fail and attackers have already accessed the production system. To this end, this work focuses on: 1) understanding the role of QC systems during an attack; 2) identifying existing weaknesses in QC tools and determining the effects of exploiting them in an attack; and 3) proposing more effective QC tools for better protection against these types of cyber-physical attacks in manufacturing.
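One way to picture the idea of introducing randomness into QC tools is a toy inspection plan in which the subset of dimensions measured on each part is drawn at random, so an attacker cannot know in advance which alterations will go unmeasured. The part features, tolerances, and subset size below are invented for illustration and are not the randomized tools proposed in the dissertation.

```python
# Toy illustration of adding unpredictability to an inspection plan: on each part a
# random subset of dimensions is measured, so an attacker cannot know in advance
# which alterations will escape inspection. Purely illustrative of the general idea.
import numpy as np

rng = np.random.default_rng(6)
dimensions = ["length", "width", "hole_diameter", "flatness", "thickness"]
nominal = {"length": 120.0, "width": 60.0, "hole_diameter": 10.0,
           "flatness": 0.0, "thickness": 8.0}
tol = {"length": 0.2, "width": 0.2, "hole_diameter": 0.05,
       "flatness": 0.05, "thickness": 0.1}

def inspect(part_measurements, k=2):
    """Measure a randomly chosen subset of k dimensions; return any out of tolerance."""
    chosen = rng.choice(dimensions, size=k, replace=False)
    return [d for d in chosen if abs(part_measurements[d] - nominal[d]) > tol[d]]

# A maliciously altered part: only the hole diameter was changed by an attack.
# Some inspections miss it, but repeated random draws eventually catch it.
attacked_part = dict(nominal, hole_diameter=10.3)
for part_id in range(5):
    print(part_id, inspect(attacked_part))
```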
79

Contributions to Statistical Control of Multiple Stream Processes

Barbosa, Laura Franca Marques 11 July 2008 (has links)
Processes with several streams in parallel are very common in industry. Filling operations, such as the ones found in the pharmaceutical, cosmetics, or food and beverage industries, are a typical example. The classical scheme for the statistical control of multiple-stream processes (MSP) is the group chart (Boyd, 1950). Its efficiency is impaired because its underlying model of the process does not consider that part of the variation in an MSP is common to all streams. In 1995, Mortell and Runger (M&R) proposed an alternative scheme which takes this fact into account. The following year, Runger, Alt, and Montgomery proposed another scheme. This dissertation proposes a third scheme for the statistical control of MSPs. The detailed mathematical model, the expressions for establishing the control limits, and the performance analysis presented here are original contributions. The probabilities of a signal and the average run lengths in the case of shifts in the mean of one individual stream were obtained either analytically or by simulation and compared with those of M&R's scheme. The results show the superiority of the proposed scheme for signaling shifts greater than or equal to one standard deviation. For smaller shifts, neither scheme can be said to be really efficient. As for the scheme proposed by Runger et al. (1996), it is in some cases slower and in other cases just as fast as M&R's, so the proposed scheme is the fastest of all. A number of extensions and open issues are indicated for future research.
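The Mortell and Runger style of multiple-stream monitoring can be sketched with two simple charts: one on the average across streams, which tracks the variation common to all streams, and one on the range between streams, which reacts to a shift in a single stream. The simulated process, number of streams, and naive 3-sigma limits below are illustrative assumptions, not the scheme or the calibrated limits derived in the dissertation.

```python
# Rough sketch of a Mortell & Runger (1995)-style scheme for a multiple-stream
# process: one chart on the average across streams (common-cause variation) and one
# on the range between stream values (individual-stream shifts). Limits are naive
# 3-sigma values from Phase I data, not the calibrated limits derived in the text.
import numpy as np

rng = np.random.default_rng(7)
n_streams, n_samples = 6, 60

# common component shared by all streams + individual component per stream
common = rng.normal(0, 0.5, n_samples)
data = common[:, None] + rng.normal(0, 0.3, (n_samples, n_streams))
data[40:, 2] += 1.0                      # shift in the individual mean of stream 2

overall = data.mean(axis=1)                    # chart 1: average across streams
ranges = data.max(axis=1) - data.min(axis=1)   # chart 2: between-stream range

phase1 = slice(0, 30)
for name, stat in [("overall mean", overall), ("between-stream range", ranges)]:
    center, sd = stat[phase1].mean(), stat[phase1].std(ddof=1)
    signals = np.where(np.abs(stat - center) > 3 * sd)[0]
    print(name, "signals at samples:", signals[signals >= 30])
```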
80

Variable Selection Applied to Multivariate Statistical Process Control of Batch Processes

Peres, Fernanda Araujo Pimentel January 2018 (has links)
This dissertation presents propositions for the use of variable selection in the improvement of multivariate statistical process control (MSPC) of batch processes, in order to contribute to the enhancement of industrial process quality. There are six objectives: (i) identify the limitations of MSPC methods in industrial process monitoring; (ii) understand how variable selection methods are used to improve the monitoring of high-dimensional processes; (iii) discuss methods for the alignment and synchronization of batches with different durations; (iv) define the most adequate alignment and synchronization method for batch data treatment, aiming to improve Phase I of process monitoring; (v) propose variable selection for classification prior to establishing multivariate control charts (MCC) based on principal component analysis (PCA) to monitor a batch process; and (vi) validate the fault detection performance of the proposed MCC in comparison with traditional and PCA-based charts. The performance of the proposed method was evaluated in a case study using real data from an industrial food process. Results showed that performing variable selection prior to establishing the MCC contributed to efficiently reducing the number of variables and to overcoming limitations found in fault detection when high-dimensional datasets are monitored. 
We conclude that, by enabling control charts widely used in industry to accommodate high-dimensional datasets, the proposed method adds innovation to the area of batch process monitoring and contributes to the generation of products with high quality standards.
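A generic sketch of the proposed sequence, variable selection for classification followed by a PCA-based control chart, is given below: a random-forest importance filter picks a subset of process variables, PCA is fit to in-control Phase I data, and Hotelling's T² on the retained scores is monitored. The selection method, simulated data, and limits are illustrative assumptions, not the thesis's procedure.

```python
# Generic sketch of variable selection followed by a PCA-based T2 chart: an
# importance filter picks a subset of process variables, PCA is fit to in-control
# (Phase I) data, and Hotelling's T2 on the retained scores is monitored. The
# selection method, data, and limits are illustrative, not the thesis's procedure.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from scipy.stats import f as f_dist

rng = np.random.default_rng(8)
n, p = 200, 50
X_ok = rng.normal(0, 1, (n, p))                              # in-control batches
X_fault = rng.normal(0, 1, (60, p)); X_fault[:, :5] += 1.5   # fault in 5 variables

# 1) variable selection for classification (normal vs. faulty historical batches)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(np.vstack([X_ok, X_fault]), np.r_[np.zeros(n), np.ones(60)])
keep = np.argsort(rf.feature_importances_)[-10:]             # retain 10 variables

# 2) PCA-based T2 chart built on the selected variables (Phase I: in-control data)
k = 3
pca = PCA(n_components=k).fit(X_ok[:, keep])

def t2(X):
    scores = pca.transform(X[:, keep])
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(0.99, k, n - k)
X_new = rng.normal(0, 1, (100, p))                           # fresh in-control batches
print("in-control alarm rate:", np.mean(t2(X_new) > ucl))
print("fault alarm rate:     ", np.mean(t2(X_fault) > ucl))
```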
