561

A study of quality improvement tools and total quality management (TQM) frameworks for small and medium size manufacturing companies

Jamil, Kabir January 1998 (has links)
No description available.
562

An interactive computer model for economical quality control applicable to a wide range of process distributions

Barlas, Yaman January 1980 (has links)
No description available.
563

Cellular Requirements for Phenylalanyl-tRNA Synthetase Quality Control

Reynolds, Noah Martin Wiersma 19 October 2011 (has links)
No description available.
564

A contingency approach to service reliability and service customization : their relationship and role in customer evaluations

Gupta, Kunal January 2003 (has links)
No description available.
565

Bayesian Optimization for Engineering Design and Quality Control of Manufacturing Systems

AlBahar, Areej Ahmad 14 April 2022 (has links)
Manufacturing systems are usually nonlinear, nonstationary, highly corrupted with outliers, and oftentimes constrained by physical laws. Modeling and approximation of their underlying response surface functions are extremely challenging. Bayesian optimization is a statistical tool, based on Bayes' rule, used to optimize and model these expensive-to-evaluate functions. Bayesian optimization comprises two important components: a surrogate model, often the Gaussian process, and an acquisition function, often the expected improvement. The Gaussian process, known for its outstanding modeling and uncertainty quantification capabilities, is used to represent the underlying response surface function, while the expected improvement is used to select the next point to be evaluated by trading off exploitation and exploration. Although Bayesian optimization has been extensively used in optimizing unknown and expensive-to-evaluate functions and in hyperparameter tuning of deep learning models, modeling highly outlier-corrupted, nonstationary, and stress-induced response surface functions hinders the use of conventional Bayesian optimization models in manufacturing systems. To overcome these limitations, we propose a series of systematic methodologies to improve Bayesian optimization for engineering design and quality control of manufacturing systems. Specifically, the contributions of this dissertation can be summarized as follows.
1. A novel asymmetric robust kernel function, called AEN-RBF, is proposed to model highly outlier-corrupted functions. Two new hyperparameters are introduced to improve the flexibility and robustness of the Gaussian process model.
2. A nonstationary surrogate model that utilizes deep multi-layer Gaussian processes, called MGP-CBO, is developed to improve the modeling of complex anisotropic constrained nonstationary functions.
3. A Stress-Aware Optimal Actuator Placement framework is designed to model and optimize stress-induced nonlinear constrained functions.
Through extensive evaluations, the proposed methodologies have shown outstanding and significant improvements when compared to state-of-the-art models. Although these proposed methodologies have been applied to certain manufacturing systems, they can be easily adapted to a broad range of other problems. / Doctor of Philosophy / Modeling advanced manufacturing systems, such as engineering design and quality monitoring and control, is extremely challenging. The underlying response surface functions of these manufacturing systems are often nonlinear, nonstationary, and expensive to evaluate. Bayesian optimization, a statistical modeling approach based on Bayes' rule, is used to represent and model those complex (i.e., black-box) objective functions. A Bayesian optimization model consists of a surrogate model, often the Gaussian process, and an acquisition function, often the expected improvement. Conventional Bayesian optimization models do not accurately represent non-stationary and outlier-corrupted functions. To overcome these limitations, we propose a new asymmetric robust kernel function to improve the modeling capabilities of the Gaussian process model in process quality control through improved defect detection and classification. We also propose a non-stationary surrogate model to improve the performance of Bayesian optimization in aerospace process design problems. Finally, we develop a new optimization framework that correctly models and optimizes stress-induced constrained aerospace manufacturing systems. Our extensive experiments show significant improvements of these three proposed models over state-of-the-art methodologies.
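The abstract above describes the two standard ingredients of Bayesian optimization, a Gaussian-process surrogate and an expected-improvement acquisition function. The sketch below illustrates that baseline loop on a hypothetical one-dimensional objective; it does not implement the dissertation's AEN-RBF kernel, MGP-CBO surrogate, or actuator-placement framework, and the objective, bounds, and evaluation budget are assumptions for illustration.

```python
# Minimal Bayesian optimization loop: Gaussian-process surrogate + expected
# improvement. Illustrative only; not the dissertation's proposed methods.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):
    # Hypothetical expensive-to-evaluate process response (1-D for clarity).
    return np.sin(3 * x) + 0.1 * (x - 2.0) ** 2

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """EI for minimization: trades off exploitation (low mean) and exploration (high sigma)."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    imp = y_best - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
bounds = (0.0, 5.0)
X = rng.uniform(*bounds, size=(4, 1))            # small initial design
y = objective(X).ravel()

for _ in range(20):                              # sequential evaluations
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    X_cand = np.linspace(*bounds, 500).reshape(-1, 1)
    ei = expected_improvement(X_cand, gp, y.min())
    x_next = X_cand[np.argmax(ei)].reshape(1, -1)  # next point to evaluate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best y:", y.min())
```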
566

The application of statistical quality control to the centrifugal casting of iron pipe

Whaley, Paul Arthur January 1947 (has links)
M.S.
567

A general purpose machine vision prototyper for investigating the inspection of planar webs

Ng, Chong Teck 24 October 2005 (has links)
In order for an industrial inspection system to be of utility in manufacturing, it must be fast, accurate, and flexible [Chin 1986]. Current machine vision systems are very specialized and inflexible in nature. A reason for the inflexibility of current machine vision systems is the need for real-time processing of image data. Such a need has forced both the use of very specialized image processing hardware and the use of rather simple, very specialized computer vision algorithms to do the analysis. On the other hand, most, if not all, of today's computer vision methods are not general purpose in nature. In the absence of truly robust general purpose methods, developing satisfactory machine vision solutions will continue to involve experimenting with machine vision hardware and software components. Given the current state of machine vision technology, it would seem that the best method for creating flexible machine vision systems is to define a subclass of inspection problems where all the problems within the subclass share a number of common features. Such a subclass must be of interest to a number of manufacturers. It must also be "reasonable" to solve, given the current state of the art. Once the subclass has been selected, the next logical step is to create a device that makes it easy to perform all the needed experiments on the various problems within the class. Based on the above line of reasoning, this work has four major objectives. The first objective is to define a meaningful subclass of inspection problems that a) are of interest to a number of manufacturers, and b) represent inspection tasks that seem "reasonable" within the current state of the art of computer vision. The subclass of inspection problems selected for this work is the longitudinal planar web inspection problem under the two-dimensional imaging restriction. The second objective of this work is to create a vehicle that facilitates the types of experimentation usually associated with the development of machine vision systems. This vehicle is called a "machine vision prototyper." The third objective of this work is to use the machine vision prototyper system to attack a particular planar web application problem. The application considered is the problem of locating and identifying surface defects in surfaced hardwood lumber in a species-independent manner. The fourth objective of this research is to indicate how the prototyper system can be used to attack a second planar web application problem: the inspection of hardwood parts coming out of a molder. The utility of the machine vision prototyper system as an experimental tool is demonstrated on two of the three possible types of longitudinal planar web inspection problems. The results include the development of a machine vision system for a hardwood surfaced lumber surface feature detection problem, and a discussion of how the prototyper can be used to attack the problem of inspecting hardwood parts coming out of a molder. / Ph. D.
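As an illustration of the kind of "rather simple, very specialized" vision routine the abstract mentions for planar-web inspection, the sketch below locates dark surface blemishes in a grayscale frame by thresholding and connected-component labeling. It is not the machine vision prototyper described in the thesis; the thresholds and the synthetic image are assumptions for illustration.

```python
# Simple defect locator for a planar web image: threshold, then label blobs.
# Illustrative only; not the thesis prototyper or its algorithms.
import numpy as np
from scipy import ndimage

def find_surface_defects(gray, dark_thresh=60, min_area=25):
    """Locate dark surface defects in a grayscale web image (intensities 0-255)."""
    mask = gray < dark_thresh                      # candidate defect pixels
    labels, n_regions = ndimage.label(mask)        # connected components
    defects = []
    for i, region in enumerate(ndimage.find_objects(labels), start=1):
        area = int((labels[region] == i).sum())    # pixels belonging to this blob
        if area >= min_area:                       # ignore speckle noise
            rows, cols = region
            defects.append({"row": rows.start, "col": cols.start,
                            "height": rows.stop - rows.start,
                            "width": cols.stop - cols.start,
                            "area": area})
    return defects

# Synthetic "web" frame: a bright board with one dark, knot-like blemish.
frame = np.full((200, 400), 180, dtype=np.uint8)
frame[90:110, 250:280] = 30
print(find_surface_defects(frame))
```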
568

A Performance Analysis of the Minimax Multivariate Quality Control Chart

Rehmert, Ian Jon 18 December 1997 (has links)
A performance analysis of three different Minimax control charts is performed with respect to their Chi-Square control chart counterparts under several different conditions. A unique control chart must be constructed for each process described by a unique combination of quality characteristic mean vector and associated covariance matrix. The three charts under consideration differ in the number of quality characteristic variables of concern. In each case, without loss of generality, the in-control quality characteristic mean vector is assumed to have zero entries and the associated covariance matrix is assumed to have non-negative entries. The performance of the Chi-Square and Minimax charts is compared under different values of the sample size, the probability of a Type I error, and selected shifts in the quality characteristic mean vector. The Minimax and Chi-Square charts that are compared share identical in-control average run lengths (ARL), making the out-of-control ARL the appropriate performance measure. A combined Tausworthe pseudorandom number generator is used to generate the out-of-control mean vectors. Issues regarding multivariate uniform pseudorandom number generation are addressed. / Master of Science
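For context, the sketch below shows the standard Chi-Square (known-covariance) multivariate chart that serves as the benchmark in the abstract, together with a simulation-based estimate of the out-of-control ARL used as the performance measure. The Minimax chart itself is not reproduced; the sample size, Type I error probability, covariance matrix, and mean shift are assumptions for illustration.

```python
# Chi-Square multivariate control chart with a Monte Carlo ARL estimate.
# Illustrative benchmark only; not the thesis's Minimax chart.
import numpy as np
from scipy.stats import chi2

def chi_square_statistic(xbar, mu0, sigma_inv, n):
    """T = n * (xbar - mu0)' Sigma^{-1} (xbar - mu0), plotted against a chi-square UCL."""
    d = xbar - mu0
    return n * d @ sigma_inv @ d

def estimate_arl(mu_true, mu0, sigma, n=5, alpha=0.005, reps=1000, seed=0):
    """Average number of subgroups until the chart signals, for a given true mean."""
    p = len(mu0)
    ucl = chi2.ppf(1 - alpha, df=p)              # in-control ARL = 1 / alpha
    sigma_inv = np.linalg.inv(sigma)
    rng = np.random.default_rng(seed)
    run_lengths = []
    for _ in range(reps):
        t = 0
        while True:
            t += 1
            xbar = rng.multivariate_normal(mu_true, sigma / n)  # subgroup mean
            if chi_square_statistic(xbar, mu0, sigma_inv, n) > ucl:
                run_lengths.append(t)
                break
    return float(np.mean(run_lengths))

mu0 = np.zeros(2)
sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
print("in-control ARL ~", estimate_arl(mu0, mu0, sigma))
print("shifted ARL    ~", estimate_arl(np.array([0.5, 0.5]), mu0, sigma))
```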
569

Developing a quality improvement taxonomy

Pang, Eva Y. 12 March 2009 (has links)
Total Quality Management (TQM) has become a popular term in quality improvement management. Many organizations, however, frequently implement quality tools that are not well coordinated with the established quality principles or the managerial decisions and actions. This research focuses on studying three major quality management components: quality philosophies, interventions, and tools. The primary desired outcome of this research is to improve the understanding of TQM implementation. The means to accomplish this desired outcome included reviewing the quality improvement philosophies of Deming, Juran, Feigenbaum, Crosby, and Ishikawa, and conducting case studies of eight organizations. The organizations vary in organizational type (service or manufacturing), years of experience in their TQM efforts, and size (number of employees). The case studies involved interviewing the quality managers and described how they define and implement TQM. A quality improvement taxonomy, a two-dimensional matrix, is a product developed as a result of this study. The first dimension of the taxonomy describes the quality interventions--the planned organizational changes for improving quality, which are categorized by six quality checkpoints: management of upstream systems, incoming quality assurance, in-process quality management, outgoing quality assurance, proactive assurance of customer satisfaction, and the overall quality management process. The second dimension lists seventeen supporting quality management tools. These include tools such as Input/Output Analysis, Quality Function Deployment, Competitive Benchmarking, and Statistical Process Control. Organizations can use this quality improvement taxonomy to communicate the TQM concept and to improve coordination of quality management tools with the overall TQM implementation decisions and actions. / Master of Science
570

A comparison of alternative methods to the Shewhart-type control chart

Hall, Deborah A. 08 September 2012 (has links)
A control chart that simultaneously tracks the mean and variance of a normally distributed variable with no compensation effect is defined in this work. This joint control chart is compared to five other charts: an X̄ chart, an s² chart, a Reynolds and Ghosh chart, a Repko process capability plot, and a t-statistic chart. The criterion for comparison is the probability of a Type II sampling error. Several out-of-control cases are examined. In the case of Repko, an equation is defined to compute the Type II error probability. The results indicate that the Reynolds and Ghosh statistic is powerful for cases when the variance shifts out of control. The X̄ chart is powerful when the mean shifts with moderate changes in the variance. The joint chart is powerful for moderate changes in the mean and variance. / Master of Science
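The comparison criterion in this abstract is the Type II error probability. The sketch below computes that probability for the familiar Shewhart-type X̄ and s² charts under an assumed shift in the mean and variance; the joint, Reynolds and Ghosh, Repko, and t-statistic charts from the thesis are not reproduced, and the subgroup size, alpha, and shifts are assumptions for illustration.

```python
# Type II error (probability of no signal) for Shewhart-type X-bar and s^2 charts.
# Illustrative only; the thesis's joint chart and other comparators are not shown.
import numpy as np
from scipy.stats import norm, chi2

def beta_xbar(mu0, sigma0, mu1, sigma1, n=5, alpha=0.0027):
    """P(no signal) for the X-bar chart when the process is at (mu1, sigma1)."""
    z = norm.ppf(1 - alpha / 2)
    ucl = mu0 + z * sigma0 / np.sqrt(n)
    lcl = mu0 - z * sigma0 / np.sqrt(n)
    scale = sigma1 / np.sqrt(n)                  # true std. dev. of the subgroup mean
    return norm.cdf((ucl - mu1) / scale) - norm.cdf((lcl - mu1) / scale)

def beta_s2(sigma0, sigma1, n=5, alpha=0.0027):
    """P(no signal) for the s^2 chart when the true standard deviation is sigma1."""
    df = n - 1
    ucl = sigma0**2 * chi2.ppf(1 - alpha / 2, df) / df
    lcl = sigma0**2 * chi2.ppf(alpha / 2, df) / df
    return chi2.cdf(df * ucl / sigma1**2, df) - chi2.cdf(df * lcl / sigma1**2, df)

# Example: the mean shifts by one sigma while the standard deviation grows by 50%.
print("X-bar chart Type II error:", round(beta_xbar(0.0, 1.0, 1.0, 1.5), 3))
print("s^2  chart Type II error:", round(beta_s2(1.0, 1.5), 3))
```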
