1

The Role of Dominant Cause in Variation Reduction through Robust Parameter Design

Asilahijani, Hossein 24 April 2008 (has links)
Reducing variation in key product features is an important goal in process improvement. Finding and trying to control the cause(s) of variation is one way to reduce variability, but this is not cost-effective, or even possible, in some situations. In such cases, Robust Parameter Design (RPD) is an alternative. The goal in RPD is to reduce variation by reducing the sensitivity of the process to the sources of variation, rather than by controlling these sources directly. That is, the goal is to find levels of the control inputs that minimize the output variation imposed on the process via the noise variables (causes). In the literature, a variety of experimental plans have been proposed for RPD, including Robustness, Desensitization and Taguchi's method. In this thesis, the efficiency of the alternative plans is compared in the situation where the most important source of variation, called the "Dominant Cause", is known. It is shown that desensitization is the most appropriate approach for applying the RPD method to an existing process.
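To make the desensitization idea concrete, here is a standard textbook formulation rather than anything taken from the thesis itself. Suppose a single control input $x$ and a single noise variable $z$ (the dominant cause) act on the output through

$$ y = \beta_0 + \beta_1 x + (\gamma + \delta x)\,z + \varepsilon, \qquad \operatorname{Var}(z) = \sigma_z^2, $$

so that

$$ \operatorname{Var}(y \mid x) = (\gamma + \delta x)^2 \sigma_z^2 + \sigma_\varepsilon^2. $$

Setting $x = -\gamma/\delta$ removes the variation transmitted by $z$ entirely; this control-by-noise interaction is exactly the leverage that knowing the dominant cause gives a desensitization experiment.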
2

Robust Parameter Design for Automatically Controlled Systems and Nanostructure Synthesis

Dasgupta, Tirthankar 25 June 2007 (has links)
This research develops comprehensive frameworks for robust parameter design methodology for dynamic systems with automatic control and for the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, so an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantities and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of the process variables around set values, are derived from the fitted models using Monte Carlo simulations. The second part of the research deals with the development of an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for synthesis of nanowires. SMED is a novel approach to generating sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
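As a rough illustration of the Monte Carlo step described in this abstract, the sketch below perturbs two process variables around candidate set points and averages the predicted probability of a target morphology under a multinomial-logit model. The coefficient matrix, variable names and noise level are hypothetical placeholders, not values from the thesis.

```python
# A minimal sketch, not Dasgupta's actual model: evaluate robustness of a
# fitted multinomial-logit morphology model by Monte Carlo perturbation of
# the process variables around candidate set points.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fitted coefficients: rows = morphology classes (e.g. nanowire,
# nanoribbon, no growth), columns = [intercept, temperature, pressure].
B = np.array([[ 1.0,  0.8, -0.5],
              [ 0.2, -0.3,  0.6],
              [-0.5, -0.4, -0.2]])

def class_probs(x):
    """Multinomial-logit probabilities at process setting x = (temp, pressure)."""
    eta = B @ np.concatenate(([1.0], x))  # linear predictors
    p = np.exp(eta - eta.max())           # numerically stabilized softmax
    return p / p.sum()

def robust_prob(x_set, sigma=0.05, n=10_000):
    """Mean probability of the target morphology (class 0) when the process
    variables fluctuate around their set values with standard deviation sigma."""
    xs = x_set + sigma * rng.standard_normal((n, 2))
    return np.mean([class_probs(x)[0] for x in xs])

# Compare two candidate settings: the robust optimum can differ from the
# nominal optimum when the probability surface is steep near the nominal peak.
for x in ([0.9, -0.4], [0.6, -0.2]):
    print(x, round(robust_prob(np.array(x)), 3))
```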
3

CONFIDENCE REGIONS FOR OPTIMAL CONTROLLABLE VARIABLES FOR THE ROBUST PARAMETER DESIGN PROBLEM

Cheng, Aili January 2012 (has links)
In robust parameter design it is often possible to set the levels of the controllable factors to produce a zero gradient for the transmission of variability from the noise variables. If the number of control variables is greater than the number of noise variables, a continuum of zero-gradient solutions exists. This situation is useful, as it provides the experimenter with multiple conditions under which to configure a zero gradient for noise-variable transmission. However, it requires a confidence region for the multiple-solution factor levels that provides proper simultaneous coverage, a requirement that has not previously been recognized in the literature. For this case, we show how to construct the critical values needed to maintain the simultaneous coverage rate. Two examples are provided to demonstrate the practical need to adjust the critical values for simultaneous coverage. The zero-gradient confidence region focuses only on the variance, and there are indeed many situations in which focus is, or could be, placed entirely on the process variance. For the situation where both mean and variance need to be considered, a general confidence region in the control variables is developed by minimizing a weighted mean squared error. This general method is applicable to many situations, including mixture experiments, which have an inherent constraint on the control factors. It also gives the user the flexibility to put different weights on the mean and variance parts for simultaneous optimization. It turns out that the same computational algorithm can be used to compute the dual confidence region in both the control factors and the response variable. / Statistics
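In our notation (the thesis's may differ), the weighted criterion behind the general confidence region can be written as

$$ \min_{x \in \mathcal{X}} \; w \left( \hat{\mu}(x) - T \right)^2 + (1 - w)\, \hat{\sigma}^2(x), \qquad 0 \le w \le 1, $$

where $\hat{\mu}$ and $\hat{\sigma}^2$ are the fitted process mean and variance, $T$ is the target, and $w$ is the user-chosen weight; $w = 0$ recovers the variance-only, zero-gradient setting discussed first.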
4

Semiparametric Techniques for Response Surface Methodology

Pickle, Stephanie M. 14 September 2006 (has links)
Many industrial statisticians employ the techniques of Response Surface Methodology (RSM) to study and optimize products and processes. A second-order Taylor series approximation is commonly used to model the data; however, parametric models are not always adequate, and any degree of model misspecification may result in serious bias of the estimated response. Nonparametric methods have been suggested as an alternative, as they can capture structure in the data that a misspecified parametric model cannot. Yet nonparametric fits may be highly variable, especially in the small-sample settings common in RSM. Therefore, semiparametric regression techniques are proposed for use in the RSM setting. These methods are applied to an elementary RSM problem as well as to the robust parameter design problem. / Ph. D.
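One common semiparametric device of the kind this abstract refers to is model-robust regression, which combines the two fits (notation ours; the thesis may use a variant):

$$ \hat{y}_{\mathrm{SP}}(x) = \hat{y}_{\mathrm{par}}(x) + \lambda\, \hat{r}_{\mathrm{np}}(x), \qquad \lambda \in [0, 1], $$

where $\hat{y}_{\mathrm{par}}$ is the second-order parametric fit, $\hat{r}_{\mathrm{np}}$ is a nonparametric (e.g. kernel) fit to the parametric residuals, and $\lambda$ trades the bias of a misspecified parametric model against the variance of the nonparametric correction.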
5

Contributions to variable selection for mean modeling and variance modeling in computer experiments

Adiga, Nagesh 17 January 2012 (has links)
This thesis consists of two parts. The first part reviews Variable Search, a variable selection procedure for mean modeling. The second part deals with variance modeling for robust parameter design in computer experiments. In the first chapter, the Variable Search (VS) technique developed by Shainin (1988) is reviewed. VS has received considerable attention from experimenters in industry. It uses the experimenter's knowledge about the process, in terms of good and bad settings and their importance. In this technique, a few experiments are conducted first at the best and worst settings of the variables to ascertain that they are indeed different from each other. Experiments are then conducted sequentially in two stages, namely swapping and capping, to determine the significance of the variables one at a time. Finally, after all the significant variables have been identified, the model is fit and the best settings are determined. The VS technique has not previously been analyzed thoroughly. In this thesis, we analyze each stage of the method mathematically. Each stage is formulated as a hypothesis test, and its performance is expressed in terms of the model parameters. The performance of the VS technique as a whole is expressed as a function of the performances of the individual stages. On this basis, its performance can be compared with that of traditional techniques. The second and third chapters deal with variance modeling for robust parameter design in computer experiments. Computer experiments based on engineering models may be used to explore process behavior when physical experiments (e.g., fabrication of nanoparticles) are costly or time-consuming. Robust parameter design (RPD) is a key technique for improving process repeatability, but the absence of replicates in computer experiments (e.g., space-filling designs) makes locating an RPD solution challenging. Recently, there have been studies (e.g., Bates et al. (2005), Chen et al. (2006), Dellino et al. (2010, 2011), Giovagnoli and Romano (2008)) of RPD issues in computer experiments. The transmitted variance model (TVM) proposed by Shoemaker and Tsui (1993) for physical experiments can also be applied in computer simulations. The approaches above rely heavily on the estimated mean model because they obtain expressions for the variance directly from mean models or use them to generate replicates; variance modeling based on some form of replicates relies on the estimated mean model to a lesser extent. To the best of our knowledge, there is no rigorous research on the variance modeling needed for RPD in computer experiments. We develop procedures for identifying variance models. First, we explore procedures for choosing groups of pseudo-replicates for variance modeling; a formal variance change-point procedure is developed to rigorously determine the replicate groups. Next, the variance model is identified and estimated through a three-step variable selection procedure. Properties of the proposed method are investigated under various conditions through analytical and empirical studies; in particular, the impact of correlated responses on performance is discussed.
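The sketch below illustrates one plausible reading of the pseudo-replicate idea, though it is not Adiga's actual procedure: cluster nearby points of an unreplicated space-filling design into groups, treat each group as pseudo-replicates, and regress the log sample variances on the factor levels. The design, response function and group sizes are simulated placeholders.

```python
# A minimal sketch, not Adiga's procedure: form pseudo-replicate groups from an
# unreplicated space-filling design by clustering nearby points, then regress
# log sample variances on the factor levels. Data are simulated.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)

X = rng.uniform(0, 1, size=(80, 2))   # stand-in for a space-filling design
# Simulated response whose noise standard deviation grows with x2.
y = np.sin(3 * X[:, 0]) + rng.normal(0, 0.1 + 0.4 * X[:, 1], size=80)

# Step 1: cluster design points into groups of near-neighbours; each group is
# treated as a set of pseudo-replicates.
_, labels = kmeans2(X, k=16, minit="points", seed=1)
groups = [np.where(labels == g)[0] for g in range(16)]
groups = [g for g in groups if len(g) >= 3]   # need >= 3 points for a variance

# Step 2: regress log sample variances on the group-mean factor levels
# (ordinary least squares with an intercept).
logvar = np.array([np.log(np.var(y[g], ddof=1)) for g in groups])
Z = np.column_stack([np.ones(len(groups))] +
                    [np.array([X[g, j].mean() for g in groups]) for j in range(2)])
beta, *_ = np.linalg.lstsq(Z, logvar, rcond=None)
print("log-variance model coefficients:", beta.round(2))  # x2 should carry the trend
```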
6

Resource Modeling and Allocation in Competitive Systems

An, Na 05 April 2005 (has links)
This thesis includes three self-contained projects. In the first project, "Bidding strategies and their impact on the auctioneer's revenue in combinatorial auctions," we propose a simple and efficient model for evaluating the value of any bundle given limited information, design bidding strategies that efficiently select desirable bundles, and evaluate the performance of different bundling strategies under various market settings. In the second project, "Retailer shelf-space management with promotion effects," promotional investment effects are integrated with retail store assortment decisions and shelf-space allocation. An optimization model for category shelf-space allocation incorporating promotion effects is presented. Based on this model, a category shelf-space allocation framework with trade allowances is developed, in which a multi-player retailer-Stackelberg game models the interactions between the retailer and the manufacturers. In the third project, "Supply-chain oriented robust parameter design," we introduce game-theoretic methods, commonly used in supply-chain analysis, to resolve potential conflicts between manufacturers at various stages. These manufacturing-chain partners collaboratively decide the parameter design settings of the controllable factors to make the product less sensitive to process variations.
7

Mixture-process Variable Design Experiments with Control and Noise Variables Within a Split-plot Structure

January 2010 (has links)
In mixture-process variable experiments, the number of runs is commonly greater than in mixture-only or process-variable experiments, because the parameters for the mixture components, the process variables, and their interactions must all be estimated. In some of these experiments there are variables that are hard to change or cannot be controlled under normal operating conditions, which often prohibits complete randomization of the experimental runs for practical and economic reasons. Furthermore, the process variables can be categorized into two types: variables that are controllable and directly affect the response, and variables that are uncontrollable and primarily affect the variability of the response. The uncontrollable variables are called noise factors and are assumed controllable in a laboratory environment for the purpose of conducting experiments. A model containing both noise variables and control factors can be used to determine settings of the control factors that make the response "robust" to the variability transmitted from the noise factors. Such experiments can be analyzed with a model for the mean response and a model for the slope of the response within a split-plot structure. When constructing the experimental designs, low prediction variances for the mean and slope models are desirable. Methods for mixture-process variable designs with noise variables under restricted randomization are demonstrated, and several mixture-process variable designs that are robust to the coefficients of interaction with the noise variables are evaluated using fraction of design space plots with respect to their prediction variance properties. Finally, a G-optimal design, which minimizes the maximum prediction variance over the entire design region, is created using a genetic algorithm. / Dissertation/Thesis / Ph.D. Industrial Engineering 2010
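For reference, the prediction variance quantities invoked here are standard (our notation): for a model with regressor vector $f(x)$ and design matrix $X$ on $N$ runs, the scaled prediction variance is

$$ \mathrm{SPV}(x) = N\, f(x)^{\top} \left( X^{\top} X \right)^{-1} f(x), $$

and a G-optimal design $D^{*}$ minimizes its worst case over the design region $\mathcal{R}$:

$$ D^{*} = \arg\min_{D}\; \max_{x \in \mathcal{R}} \mathrm{SPV}(x), $$

which is exactly the criterion the genetic algorithm searches for in the final step.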
8

Contributions to quality improvement methodologies and computer experiments

Tan, Matthias H. Y. 16 September 2013 (has links)
This dissertation presents novel methodologies for five problem areas in modern quality improvement and computer experiments: selective assembly, robust design with computer experiments, multivariate quality control, model selection for split-plot experiments, and construction of minimax designs. Selective assembly has traditionally been used to achieve tight specifications on the clearance of two mating parts. Chapter 1 proposes generalizations of the selective assembly method to assemblies with any number of components and any assembly response function, called generalized selective assembly (GSA). Two variants of GSA are considered: direct selective assembly (DSA) and fixed bin selective assembly (FBSA). In DSA and FBSA, the problem of matching a batch of N components of each type to give N assemblies that minimize quality cost is formulated as an axial multi-index assignment problem and a transportation problem, respectively. Realistic examples are given to show that GSA can significantly improve the quality of assemblies. Chapter 2 proposes methods for robust design optimization with time-consuming computer simulations. Gaussian process models are widely employed for modeling responses as a function of control and noise factors in computer experiments. In these experiments, robust design optimization is often based on the average quadratic loss computed as if the posterior mean were the true response function, which can give misleading results. We propose optimization criteria derived by taking the expectation of the average quadratic loss with respect to the posterior predictive process, and methods based on the Lugannani-Rice saddlepoint approximation for constructing accurate credible intervals for the average loss. These quantities allow response surface uncertainty to be taken into account in the optimization. Chapter 3 proposes a Bayesian method for identifying mean shifts in multivariate normally distributed quality characteristics. Multivariate quality characteristics are often monitored using a few summary statistics; however, to determine the causes of an out-of-control signal, information about which means shifted, and in which directions, is often needed. We propose a Bayesian approach that gives this information. For each mean, an indicator variable is introduced that indicates whether the mean shifted upwards, shifted downwards, or remained unchanged. Default prior distributions are proposed. Mean shift identification is based on the modes of the posterior distributions of the indicators, which are determined via Gibbs sampling. Chapter 4 proposes a Bayesian method for model selection in fractionated split-plot experiments. We employ a Bayesian hierarchical model that takes into account the split-plot error structure. Expressions are derived for computing the posterior model probability and other important posterior quantities that require evaluation of at most two one-dimensional integrals. A novel algorithm called combined global and local search is proposed to find models with high posterior probabilities and to estimate posterior model probabilities. The proposed method is illustrated with the analysis of three real robust design experiments, and simulation studies demonstrate that it performs well. Finally, the problem of choosing a design that is representative of a finite candidate set is important in computer experiments; the minimax criterion, the maximum distance from any candidate point to the design, measures this degree of representativeness. Chapter 5 proposes algorithms for finding minimax designs for finite design regions. We establish the relationship between minimax designs and the classical set covering location problem in operations research, which is a binary linear program. We prove that the set of minimax distances is the set of discontinuities of the function that maps the covering radius to the optimal objective function value, and that optimal solutions at the discontinuities are minimax designs. These results are employed to design efficient procedures for finding globally optimal minimax and near-minimax designs.
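The set covering location problem referred to above has, in standard form (our notation), the binary linear program

$$ \min \sum_{j=1}^{N} y_j \quad \text{s.t.} \quad \sum_{j:\, d(i,j) \le r} y_j \ge 1 \quad (i = 1, \dots, N), \qquad y_j \in \{0, 1\}, $$

where $y_j = 1$ selects candidate point $j$ for the design and each constraint requires candidate $i$ to lie within radius $r$ of some selected point. Sweeping $r$ over the candidate inter-point distances and tracking where the optimal objective value jumps yields the minimax distances, as the abstract describes.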
