About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Parameters design and the operation simulation of a pneumatic dispensing system for biomaterial 3D printing

Zhou, Wenqi 19 September 2016 (has links)
Tissue engineering (TE) combines cells, engineering methods, and materials to improve or replace the biological functions of native tissues or organs. Fabricating scaffolds is a vital process in TE: scaffolds mechanically support cell proliferation while providing the desired functions and intricate structures. In this research, a pneumatic dispensing 3D printing system is used to build soft scaffolds with controllable pore sizes. An effective method is required to help users systematically select proper parameters to print hydrogel strands with the desired widths. Printing parameters are first classified to build a simplified mathematical model that identifies the significant parameters. A factorial experiment is then conducted to investigate the effects of the selected parameters and their interactions on strand width. The solution is further verified using single-variable experiments with a regression test. Based on the results, a parameter selection method is proposed and evaluated using two verification tests. A comparison test of scaffold fabrication is conducted to verify the analytic solution of the proposed theory. It is found that nozzle size, dispensing pressure, and the moving speed of the printer head all statistically affect strand width, with nozzle size having the most significant influence. Factor interactions appear mainly between nozzle size and moving speed, and between nozzle size and dispensing pressure. In addition, statistically significant linear relationships are found between moving speed and strand width and between dispensing pressure and strand width. Furthermore, due to the high cost of biomaterials and the high-pressure hazard of the air compressor in the dispensing system, a 3D bio-printing simulation system is developed to demonstrate the system configuration and operation procedures, helping new users avoid operational mistakes in the real world.
A haptic 3D bio-printing simulation system with force feedback is presented using the Phantom Omni haptic interface. The virtual environment is developed with the Worldviz software. The haptic force feedback is calculated from a spring-damper model combined with the proxy method. The system is verified through a questionnaire survey to provide a flexible, cost-effective, safe, and highly interactive learning environment. / February 2016
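The spring-damper force rendering mentioned in this abstract can be sketched in one dimension (a minimal illustration; the gains, the 1-D simplification, and the example values are assumptions, not taken from the thesis):

```python
# Minimal 1-D sketch of spring-damper haptic force rendering with a proxy
# point. The stiffness k and damping c are made-up illustration values.

def haptic_force(proxy_pos, device_pos, device_vel, k=300.0, c=2.0):
    """Force rendered on the haptic device: a spring pulls the device
    toward the proxy (the constrained surface point), damped by c."""
    penetration = proxy_pos - device_pos   # displacement of device from proxy
    return k * penetration - c * device_vel

# Example: device has penetrated 5 mm past the proxy and is still moving in.
f = haptic_force(proxy_pos=0.0, device_pos=-0.005, device_vel=-0.01)
# f ≈ 1.52 N, pushing the device back toward the surface
```

The proxy method keeps the proxy on the constraint surface while the physical device may penetrate it; the spring-damper pair converts that separation into a stable restoring force.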
2

The Role of Dominant Cause in Variation Reduction through Robust Parameter Design

Asilahijani, Hossein 24 April 2008 (has links)
Reducing variation in key product features is a very important goal in process improvement. Finding and trying to control the cause(s) of variation is one way to reduce variability, but is not cost effective or even possible in some situations. In such cases, Robust Parameter Design (RPD) is an alternative. The goal in RPD is to reduce variation by reducing the sensitivity of the process to the sources of variation, rather than controlling these sources directly. That is, the goal is to find levels of the control inputs that minimize the output variation imposed on the process via the noise variables (causes). In the literature, a variety of experimental plans have been proposed for RPD, including Robustness, Desensitization and Taguchi’s method. In this thesis, the efficiency of the alternative plans is compared in the situation where the most important source of variation, called the “Dominant Cause”, is known. It is shown that desensitization is the most appropriate approach for applying the RPD method to an existing process.
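The desensitization idea described above can be illustrated with a toy transmitted-variance calculation (made-up coefficients, not from the thesis): with a fitted model containing a control-by-noise interaction, the variance transmitted by the noise variable vanishes at a particular control setting.

```python
# Sketch of the core RPD idea. With a fitted model
#   y = b0 + b1*x + g*z + d*x*z   (x control, z noise, z ~ (0, sigma_z^2)),
# the variance transmitted by the noise is Var(y) ≈ (g + d*x)^2 * sigma_z^2,
# so the desensitizing (zero-gradient) control setting is x* = -g/d.
# Coefficients below are made-up illustration values.

g, d, sigma_z = 2.0, 0.5, 1.0

def transmitted_var(x):
    return (g + d * x) ** 2 * sigma_z ** 2

x_star = -g / d            # robust setting: noise slope g + d*x* = 0
# transmitted_var(x_star) == 0.0, while e.g. transmitted_var(0) == 4.0
```

This is why an interaction between a control factor and the dominant cause is exactly what desensitization exploits: without the interaction term d, the control setting cannot change the noise slope at all.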
4

Robust Parameter Design for Automatically Controlled Systems and Nanostructure Synthesis

Dasgupta, Tirthankar 25 June 2007 (has links)
This research focuses on developing comprehensive frameworks for robust parameter design methodology in dynamic systems with automatic control and in the synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa, so an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration, under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered, and the best proportional-integral and minimum mean squared error control strategies are developed using robust parameter design. The proposed method is illustrated with a simulated example and a case study in a urea packing plant. The idea is also extended to cases with on-line noise factors, and the possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large-scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, in large quantity, and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a multinomial GLM is proposed and used. The optimum process conditions, which maximize these probabilities and make the synthesis process less sensitive to variations of the process variables around their set values, are derived from the fitted models using Monte Carlo simulations.
The second part of the research develops an experimental design methodology tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for nanowire synthesis. SMED is a novel approach to generating sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
5

A Study of Process Parameter Optimization for BIC Steel

Tsai, Jeh-Hsin 06 February 2006 (has links)
The Taguchi method is also called quality engineering. It is a systematic methodology for product design (modification) and process design (improvement) that saves cost and time while satisfying customer requirements. Taguchi's parameter design, also known as robust design, has the merits of low cost and high efficiency; it supports product quality design, management, and improvement, and consequently reinforces the competitive ability of a business. How to apply parameter design effectively, shorten research time, and bring low-cost, high-quality products to market early is therefore a worthy research topic. However, parameter design optimization problems are difficult in practice because (1) complex, nonlinear relationships exist among the system's inputs, outputs, and parameters; (2) interactions may occur among parameters; (3) in Taguchi's two-phase optimization procedure, the adjustment factor cannot be guaranteed to exist; and (4) data may be lost or never available, and Taguchi's method cannot treat such incomplete data well. Neural networks have learning capacity, fault tolerance, and model-free characteristics, which make them a competitive tool for multivariable input-output modeling; successful application fields include diagnostics, robotics, scheduling, decision-making, and prediction. In the search for an optimum, genetic algorithms can avoid local optima and thus improve the chance of reaching a global optimum. This study drew the key parameters from spheroidizing theory, and L18 and L9 orthogonal experimental arrays were applied to determine the optimal operation parameters by signal-to-noise (S/N) analysis. The conclusions are summarized as follows: 1.
The spheroidizing of AISI 3130 used to yield the highest rate of unqualified product, requiring a second annealing treatment; operational records before improvement showed that 83 tons of the 3130 steel required the second treatment. The optimal operation parameters were determined with an L18(6^1 × 3^5) orthogonal experimental array. The control parameter of the annealing temperature was at B2
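The S/N analysis mentioned above can be sketched for the smaller-the-better case using the standard Taguchi formula (the replicate values below are made-up illustrations, not data from the BIC Steel study):

```python
import math

# Smaller-the-better signal-to-noise ratio used in Taguchi parameter design:
#   S/N = -10 * log10( (1/n) * sum(y_i^2) )
# For each row of the orthogonal array, the level with the higher average
# S/N is preferred. The replicate values are made-up illustrations.

def sn_smaller_the_better(ys):
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

trial = [0.8, 1.1, 0.9]   # three replicate measurements for one array row
sn = sn_smaller_the_better(trial)   # higher S/N means less deviation
```

In an actual L18/L9 analysis, this ratio is computed for every row, then averaged per factor level to pick the robust setting of each control factor.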
6

A design of experiment approach to tolerance allocation

Islam, Ziaul January 1995 (has links)
No description available.
7

Advanced methods for finite element simulation for part and process design in tube hydroforming

Jirathearanat, Suwat 03 February 2004 (has links)
No description available.
8

CONFIDENCE REGIONS FOR OPTIMAL CONTROLLABLE VARIABLES FOR THE ROBUST PARAMETER DESIGN PROBLEM

Cheng, Aili January 2012 (has links)
In robust parameter design it is often possible to set the levels of the controllable factors to produce a zero gradient for the transmission of variability from the noise variables. If the number of control variables is greater than the number of noise variables, a continuum of zero-gradient solutions exists. This situation is useful as it provides the experimenter with multiple conditions under which to configure a zero gradient for noise variable transmission. However, this situation requires a confidence region for the multiple-solution factor levels that provides proper simultaneous coverage. This requirement has not been previously recognized in the literature. In the case where the number of control variables is greater than the number of noise variables, we show how to construct critical values needed to maintain the simultaneous coverage rate. Two examples are provided as a demonstration of the practical need to adjust the critical values for simultaneous coverage. The zero-gradient confidence region only focuses on the variance, and there are in fact many such situations in which focus is or could be placed entirely on the process variance. In the situation where both mean and variance need to be considered, a general confidence region in control variables is developed by minimizing weighted mean square error. This general method is applicable to many situations, including mixture experiments, which have an inherent constraint on the control factors. It also gives the user the flexibility to put different weights on the mean and variance parts for simultaneous optimization. It turns out that the same computational algorithm can be used to compute the dual confidence region in both control factors and the response variable. / Statistics
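The zero-gradient continuum can be seen in a toy model (made-up coefficients, not from the dissertation): with two control factors and one noise factor, setting the noise slope to zero yields a whole line of robust settings rather than a single point.

```python
# Sketch of the zero-gradient continuum. With a fitted model
#   y = b0 + g*z + d1*x1*z + d2*x2*z + (terms not involving z),
# the noise slope is  dy/dz = g + d1*x1 + d2*x2.
# With two controls and one noise variable, dy/dz = 0 defines a line of
# solutions. Coefficients are made-up illustration values.

g, d1, d2 = 1.0, 0.5, -0.25

def noise_slope(x1, x2):
    return g + d1 * x1 + d2 * x2

def x2_on_zero_gradient_line(x1):
    """Solve g + d1*x1 + d2*x2 = 0 for x2: every x1 yields a solution."""
    return -(g + d1 * x1) / d2

# Any point on this line transmits no (first-order) noise variation:
for x1 in (-1.0, 0.0, 2.0):
    assert abs(noise_slope(x1, x2_on_zero_gradient_line(x1))) < 1e-12
```

A confidence region for this set of solutions must cover the whole curve simultaneously, which is why pointwise critical values are not enough.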
9

Semiparametric Techniques for Response Surface Methodology

Pickle, Stephanie M. 14 September 2006 (has links)
Many industrial statisticians employ the techniques of Response Surface Methodology (RSM) to study and optimize products and processes. A second-order Taylor series approximation is commonly used to model the data; however, parametric models are not always adequate, and any degree of model misspecification may result in serious bias of the estimated response. Nonparametric methods have been suggested as an alternative, as they can capture structure in the data that a misspecified parametric model cannot. Yet nonparametric fits may be highly variable, especially in the small-sample settings common in RSM. Therefore, semiparametric regression techniques are proposed for the RSM setting. These methods are applied to an elementary RSM problem as well as the robust parameter design problem. / Ph. D.
10

Contributions to variable selection for mean modeling and variance modeling in computer experiments

Adiga, Nagesh 17 January 2012 (has links)
This thesis consists of two parts. The first part reviews Variable Search, a variable selection procedure for mean modeling. The second part deals with variance modeling for robust parameter design in computer experiments. In the first chapter, the Variable Search (VS) technique developed by Shainin (1988) is reviewed. VS has received considerable attention from experimenters in industry. It uses the experimenters' knowledge about the process, in terms of good and bad settings and their importance. In this technique, a few experiments are first conducted at the best and worst settings of the variables to ascertain that they are indeed different from each other. Experiments are then conducted sequentially in two stages, namely swapping and capping, to determine the significance of the variables one at a time. Finally, after all significant variables have been identified, the model is fit and the best settings are determined. The VS technique has not been analyzed thoroughly; here, each stage of the method is analyzed mathematically. Each stage is formulated as a hypothesis test, and its performance is expressed in terms of the model parameters. The performance of the overall VS technique is expressed as a function of the performance of each stage, which makes it possible to compare VS with traditional techniques. The second and third chapters deal with variance modeling for robust parameter design in computer experiments. Computer experiments based on engineering models can be used to explore process behavior when physical experiments (e.g. fabrication of nanoparticles) are costly or time-consuming. Robust parameter design (RPD) is a key technique for improving process repeatability. The absence of replicates in computer experiments (e.g. with a Space Filling Design (SFD)) is a challenge in locating the RPD solution. Recently, there have been studies (e.g. Bates et al. (2005), Chen et al. (2006), Dellino et al. (2010 and 2011), Giovagnoli and Romano (2008)) of RPD issues in computer experiments. The transmitted variance model (TVM) proposed by Shoemaker and Tsui (1993) for physical experiments can be applied in computer simulations. These approaches rely heavily on the estimated mean model, because they obtain expressions for variance directly from mean models or use them to generate replicates; variance modeling based on some kind of replicates relies on the estimated mean model to a lesser extent. To the best of our knowledge, there is no rigorous research on the variance modeling needed for RPD in computer experiments. We develop procedures for identifying variance models. First, we explore procedures for deciding groups of pseudo replicates for variance modeling; a formal variance change-point procedure is developed to rigorously determine the replicate groups. Next, the variance model is identified and estimated through a three-step variable selection procedure. Properties of the proposed method are investigated under various conditions through analytical and empirical studies. In particular, the impact of correlated responses on performance is discussed.
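The variance change-point idea can be sketched generically as a likelihood-ratio scan over candidate split points (an illustration of the general idea under a zero-mean normal model; the data and the scan are made-up illustrations, not the specific procedure from the thesis):

```python
import math

# Generic variance change-point scan: for residuals e_1..e_n, compare, at
# each split k, the normal log-likelihood with separate variances on the
# two segments against a single pooled variance (mean assumed zero).

def variance_changepoint(e, min_seg=5):
    def nll(seg):  # normal negative log-likelihood, mean 0, MLE variance
        v = sum(x * x for x in seg) / len(seg)
        return 0.5 * len(seg) * (math.log(2 * math.pi * v) + 1.0)

    base = nll(e)
    best_k, best_gain = None, 0.0
    for k in range(min_seg, len(e) - min_seg):
        gain = base - (nll(e[:k]) + nll(e[k:]))   # likelihood-ratio gain
        if gain > best_gain:
            best_k, best_gain = k, gain
    return best_k, best_gain

# Made-up residuals: low variance in the first half, high in the second.
data = [0.1, -0.2, 0.15, -0.1, 0.05, 0.1, -0.15, 0.2,
        2.0, -1.5, 1.8, -2.2, 1.6, -1.9, 2.1, -1.7]
k, gain = variance_changepoint(data)   # split detected at index 8
```

A large gain at some split indicates that the two groups of runs should be treated as separate pseudo-replicate groups with different variances.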
