41

Simulation-optimization studies: under efficient simulation strategies, and a novel response surface methodology algorithm

Joshi, Shirish 06 June 2008 (has links)
When attempting to solve an optimization problem, the lack of an explicit mathematical expression for the problem may preclude the application of the standard optimization methods that prove valuable in an analytical framework. In such situations, computer simulations are used to obtain the mean response values for the required settings of the independent variables. Procedures for optimizing on the mean response values, which are in turn obtained through computer simulation experiments, are called simulation-optimization techniques. The focus of this work is on the simulation-optimization technique of response surface methodology (RSM). RSM is a collection of mathematical and statistical techniques for experimental optimization. Correlation induction strategies can be employed in RSM to achieve improved statistical inference in experimental designs and sequential experimentation. Also, the search procedures currently employed by RSM algorithms can be improved by incorporating gradient deflection methods. This dissertation has three major goals: (a) develop analytical results to quantitatively express the gains of using the common random number (CRN) strategy of variance reduction over direct simulation (independent streams, or IS strategy) at each stage of RSM, (b) develop a new RSM algorithm by incorporating gradient deflection methods in existing RSM algorithms, and (c) conduct extensive empirical studies to quantify: (i) the gains of using the CRN strategy over direct simulation in a standard RSM algorithm, and (ii) the gains of the new RSM algorithm over a standard existing RSM algorithm. / Ph. D.
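As a rough illustration of the comparison in goal (a), the sketch below estimates the difference in mean response between two parameter settings of a toy M/M/1 queue simulator, once with independent streams (IS) and once with common random numbers (CRN); the queue model, rates, and run lengths are hypothetical stand-ins, not the dissertation's test problems.

```python
import numpy as np

def mean_wait(arrival_rate, service_rate, n_customers, rng):
    """Mean waiting time in an M/M/1 queue via Lindley's recursion (toy response)."""
    interarrivals = rng.exponential(1.0 / arrival_rate, n_customers)
    services = rng.exponential(1.0 / service_rate, n_customers)
    wait, total = 0.0, 0.0
    for a, s in zip(interarrivals, services):
        wait = max(0.0, wait + s - a)
        total += wait
    return total / n_customers

def estimate_difference(strategy, n_reps=200, seed=1):
    """Replicate the two settings and return the mean and sample variance of the difference."""
    diffs = []
    for child in np.random.SeedSequence(seed).spawn(n_reps):
        if strategy == "CRN":
            # both settings reuse the same random streams
            rng1, rng2 = np.random.default_rng(child), np.random.default_rng(child)
        else:  # "IS": independent streams
            rng1, rng2 = (np.random.default_rng(s) for s in child.spawn(2))
        diffs.append(mean_wait(0.9, 1.0, 500, rng1) - mean_wait(0.9, 1.2, 500, rng2))
    return np.mean(diffs), np.var(diffs, ddof=1)

for strategy in ("IS", "CRN"):
    mean, var = estimate_difference(strategy)
    print(f"{strategy}: estimated difference {mean:.3f}, sample variance {var:.5f}")
```

Because CRN feeds both settings identical random inputs, the positively correlated responses typically make the variance of the estimated difference much smaller than under independent streams.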
42

Robust parameter optimization strategies in computer simulation experiments

Panis, Renato P. 06 June 2008 (has links)
An important consideration in computer simulation studies is the issue of model validity, the level of accuracy with which the simulation model represents the real-world system under study. This dissertation addresses a major cause of model validity problems: the dissimilarity between the simulation model and the real system due to the dynamic nature of the real system, which results from the presence of nonstationary stochastic processes within the system. This transitory characteristic of the system is typically not addressed in the search for an optimal solution. In reliability and quality control studies, it is known that optimizing with respect to the variance of the response is as important a concern as optimizing with respect to the average response. Genichi Taguchi has been instrumental in the advancement of this philosophy. His work has resulted in what is now popularly known as the Taguchi Methods for robust parameter design. Following Taguchi's philosophy, the goal of this research is to devise a framework for finding optimum operating levels for the controllable input factors in a stochastic system that are insensitive to internal sources of variation. Specifically, the model validity problem of nonstationary system behavior is viewed as a major internal cause of system variation. In this research the typical application of response surface methodology (RSM) to the problem of simulation optimization is examined. Simplifying assumptions that enable the use of RSM techniques are examined, and their relaxation to address model validity leads to a modification of the RSM approach that properly handles optimization in the presence of nonstationarity. Taguchi's strategy and methods are then adapted and applied to this problem. Finally, dual-response RSM extensions of the Taguchi approach, which separately model the process performance mean and variance, are considered and suitably revised to address the same problem. A second cause of model validity problems is also considered: the random behavior of the supposedly controllable input factors to the system. A resolution to this source of model invalidity is proposed based on the methodology described above. / Ph. D.
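A minimal sketch of the dual-response idea mentioned at the end of the abstract, assuming a made-up noisy process, a quadratic model for the replicate mean and for the log sample variance, and an arbitrary mean target; it is meant only to show the mechanics of trading off predicted mean against predicted variance, not the dissertation's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
levels = np.linspace(-1, 1, 5)
X = np.array([(a, b) for a in levels for b in levels])   # 5x5 grid in coded units

def simulate(x, reps=10):
    # hypothetical process: the mean depends on both factors, the noise only on x2
    mu = 10 + 3 * x[0] - 2 * x[1] + 1.5 * x[0] * x[1]
    sigma = 1.0 + 0.8 * (x[1] + 1)
    return rng.normal(mu, sigma, reps)

def quad_terms(x):
    a, b = x
    return [1.0, a, b, a * b, a * a, b * b]

reps = np.array([simulate(x) for x in X])                # replicated runs per design point
ybar = reps.mean(axis=1)
log_s2 = np.log(reps.var(axis=1, ddof=1))

Z = np.array([quad_terms(x) for x in X])
beta_mean, *_ = np.linalg.lstsq(Z, ybar, rcond=None)     # surface for the mean
beta_lvar, *_ = np.linalg.lstsq(Z, log_s2, rcond=None)   # surface for the log variance

# robust setting: minimize predicted variance subject to predicted mean >= 11
grid = np.array([(a, b) for a in np.linspace(-1, 1, 41) for b in np.linspace(-1, 1, 41)])
G = np.array([quad_terms(x) for x in grid])
pred_mean, pred_lvar = G @ beta_mean, G @ beta_lvar
feasible = pred_mean >= 11
best = grid[feasible][np.argmin(pred_lvar[feasible])]
print("robust factor setting (coded units):", best)
```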
43

Outliers and robust response surface designs

O'Gorman, Mary Ann January 1984 (has links)
A commonly occurring problem in response surface methodology is that of inconsistencies in the response variable. These inconsistencies, or maverick observations, are referred to here as outliers. Many models exist for describing these outliers. Two of these models, the mean shift and the variance inflation outlier models, are employed in this research. Several criteria are developed for determining when the outlying observation is detrimental to the analysis. These criteria all lead to the same condition, which is used to develop statistical tests of the null hypothesis that the outlier is not detrimental to the analysis. These results are extended to the multiple outlier case for both models. The robustness of response surface designs is also investigated. Robustness to outliers, missing data, and errors in control is examined for first order models. The orthogonal designs with large second moments, such as the 2ᵏ factorial designs, are optimal in all three cases. In the second order case, robustness to outliers and to missing data is examined. Optimal design parameters are obtained by computer for the central composite, Box-Behnken, hybrid, small composite, and equiradial designs. Similar results are seen for robustness to outliers and robustness to missing data. The central composite turns out to be the optimal design type, and of the two economical design types the small composite is preferred to the hybrid. / Ph. D.
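For context on the mean shift model, the standard single-outlier diagnostic is the externally studentized (R-student) residual, which follows a t distribution with n - p - 1 degrees of freedom when no shift is present; the sketch below computes it for a hypothetical first order design with one contaminated run. The criteria developed in the dissertation, which ask whether the outlier is detrimental to the analysis, are related but not identical to this test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# hypothetical 2^2 factorial plus center runs; first order model y = b0 + b1*x1 + b2*x2
X = np.array([[1, -1, -1], [1, 1, -1], [1, -1, 1], [1, 1, 1],
              [1, 0, 0], [1, 0, 0], [1, 0, 0]], dtype=float)
y = X @ np.array([10.0, 2.0, -3.0]) + rng.normal(0, 1, len(X))
y[3] += 8.0                                   # inject a mean-shift outlier at run 4

n, p = X.shape
H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
e = y - H @ y                                 # ordinary residuals
s2 = e @ e / (n - p)                          # mean squared error
# externally studentized residuals via the leave-one-out variance estimate
s2_i = ((n - p) * s2 - e**2 / (1 - np.diag(H))) / (n - p - 1)
t = e / np.sqrt(s2_i * (1 - np.diag(H)))
p_vals = 2 * stats.t.sf(np.abs(t), df=n - p - 1)
for i, (ti, pv) in enumerate(zip(t, p_vals), start=1):
    print(f"run {i}: R-student = {ti:6.2f}, p = {pv:.3f}")
```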
44

Effective design augmentation for prediction

Rozum, Michael A. 03 August 2007 (has links)
In a typical response surface study, an experimenter will fit a first order model in the early stages of the study and obtain the path of steepest ascent. The path leads the experimenter out of this initial region of interest and into a new region of interest. The experimenter may fit another first order model here or, if curvature is believed to be present in the underlying system, a second order model. In the final stages of the study, the experimenter fits a second order model and typically contracts the region of interest as the levels of the factors that optimize the response are nearly determined. Due to the sequential nature of experimentation in a typical response surface study, the experimenter may find himself/herself wanting to augment some initial design with additional runs within the current region of interest. The little discussion that exists in the statistical literature suggests adding runs sequentially in a conditional D-optimal manner. Four prediction-oriented criteria, I<sub>IV</sub>, I<sub>SV</sub><sub>r</sub>, I<sub>SV</sub><sub>r</sub><sup>ADJ</sup> and G, and two estimation-oriented criteria, A and E, are studied here as other possible sequential design augmentation optimality criteria. Analytical properties of I<sub>IV</sub>, I<sub>SV</sub><sub>r</sub>, and A are developed within the context of the design augmentation problem. I<sub>SV</sub><sub>r</sub> is found to be somewhat ineffective in actual sequential design augmentation situations. A new, more effective criterion, I<sub>SV</sub><sub>r</sub><sup>ADJ</sup>, is introduced and thoroughly developed. Software is developed which allows sequential design augmentation via these seven criteria. Unlike existing design augmentation software, all locations within the current region of interest are eligible for inclusion in the augmenting design (a continuous candidate list). Case studies were performed. For a first order model there was negligible difference in the prediction variance properties of the designs generated via sequential augmentation by D and the best of the other criteria, I<sub>IV</sub>, I<sub>SV</sub><sub>r</sub><sup>ADJ</sup>, and A. For a second order model, however, the designs generated via sequential augmentation by D place too few runs too late in the interior of the region of interest. Thus, designs generated via sequential augmentation by D yield inferior prediction variance properties to the designs generated via I<sub>IV</sub>, I<sub>SV</sub><sub>r</sub><sup>ADJ</sup>, and A. The D-efficiencies of the designs generated via sequential augmentation by I<sub>IV</sub>, I<sub>SV</sub><sub>r</sub><sup>ADJ</sup>, and A range from reasonable to fully D-optimal. Therefore, the I<sub>IV</sub> and I<sub>SV</sub><sub>r</sub><sup>ADJ</sup> optimality criteria are recommended for sequential design augmentation when quality of prediction is more important than quality of estimation of the coefficients. / Ph. D.
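The conditional D-optimal augmentation that the literature suggests, and against which the other criteria are compared, can be sketched as a greedy search over a candidate list: given the runs already made, repeatedly add the point that most increases det(X'X). The second order model, initial design, and gridded candidate region below are illustrative assumptions.

```python
import itertools
import numpy as np

def model_terms(x1, x2):
    # full second order model in two factors
    return np.array([1.0, x1, x2, x1 * x2, x1**2, x2**2])

# hypothetical initial design: a 2^2 factorial plus one center run
initial = [(-1, -1), (1, -1), (-1, 1), (1, 1), (0, 0)]
X = np.array([model_terms(*pt) for pt in initial])

# the continuous region of interest approximated by a fine grid of candidate points
grid = list(itertools.product(np.linspace(-1, 1, 21), repeat=2))

def augment_D(X, candidates, n_new):
    """Greedily add n_new runs, each maximizing det(X'X) of the augmented design."""
    design = X.copy()
    added = []
    for _ in range(n_new):
        M = design.T @ design
        best = max(candidates,
                   key=lambda pt: np.linalg.det(M + np.outer(model_terms(*pt),
                                                             model_terms(*pt))))
        added.append(best)
        design = np.vstack([design, model_terms(*best)])
    return added

print("D-chosen augmenting runs:", augment_D(X, grid, n_new=4))
```

Replacing the det(X'X) objective in the greedy step with an integrated or adjusted prediction variance measure over the region of interest gives augmentation criteria in the spirit of I<sub>IV</sub> and I<sub>SV</sub><sub>r</sub><sup>ADJ</sup>.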
45

Parâmetros mais influentes na previsão da diluição inicial em sistemas de emissários submarinos: uma contribuição baseada em técnicas de planejamento de experimentos e superfícies de resposta. / More influential parameters in the prediction of initial dilution in submarine outfall systems: a contribution based on techniques of design of experiment and response surfaces.

Yanes, Jacqueline Pedrera 14 December 2017 (has links)
Computational simulations are commonly employed to study the dispersion of a sewage plume at sea in support of the design of submarine outfall systems. Near field models are an efficient tool for predicting the behavior of these systems: they predict the initial dilution, the main dimensions of the sewage plume, and the final concentration of pollutants, all useful information for the design of the diffuser. Using near field models such as NRFIELD and UM3 from Visual Plumes, it is possible to simulate different outfall alternatives for later comparison and analysis. However, running many simulations produces a large amount of information, which makes it harder to draw objective conclusions and consumes much of the project's time. A strategy for an efficient simulation process, one that extracts the maximum possible information from a minimum number of simulations, is therefore essential. When the input parameters of the simulations are modified in a controlled way, the causes of changes in the response can be easily identified, which motivates the use of statistical analysis. Statistical tools can help organize the simulations and analyze the results. Design of Experiments (DOE) is the statistical methodology for efficient experimentation, allowing the identification of the statistically significant variables in any system or process, and Response Surface Methodology allows systems to be optimized with respect to those statistically significant variables. Accordingly, the main objective of this research was to present a new approach that combines computational simulation with near field models and the statistical tools of Design of Experiments and Response Surface Methodology to reduce the number of simulations needed to obtain an initial dilution value close to the optimum in the design of submarine outfalls, by identifying the variables that are statistically significant for predicting the initial dilution in the near field models of Visual Plumes. The results show that only three geometric variables of the outfall diffuser configuration are statistically significant for predicting the initial dilution, and that only a few statistically planned simulations are needed to obtain a more efficient outfall configuration.
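The screening step described above, running a small, statistically planned set of simulations and testing which inputs significantly affect the predicted dilution, can be sketched with an ordinary replicated two-level factorial; the `dilution_model` function and the factor names below are hypothetical stand-ins for a Visual Plumes near field run, not the variables found significant in the thesis.

```python
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
factors = ["port_diameter", "port_spacing", "discharge_depth", "current_speed"]

def dilution_model(x):
    # hypothetical stand-in for a near field model run (factors in coded units)
    d, s, h, u = x
    return 120 + 18 * h + 9 * s - 6 * d + 4 * s * h + rng.normal(0, 2)

# full 2^4 factorial, replicated twice so an error estimate is available
design = list(itertools.product([-1, 1], repeat=len(factors)))
runs = design * 2
y = np.array([dilution_model(x) for x in runs])

# main-effects regression: columns are the intercept plus the four coded factors
X = np.column_stack([np.ones(len(runs)), np.array(runs)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = len(runs) - X.shape[1]
s2 = resid @ resid / dof
cov = s2 * np.linalg.inv(X.T @ X)
t = beta / np.sqrt(np.diag(cov))
p = 2 * stats.t.sf(np.abs(t), dof)

for name, b, pv in zip(["intercept"] + factors, beta, p):
    flag = "significant at 0.05" if pv < 0.05 else "not significant"
    print(f"{name:16s} coef {b:7.2f}   p = {pv:.4f}   {flag}")
```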
46

Applications and optimization of response surface methodologies in high-pressure, high-temperature gauges

Hässig Fonseca, Santiago 05 July 2012 (has links)
High-Pressure, High-Temperature (HPHT) pressure gauges are commonly used in oil wells for pressure transient analysis. Mathematical models are used to relate input perturbations (e.g., flow rate transients) with output responses (e.g., pressure transients), and subsequently, solve an inverse problem that infers reservoir parameters. The indispensable use of pressure data in well testing motivates continued improvement in the accuracy (quality), sampling rate (quantity), and autonomy (lifetime) of pressure gauges. This body of work presents improvements in three areas of high-pressure, high-temperature quartz memory gauge technology: calibration accuracy, multi-tool signal alignment, and tool autonomy estimation. The discussion introduces the response surface methodology used to calibrate gauges, develops accuracy and autonomy estimates based on controlled tests, and where applicable, relies on field gauge drill stem test data to validate accuracy predictions. Specific contributions of this work include: - Application of the unpaired sample t-test, a first in quartz sensor calibration, which resulted in a reduction of uncertainty in gauge metrology by a factor of 2.25, and an improvement in absolute and relative tool accuracies of 33% and 56%, respectively. Greater accuracy yields more reliable data and a more sensitive characterization of well parameters. - Post-processing of measurements from 2+ tools using a dynamic time warp algorithm that mitigates gauge clock drifts. Where manual alignment methods account only for linear shifts, the dynamic algorithm elastically corrects nonlinear misalignments accumulated throughout a job with an accuracy that is limited only by the clock's time resolution. - Empirical modeling of tool autonomy based on gauge selection, battery pack, sampling mode, and average well temperature. A first of its kind, the model distills autonomy into two independent parameters, each a function of the same two orthogonal factors: battery power capacity and gauge current consumption as functions of sampling mode and well temperature -- a premise that, for 3+ gauge and battery models, reduces the design of future autonomy experiments by at least a factor of 1.5.
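The multi-tool alignment contribution relies on dynamic time warping; a textbook DTW alignment of two pressure records, where the second gauge carries a simulated nonlinear clock drift, might look like the following sketch (a generic implementation with made-up signals, not the thesis's algorithm or data).

```python
import numpy as np

def dtw_path(a, b):
    """Classic O(len(a)*len(b)) dynamic time warp; returns the optimal index pairs."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # backtrack from the corner to recover the warping path
    path, i, j = [], n, m
    while i > 0 or j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# two gauges recording the same pressure transient; gauge 2 has a nonlinear clock drift
t = np.linspace(0, 10, 200)
gauge1 = 5000 + 300 * np.exp(-t) * np.sin(3 * t)
t_drift = t + 0.02 * t**2                      # hypothetical accumulating drift
gauge2 = 5000 + 300 * np.exp(-t_drift) * np.sin(3 * t_drift)

path = dtw_path(gauge1, gauge2)
print("first and last aligned sample pairs:", path[0], path[-1])
```

Because the warp is chosen sample by sample, it can absorb a drift that accumulates nonlinearly over the job, which a single linear time shift cannot.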
47

Analytical Fragility Curves for Highway Bridges in Moderate Seismic Zones

Nielson, Bryant G. 23 November 2005 (has links)
Historical seismic events such as the San Fernando earthquake of 1971 and the Loma Prieta earthquake of 1989 did much to highlight the vulnerabilities in many existing highway bridges. However, it was not until 1990 that this awareness extended to moderate seismic regions such as the Central and Southeastern United States (CSUS). This relatively long neglect of seismic issues pertaining to bridges in these moderate seismic zones has resulted in a portfolio of existing bridges with seismic deficiencies which must be assessed and addressed. An emerging decision tool, whose use is becoming increasingly popular in the assessment of this seismic risk, is the seismic fragility curve. Fragility curves are conditional probability statements which give the probability of a bridge reaching or exceeding a particular damage level for an earthquake of a given intensity level. As much research has been devoted to the implementation of fragility curves in risk assessment packages, a great need has arisen for reliable bridge fragility curves, particularly for bridges in moderate seismic zones. The purpose of this study is to use analytical methods to generate fragility curves for nine bridge classes which are most common to the CSUS. This is accomplished by first considering the existing bridge inventory and assessing typical characteristics and details, from which detailed 3-D analytical models are created. The bridges are subjected to a suite of synthetic ground motions which were developed explicitly for the region. Probabilistic seismic demand models (PSDM) are then generated using these analyses. From these PSD models, fragility curves are generated by considering specific levels of damage which may be of interest. The fragility curves show that the most vulnerable of the nine bridge classes considered are those utilizing steel girders. Concrete girder bridges appear to be the next most vulnerable, followed by single span bridges of all types. Various sources of uncertainty are considered and tracked throughout this study, which allows for their direct implementation into existing seismic risk assessment packages.
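A fragility curve of the kind described here is commonly obtained by regressing the logarithm of the demands from the ground-motion suite on the logarithm of the intensity measure (the PSDM) and then evaluating the probability that demand exceeds a limit-state capacity; the sketch below shows that cloud-style calculation with synthetic demand data and illustrative capacities, not the study's bridge models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# hypothetical "cloud" of analyses: spectral acceleration (g) vs. column ductility demand
im = rng.uniform(0.05, 1.0, 96)                          # intensity measure per record
ln_demand = 0.2 + 1.1 * np.log(im) + rng.normal(0, 0.45, im.size)

# PSDM: ln(demand) = ln(a) + b*ln(IM) + error, with demand dispersion beta_d
b, ln_a = np.polyfit(np.log(im), ln_demand, 1)
beta_d = np.std(ln_demand - (ln_a + b * np.log(im)), ddof=2)

def fragility(im_grid, capacity_median, beta_c):
    """P[demand >= capacity | IM], assuming lognormal demand and capacity."""
    num = ln_a + b * np.log(im_grid) - np.log(capacity_median)
    return stats.norm.cdf(num / np.sqrt(beta_d**2 + beta_c**2))

im_grid = np.linspace(0.1, 1.0, 10)
for label, cap in [("slight", 1.0), ("moderate", 2.0)]:   # illustrative ductility capacities
    probs = fragility(im_grid, cap, beta_c=0.35)
    print(label, np.round(probs, 2))
```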
48

Seismic Vulnerability Assessment of Retrofitted Bridges Using Probabilistic Methods

Padgett, Jamie Ellen 09 April 2007 (has links)
The central focus of this dissertation is a seismic vulnerability assessment of retrofitted bridges. The objective of this work is to establish a methodology for the development of system-level fragility curves for typical classes of retrofitted bridges using a probabilistic framework. These tools could provide valuable support for risk mitigation efforts in the region by quantifying the impact of retrofit on potential levels of damage over a range of earthquake intensities. The performance evaluation includes the development of high-fidelity three-dimensional nonlinear analytical models of bridges retrofitted with a range of retrofit measures, and characterization of the response under seismic loading. Sensitivity analyses were performed to establish an understanding of the appropriate level of uncertainty treatment needed to model, assess, and propagate sources of uncertainty inherent to a seismic performance evaluation for portfolios of structures. Seismic fragility curves are developed to depict the impact of various retrofit devices on the seismic vulnerability of bridge systems. This work provides the first set of fragility curves for a range of bridge types and retrofit measures. Frameworks for their use in decision making, including identification of viable retrofit measures, performance-based retrofit of bridges, and cost-benefit analyses, are illustrated. The fragility curves developed as a part of this research will fill a major gap in existing seismic risk assessment software, and enable decision makers to quantify the benefits of various retrofits.
50

Sequential Adaptive Designs In Computer Experiments For Response Surface Model Fit

Lam, Chen Quin 29 July 2008 (has links)
No description available.
