41

Simulation-optimization studies: under efficient simulation strategies, and a novel response surface methodology algorithm

Joshi, Shirish 06 June 2008 (has links)
While attempting to solve optimization problems, the lack of an explicit mathematical expression of the problem may preclude the application of the standard methods of optimization which prove valuable in an analytical framework. In such situations, computer simulations are used to obtain the mean response values for the required settings of the independent variables. Procedures for optimizing on mean response values that are obtained through computer simulation experiments are called simulation-optimization techniques. The focus of this work is on the simulation-optimization technique of response surface methodology (RSM). RSM is a collection of mathematical and statistical techniques for experimental optimization. Correlation induction strategies can be employed in RSM to achieve improved statistical inference in experimental designs and sequential experimentation. Also, the search procedures currently employed by RSM algorithms can be improved by incorporating gradient deflection methods. This dissertation has three major goals: (a) develop analytical results to quantitatively express the gains of using the common random number (CRN) strategy of variance reduction over direct simulation (independent streams, or IS strategy) at each stage of RSM, (b) develop a new RSM algorithm by incorporating gradient deflection methods into existing RSM algorithms, and (c) conduct extensive empirical studies to quantify (i) the gains of the CRN strategy over direct simulation in a standard RSM algorithm, and (ii) the gains of the new RSM algorithm over a standard existing RSM algorithm. / Ph. D.
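The gain from common random numbers in goal (a) can be illustrated with a small numerical sketch. This is not taken from the dissertation: the response function, noise model, and replication counts below are invented. It simply shows why driving two design points of a first-order experiment with the same random number stream (CRN) reduces the variance of the estimated slope compared with independent streams (IS).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x, z):
    """Toy stochastic response: true surface 2 + 1.5*x, with noise whose
    scale depends on x so that CRN cancels it only partially."""
    return 2.0 + 1.5 * x + (1.0 + 0.3 * x) * z

def slope_estimate(use_crn):
    """Estimate the first-order slope between two design points at x = -1 and x = +1."""
    x_lo, x_hi = -1.0, 1.0
    n = 30                                                # replications per design point
    z_lo = rng.normal(0.0, 1.0, n)
    z_hi = z_lo if use_crn else rng.normal(0.0, 1.0, n)   # shared vs. independent streams
    y_lo = simulate(x_lo, z_lo)
    y_hi = simulate(x_hi, z_hi)
    return (y_hi.mean() - y_lo.mean()) / (x_hi - x_lo)

for label, crn in [("independent streams", False), ("common random numbers", True)]:
    slopes = [slope_estimate(crn) for _ in range(2000)]
    print(f"{label:22s} slope variance = {np.var(slopes):.5f}")
```

Because the positively correlated noise largely cancels in the difference of responses, the CRN slope estimate shows a markedly smaller variance than the IS estimate for the same number of simulation runs.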
42

Outliers and robust response surface designs

O'Gorman, Mary Ann January 1984 (has links)
A commonly occurring problem in response surface methodology is that of inconsistencies in the response variable. These inconsistencies, or maverick observations, are referred to here as outliers. Many models exist for describing these outliers. Two of these models, the mean shift and the variance inflation outlier models, are employed in this research. Several criteria are developed for determining when the outlying observation is detrimental to the analysis. These criteria all lead to the same condition, which is used to develop statistical tests of the null hypothesis that the outlier is not detrimental to the analysis. These results are extended to the multiple outlier case for both models. The robustness of response surface designs is also investigated. Robustness to outliers, missing data, and errors in control is examined for first-order models. The orthogonal designs with large second moments, such as the 2ᵏ factorial designs, are optimal in all three cases. In the second-order case, robustness to outliers and to missing data is examined. Optimal design parameters are obtained by computer for the central composite, Box-Behnken, hybrid, small composite, and equiradial designs. Similar results are seen for both robustness to outliers and robustness to missing data. The central composite turns out to be the optimal design type, and of the two economical design types the small composite is preferred to the hybrid. / Ph. D.
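As an illustration of the mean shift outlier model mentioned above, the sketch below applies the standard externally studentized residual test to a small first-order design with one contaminated run. The design, data, and contamination are invented for the example, and the test shown is textbook regression diagnostics rather than the specific criteria developed in this dissertation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# 2^2 factorial design (an orthogonal first-order design) plus two centre points,
# with one run contaminated by a mean shift.
X = np.array([[1, -1, -1],
              [1, -1,  1],
              [1,  1, -1],
              [1,  1,  1],
              [1,  0,  0],
              [1,  0,  0]], dtype=float)
beta_true = np.array([10.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(0.0, 0.5, len(X))
y[3] += 4.0                                        # mean-shift outlier at run 3

n, p = X.shape
H = X @ np.linalg.inv(X.T @ X) @ X.T               # hat matrix
resid = y - H @ y

# Externally studentized residuals: the usual test statistic for the
# mean-shift outlier model (H0: no shift at run i).
s2 = resid @ resid / (n - p)
s2_del = ((n - p) * s2 - resid**2 / (1 - np.diag(H))) / (n - p - 1)
t_i = resid / np.sqrt(s2_del * (1 - np.diag(H)))

p_values = 2 * stats.t.sf(np.abs(t_i), df=n - p - 1)
for i, (t, pv) in enumerate(zip(t_i, p_values)):
    print(f"run {i}: t = {t:+.2f}, p = {pv:.3f}")
```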
43

Parâmetros mais influentes na previsão da diluição inicial em sistemas de emissários submarinos: uma contribuição baseada em técnicas de planejamento de experimentos e superfícies de resposta. / More influential parameters in the prediction of initial dilution in submarine outfall systems: a contribution based on techniques of design of experiment and response surfaces.

Yanes, Jacqueline Pedrera 14 December 2017 (has links)
Computational simulations are commonly employed to study the dispersion of sewage plumes at sea and to support the design of submarine outfall systems. Near-field models are an efficient tool for predicting the behavior of these systems: they estimate the initial dilution, the main dimensions of the plume, and the final concentration of pollutants, all useful information for the design of the diffuser. Using near-field models such as NRFIELD and UM3 from Visual Plumes, it is possible to simulate different outfall alternatives for later comparison and analysis. However, running many simulations produces a large amount of information, which makes it difficult to reach objective conclusions and consumes much of the project time. A strategy for an efficient simulation process, one that extracts the maximum possible information from a minimum number of simulations, is therefore essential.
When the input parameters of the simulations are modified in a controlled way, the causes of changes in the response can be easily identified, which motivated the use of statistical analysis. Statistical tools can help organize the simulations and analyze their results. Design of Experiments (DOE) is the statistical methodology for efficient experimentation, allowing the identification of the statistically significant variables in any system or process, and Response Surface Methodology allows a system to be optimized over its statistically significant variables. Accordingly, the main objective of this research was to present a new approach that combines computational simulation with near-field models and the statistical tools of Design of Experiments and Response Surfaces to reduce the number of simulations needed to reach an initial dilution value close to the optimum in the design of submarine outfalls, by identifying the variables that are statistically significant for predicting initial dilution in the near-field models of Visual Plumes. The results showed that only three geometric variables of an outfall's diffuser configuration are statistically significant for predicting initial dilution, and that only a few statistically planned simulations are needed to obtain a more efficient outfall configuration.
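The DOE screening idea in this abstract can be sketched in a few lines: run a two-level factorial over coded diffuser factors, fit a first-order model, and test which effects are statistically significant. The factor names, the synthetic response, and the significance threshold below are hypothetical stand-ins; a real study would replace run_simulation with actual NRFIELD or UM3 runs.

```python
import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical diffuser factors in coded -1/+1 units (stand-ins, not the thesis's actual set).
factors = ["port_diameter", "port_spacing", "discharge_depth", "flow_rate"]

# Full 2^4 factorial design.
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

def run_simulation(row):
    """Placeholder for one near-field model run; here a synthetic initial-dilution
    response in which only three of the four factors matter."""
    d, s, h, q = row
    return 120 + 15 * s + 10 * h - 8 * q + rng.normal(0, 2)

dilution = np.array([run_simulation(row) for row in design])

# First-order model: estimate each effect and t-test it against zero.
X = np.column_stack([np.ones(len(design)), design])
beta, res_ss, *_ = np.linalg.lstsq(X, dilution, rcond=None)
dof = len(design) - X.shape[1]
sigma2 = res_ss[0] / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se
p = 2 * stats.t.sf(np.abs(t), dof)

for name, b, pv in zip(["intercept"] + factors, beta, p):
    flag = "significant" if pv < 0.05 else "not significant"
    print(f"{name:16s} effect={b:+7.2f}  p={pv:.4f}  ({flag})")
```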
44

Applications and optimization of response surface methodologies in high-pressure, high-temperature gauges

Hässig Fonseca, Santiago 05 July 2012 (has links)
High-Pressure, High-Temperature (HPHT) pressure gauges are commonly used in oil wells for pressure transient analysis. Mathematical models are used to relate input perturbations (e.g., flow rate transients) with output responses (e.g., pressure transients), and subsequently to solve an inverse problem that infers reservoir parameters. The indispensable use of pressure data in well testing motivates continued improvement in the accuracy (quality), sampling rate (quantity), and autonomy (lifetime) of pressure gauges. This body of work presents improvements in three areas of high-pressure, high-temperature quartz memory gauge technology: calibration accuracy, multi-tool signal alignment, and tool autonomy estimation. The discussion introduces the response surface methodology used to calibrate gauges, develops accuracy and autonomy estimates based on controlled tests, and, where applicable, relies on field gauge drill stem test data to validate accuracy predictions. Specific contributions of this work include:
- Application of the unpaired sample t-test, a first in quartz sensor calibration, which resulted in a reduction of uncertainty in gauge metrology by a factor of 2.25, and an improvement in absolute and relative tool accuracies of 33% and 56%, respectively. Greater accuracy yields more reliable data and a more sensitive characterization of well parameters.
- Post-processing of measurements from 2+ tools using a dynamic time warp algorithm that mitigates gauge clock drifts. Where manual alignment methods account only for linear shifts, the dynamic algorithm elastically corrects nonlinear misalignments accumulated throughout a job with an accuracy limited only by the clock's time resolution.
- Empirical modeling of tool autonomy based on gauge selection, battery pack, sampling mode, and average well temperature. A first of its kind, the model distills autonomy into two independent parameters, each a function of the same two orthogonal factors: battery power capacity and gauge current consumption as functions of sampling mode and well temperature -- a premise that, for 3+ gauge and battery models, reduces the design of future autonomy experiments by at least a factor of 1.5.
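The dynamic time warp used for multi-tool alignment is a classic algorithm. The sketch below is a minimal textbook implementation applied to synthetic gauge signals with an invented nonlinear clock drift; it is not the tool's own code.

```python
import numpy as np

def dtw_alignment(a, b):
    """Classic dynamic time warping: return the optimal warping path
    between two 1-D series (a textbook sketch)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    # Trace back the optimal path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

# Two gauges recording the same pressure transient; gauge B's clock drifts nonlinearly.
t = np.linspace(0, 10, 200)
gauge_a = np.exp(-0.3 * t) * np.sin(2 * t)
drifted_t = t + 0.02 * t**2                      # invented nonlinear clock drift
gauge_b = np.exp(-0.3 * drifted_t) * np.sin(2 * drifted_t)

path = dtw_alignment(gauge_a, gauge_b)
print("first aligned index pairs:", path[:5])
```

The returned index pairs map each sample of one gauge to its best-matching sample of the other, which is what allows nonlinear clock misalignment to be corrected after the job rather than assuming a single linear shift.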
45

Analytical Fragility Curves for Highway Bridges in Moderate Seismic Zones

Nielson, Bryant G. 23 November 2005 (has links)
Historical seismic events such as the San Fernando earthquake of 1971 and the Loma Prieta earthquake of 1989 did much to highlight the vulnerabilities in many existing highway bridges. However, it was not until 1990 that this awareness extended to moderate seismic regions such as the Central and Southeastern United States (CSUS). This relatively long neglect of seismic issues pertaining to bridges in these moderate seismic zones has resulted in a portfolio of existing bridges with seismic deficiencies which must be assessed and addressed. An emerging decision tool, whose use is becoming increasingly popular in the assessment of this seismic risk, is the seismic fragility curve. Fragility curves are conditional probability statements which give the probability of a bridge reaching or exceeding a particular damage level for an earthquake of a given intensity level. As much research has been devoted to the implementation of fragility curves in risk assessment packages, a great need has arisen for bridge fragility curves which are reliable, particularly for bridges in moderate seismic zones. The purpose of this study is to use analytical methods to generate fragility curves for nine bridge classes which are most common to the CSUS. This is accomplished by first considering the existing bridge inventory and assessing typical characteristics and details, from which detailed 3-D analytical models are created. The bridges are subjected to a suite of synthetic ground motions which were developed explicitly for the region. Probabilistic seismic demand models (PSDMs) are then generated using these analyses. From these PSDMs, fragility curves are generated by considering specific levels of damage which may be of interest. The fragility curves show that the most vulnerable of the nine bridge classes considered are those utilizing steel girders. Concrete girder bridges appear to be the next most vulnerable, followed by single-span bridges of all types. Various sources of uncertainty are considered and tracked throughout this study, which allows for their direct implementation into existing seismic risk assessment packages.
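The fragility-curve construction described here follows a standard pattern: fit a probabilistic seismic demand model to cloud-analysis results, then combine demand and capacity dispersions in a lognormal exceedance probability. The sketch below uses invented numbers (synthetic PGA-demand pairs and an illustrative capacity), not the study's bridge models or ground motions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic "cloud" analysis: peak ground acceleration (IM, in g) vs. column ductility demand.
pga = rng.uniform(0.05, 1.0, 60)
ln_demand = np.log(2.2) + 1.1 * np.log(pga) + rng.normal(0, 0.35, pga.size)

# Probabilistic seismic demand model: ln(D) = ln(a) + b*ln(IM) + error.
b, ln_a = np.polyfit(np.log(pga), ln_demand, 1)
resid = ln_demand - (ln_a + b * np.log(pga))
beta_d = resid.std(ddof=2)                            # demand dispersion given IM

# Illustrative damage-state capacity (median ductility and dispersion).
ductility_capacity, beta_c = 2.0, 0.35

def fragility(im):
    """P[demand >= capacity | IM] under lognormal demand and capacity."""
    median_demand = np.exp(ln_a + b * np.log(im))
    beta = np.hypot(beta_d, beta_c)
    return stats.norm.cdf(np.log(median_demand / ductility_capacity) / beta)

for im in (0.1, 0.3, 0.5, 0.8):
    print(f"PGA = {im:.1f} g -> P[exceed damage state] = {fragility(im):.2f}")
```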
46

Seismic Vulnerability Assessment of Retrofitted Bridges Using Probabilistic Methods

Padgett, Jamie Ellen 09 April 2007 (has links)
The central focus of this dissertation is a seismic vulnerability assessment of retrofitted bridges. The objective of this work is to establish a methodology for the development of system-level fragility curves for typical classes of retrofitted bridges using a probabilistic framework. These tools could provide valuable support for risk mitigation efforts in the region by quantifying the impact of retrofit on potential levels of damage over a range of earthquake intensities. The performance evaluation includes the development of high-fidelity three-dimensional nonlinear analytical models of bridges retrofitted with a range of retrofit measures, and characterization of their response under seismic loading. Sensitivity analyses were performed to establish the appropriate level of uncertainty treatment for modeling, assessing, and propagating the sources of uncertainty inherent in a seismic performance evaluation for portfolios of structures. Seismic fragility curves are developed to depict the impact of various retrofit devices on the seismic vulnerability of bridge systems. This work provides the first set of fragility curves for a range of bridge types and retrofit measures. A framework for their use in decision making for the identification of viable retrofit measures, performance-based retrofit of bridges, and cost-benefit analyses is illustrated. The fragility curves developed as a part of this research will fill a major gap in existing seismic risk assessment software and enable decision makers to quantify the benefits of various retrofits.
48

Sequential Adaptive Designs In Computer Experiments For Response Surface Model Fit

Lam, Chen Quin 29 July 2008 (has links)
No description available.
49

An efficient technique for structural reliability with applications

Janajreh, Ibrahim Mustafa 28 July 2008 (has links)
An efficient reliability technique has been developed based on Response Surface Methodology (RSM) in conjunction with the First Order Second Moment (FOSM) reliability method. The technique is applied when the limit state function cannot be obtained explicitly in terms of the design variables, i.e., when the analysis is performed using numerical techniques such as finite elements. The technique has proven to be efficient because it can handle problems with large numbers of design variables and with correlated as well as non-normal random variables. When compared with analytical results, the method has shown excellent agreement. The technique contains a sensitivity analysis scheme which can be used to reduce the computation time while retaining nearly the same accuracy. This technique allows most finite element codes to be extended to account for probabilistic analysis, where statistical variations can be added to the design variables. An explicit solution for rocket motors consisting of propellant and a steel case under environmental temperature variations is compared to the RSM technique. The method is then used for the analysis of rocket motors subjected to mechanical loads, for which the stress analysis is performed using the finite element method. The technique is also applied to study the reliability of a laminated composite plate with geometric nonlinearity subjected to static and time-dependent loadings. Different failure modes were considered, as well as different meshes. Results have shown that when the relative size of the element is introduced into the probabilistic model, the same reliability value is obtained regardless of the number of elements in the mesh. This is advantageous because it allows the technique to be used for problems where the failure region is unknown. / Ph. D.
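A minimal sketch of the RSM-plus-FOSM idea: replace the expensive analysis with a fitted quadratic response surface around the mean point, then compute a mean-value FOSM reliability index from that surface. The limit state, random-variable statistics, and design points below are invented for illustration; the dissertation's correlation handling and non-normal transformations are not reproduced.

```python
import numpy as np

def limit_state(x):
    """Stand-in for an expensive finite element analysis: g < 0 means failure."""
    load, strength = x
    return strength - 1.8 * load - 0.05 * load**2

# Random variables: means and (uncorrelated) standard deviations.
mu = np.array([10.0, 30.0])       # load, strength
sigma = np.array([2.0, 3.0])
cov = np.diag(sigma**2)

# Fit a quadratic response surface to a handful of "analysis" runs
# at the mean point and axial points one standard deviation away.
pts = [mu]
for i in range(2):
    for s in (-1.0, 1.0):
        p = mu.copy()
        p[i] += s * sigma[i]
        pts.append(p)
pts = np.array(pts)
g_vals = np.array([limit_state(p) for p in pts])

def basis(p):
    x1, x2 = p
    return [1.0, x1, x2, x1**2, x2**2]

A = np.array([basis(p) for p in pts])
coef, *_ = np.linalg.lstsq(A, g_vals, rcond=None)

def g_hat(p):
    """Fitted response surface standing in for the true limit state."""
    return np.dot(basis(p), coef)

# FOSM (mean-value) reliability index from the fitted surface.
eps = 1e-4
grad = np.array([(g_hat(mu + eps * np.eye(2)[i]) - g_hat(mu - eps * np.eye(2)[i])) / (2 * eps)
                 for i in range(2)])
beta = g_hat(mu) / np.sqrt(grad @ cov @ grad)
print(f"approximate reliability index beta = {beta:.2f}")
```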
50

Progressive Validity Metamodel Trust Region Optimization

Thomson, Quinn Parker 26 February 2009 (has links)
The goal of this work was to develop metamodels for the MDO framework piMDO and to provide new research in metamodeling strategies. The theory of existing metamodels is presented and implementation details are given. A new trust region scheme, metamodel trust region optimization (MTRO), was developed. This method uses a progressive level of minimum validity in order to reduce the number of sample points required for the optimization process. Higher levels of validity require denser point distributions, but the shrinking of the region during the optimization process mitigates the increase in the number of points required. New metamodeling strategies include inherited optimal Latin hypercube sampling, hybrid Latin hypercube sampling, and kriging with BFGS. MTRO performs better than traditional trust region methods for single-discipline problems and is competitive against other MDO architectures when used with a CSSO algorithm. Advanced metamodeling methods proved to be inefficient in trust region methods.
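A bare-bones version of the ingredients named above, Latin hypercube sampling inside a shrinking trust region with a quadratic metamodel, is sketched below. The objective function, sample sizes, and acceptance rule are invented for illustration; the progressive validity levels and kriging options of MTRO are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_pts, n_dim):
    """Basic Latin hypercube sample on [0, 1]^n_dim: one point per stratum in each dimension."""
    cut = (np.arange(n_pts) + rng.uniform(0, 1, (n_dim, n_pts))) / n_pts
    for d in range(n_dim):
        rng.shuffle(cut[d])
    return cut.T

def expensive(x):
    """Toy objective standing in for a high-fidelity discipline analysis."""
    return (x[0] - 0.3)**2 + 2.0 * (x[1] - 0.7)**2

# A bare-bones trust-region loop with a quadratic metamodel.
center, radius = np.array([0.8, 0.2]), 0.4
for it in range(8):
    pts = center + (latin_hypercube(12, 2) - 0.5) * 2 * radius
    y = np.array([expensive(p) for p in pts])
    A = np.column_stack([np.ones(len(pts)), pts, pts**2, pts[:, :1] * pts[:, 1:]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    best = pts[np.argmin(A @ coef)]           # sample with the lowest metamodel prediction
    if expensive(best) < expensive(center):   # accept the step, keep the region size
        center = best
    else:                                     # reject the step, shrink the region
        radius *= 0.5
    print(f"iter {it}: center = {np.round(center, 3)}, radius = {radius:.3f}")
```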
