11

Improved accuracy of surrogate models using output postprocessing

Andersson, Daniel January 2007 (has links)
Using surrogate approximations (e.g. Kriging interpolation or artificial neural networks) is an established technique for decreasing the execution time of simulation optimization problems. However, constructing surrogate approximations can be impossible when facing complex simulation inputs, and instead one is forced to use a surrogate model, which explicitly attempts to simulate the inner workings of the underlying simulation model. This dissertation investigated whether postprocessing the output of a surrogate model with an artificial neural network can increase its accuracy and value in simulation optimization problems. Results indicate that the technique has potential: when output postprocessing was enabled, the accuracy of the surrogate model increased, i.e. its output more closely matched the output of the real simulation model. However, no apparent improvement in optimization performance could be observed. It was speculated that this was because either the optimization algorithm used did not take advantage of the improved accuracy of the surrogate model, or the improvement in accuracy was too small to make any measurable impact. Further investigation of these issues must be conducted to gain a better understanding of the pros and cons of the technique.
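The core idea above, training a small network to map a surrogate's output closer to the true simulation output, can be sketched as follows. Everything here is invented for illustration (a toy simulation, a deliberately biased surrogate, and a tiny NumPy network), not the thesis's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulation(x):
    """Stand-in for the expensive 'real' simulation model."""
    return np.sin(3 * x) + 0.3 * x

def surrogate(x):
    """Cheap surrogate that misses the linear trend (a built-in bias)."""
    return np.sin(3 * x)

x_tr = np.linspace(-2, 2, 40)
X = np.column_stack([x_tr, surrogate(x_tr)])   # net sees input and surrogate output
y = simulation(x_tr)

# one-hidden-layer network trained by plain gradient descent on squared error
H, lr = 8, 0.05
W1 = rng.normal(0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2).ravel() - y
    gW2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean(keepdims=True)
    gh = err[:, None] @ W2.T * (1 - h ** 2)    # backprop through tanh
    W1 -= lr * (X.T @ gh / len(X)); b1 -= lr * gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2

x_te = np.linspace(-2, 2, 101)
X_te = np.column_stack([x_te, surrogate(x_te)])
post = (np.tanh(X_te @ W1 + b1) @ W2 + b2).ravel()
rmse_raw = float(np.sqrt(np.mean((surrogate(x_te) - simulation(x_te)) ** 2)))
rmse_post = float(np.sqrt(np.mean((post - simulation(x_te)) ** 2)))
```

With the postprocessing step enabled, the corrected output tracks the toy simulation more closely than the raw surrogate (rmse_post < rmse_raw), mirroring the accuracy gain reported above.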
12

Comparing Random forest and Kriging Methods for Surrogate Modeling

Asritha, Kotha Sri Lakshmi Kamakshi January 2020 (has links)
The issue with conducting real experiments in design engineering is the cost of finding an optimal design that fulfills all design requirements and constraints. An alternative to real experiments is computer-aided design modeling and computer-simulated experiments, which engineers conduct to understand functional behavior and to predict possible failure modes in design concepts. However, these simulations may take minutes, hours, or even days to finish. Surrogate modeling is used to reduce the time and number of simulations required for design space exploration.

The motive of surrogate modeling is to replace the original system with an approximation of the simulations that is quick to compute. The process of surrogate model generation includes sample selection, model generation, and model evaluation. Using surrogate models in design engineering can help reduce design cycle times and costs by enabling rapid analysis of alternative designs.

Selecting a suitable surrogate modeling method for a given function with specific requirements is possible by comparing different surrogate modeling methods on different application problems and evaluation metrics. In this thesis, we compare the random forest model and the kriging model based on prediction accuracy, using mathematical test functions. Quantitative experiments were conducted to investigate the performance of both methods. The experimental analysis shows that the kriging models achieve higher accuracy than the random forests, while the random forest models have lower execution time than kriging for the studied mathematical test problems.
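A minimal version of the comparison described above, assuming scikit-learn is available, fits a kriging (Gaussian process) model and a random forest to the standard Forrester test function and compares prediction RMSE. The sample sizes and kernel settings are illustrative choices, not the thesis's experimental design:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def forrester(x):
    """Common 1D benchmark: f(x) = (6x - 2)^2 sin(12x - 4) on [0, 1]."""
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

rng = np.random.default_rng(1)
x_tr = rng.uniform(0, 1, 25)            # training samples (the "simulations")
y_tr = forrester(x_tr)
x_te = np.linspace(0, 1, 200)           # dense test grid
y_te = forrester(x_te)

krig = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), normalize_y=True)
krig.fit(x_tr[:, None], y_tr)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(x_tr[:, None], y_tr)

rmse_krig = float(np.sqrt(np.mean((krig.predict(x_te[:, None]) - y_te) ** 2)))
rmse_rf = float(np.sqrt(np.mean((rf.predict(x_te[:, None]) - y_te) ** 2)))
```

On this smooth, noise-free function the kriging model interpolates almost exactly while the piecewise-constant forest lags behind, consistent with the accuracy ranking reported above; the forest is typically cheaper to fit and evaluate.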
13

Optimisation de structures viscoplastiques par couplage entre métamodèle multi-fidélité et modèles réduits / Structural design optimization by coupling multi-fidelity metamodels and reduced-order models

Nachar, Stéphane 11 October 2019 (has links)
Engineering simulation enables the best product designs by allowing many design options to be quickly explored and tested, but fast time-to-results remains a critical factor in meeting aggressive time-to-market constraints. In this context, a high-fidelity direct-resolution solver is not suitable for generating (virtual) charts for engineering design and optimization. Metamodels are commonly used to explore design options without computing every possibility, but if the behavior is nonlinear, a large amount of data is still required. One possibility is to use additional data sources to generate a multi-fidelity surrogate model by means of model reduction. Model reduction techniques are one way to bypass a limited calculation budget by seeking the solution of a problem on a reduced-order basis (ROB). The purpose of the present work is an online method for generating a multi-fidelity metamodel in which the quantity of interest is computed from bases generated on the fly with the LATIN-PGD framework for elasto-viscoplastic problems. Low-fidelity fields are obtained by stopping the solver before convergence, and high-fidelity information is obtained from the converged solution. In addition, the solver's ability to reuse information from previously computed PGD bases is exploited. This manuscript presents the contributions to multi-fidelity metamodels and to the LATIN-PGD method, with the implementation of a multi-parametric strategy. This coupling strategy was tested on three test cases, with calculation-time savings of more than 37x.
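The multi-fidelity idea above, combining many cheap but biased low-fidelity evaluations with a few high-fidelity ones, can be illustrated with a simple scale-plus-discrepancy bridge model. This is a much-simplified stand-in for the metamodels in the thesis; the functions and the form of the low-fidelity bias are invented so the bridge can recover the high-fidelity model exactly:

```python
import numpy as np

def hf(x):
    """High-fidelity response (stands in for a converged, expensive solve)."""
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def lf(x):
    """Low-fidelity response: biased but well correlated with hf, as might
    come from stopping a solver before convergence."""
    return 0.5 * hf(x) + 10 * x - 10

# a handful of expensive high-fidelity samples
x_hf = np.linspace(0, 1, 6)

# bridge model: hf(x) ~ rho * lf(x) + linear discrepancy, fit by least squares
A = np.column_stack([lf(x_hf), np.ones_like(x_hf), x_hf])
coef, *_ = np.linalg.lstsq(A, hf(x_hf), rcond=None)

x_te = np.linspace(0, 1, 101)
pred = coef[0] * lf(x_te) + coef[1] + coef[2] * x_te
err_mf = float(np.max(np.abs(pred - hf(x_te))))        # multi-fidelity error
err_lf = float(np.max(np.abs(lf(x_te) - hf(x_te))))    # raw low-fidelity error
```

A real discrepancy would not be linear; in practice the correction term is itself a surrogate (e.g. a kriging model), but the structure, low-fidelity trend plus learned correction, is the same.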
14

Surrogate Models for Seismic Response of Structures

Sanjay Nayak (16760970) 04 August 2023 (has links)
The seismic risks to a structure or a set of structures in a region are usually determined by generating fragility curves that provide the probability of a building responding in a certain manner for a given level of ground motion intensity. Developing fragility curves, however, is challenging as it involves the computationally expensive task of obtaining the maximum response of the selected structures to a suite of ground motions representing the seismic hazard of the selected region.

This study presents a methodology to develop surrogate models for the prediction of the maximum responses of buildings to ground motion excitation. Data-driven surrogate models using simple machine learning techniques, and physics-based surrogate models using the space mapping technique to map low-fidelity responses from a multi-degree-of-freedom shear building model to high-fidelity values, are developed to predict the maximum roof drift ratio and the maximum story drift ratio of a chosen 15-story steel moment-resisting frame building with varying structural properties in California. The predictions of each of these surrogate models are analyzed to assess and compare the performance, capabilities, and limitations of the models. Best practices for developing surrogate models for the prediction of maximum responses of structures to ground motion are recommended.

The results from the development of data-driven surrogate models show that spectral displacement is the best intensity measure on which to condition the maximum roof drift ratio, and spectral velocity is the best intensity measure on which to condition the maximum story drift ratio. Fragility analysis of the structure is thus conducted using the maximum story drift as the engineering demand parameter and spectral velocity as the intensity measure. Monte Carlo simulation is conducted using the physics-based surrogate model to estimate the maximum story drifts for ground motions that are incrementally scaled to different intensity levels. Maximum likelihood estimates are used to obtain the parameters of a lognormal distribution, and 95% confidence intervals are obtained using the Wald confidence interval to plot the fragility curves.

Fragility curves are plotted both with and without variations in the structural properties of the building, and it is found that the effects of variability in ground motions on the fragility are far higher than the effects of the randomness of structural properties. Finally, it is found that about 65 ground motion records are needed for convergence of the parameters of the lognormal distribution when plotting fragility curves by Monte Carlo simulation.
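The fragility-fitting step described above can be sketched in a few lines: for lognormally distributed demands, the maximum likelihood estimates of the parameters are simply the mean and standard deviation of the log data, and exceedance probabilities follow from the normal CDF. The drift data below are synthetic, not the study's simulation results:

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# synthetic maximum story drifts from Monte Carlo at one intensity level,
# assumed (as in the study) to follow a lognormal distribution
drifts = rng.lognormal(mean=math.log(0.01), sigma=0.4, size=500)

# maximum likelihood estimates for a lognormal: mean/std of the log data
mu = float(np.mean(np.log(drifts)))     # log of the median demand
beta = float(np.std(np.log(drifts)))    # lognormal dispersion
se_mu = beta / math.sqrt(len(drifts))   # standard error for a Wald interval on mu

def p_exceed(limit):
    """Probability that the drift exceeds `limit` under the fitted lognormal."""
    z = (math.log(limit) - mu) / beta
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

wald_95 = (mu - 1.96 * se_mu, mu + 1.96 * se_mu)   # 95% Wald CI on mu
```

Evaluating p_exceed over a grid of intensity levels traces out a fragility curve; repeating the fit on bootstrap or CI-perturbed parameters gives the confidence band.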
15

Neural Networks as Surrogates for Computational Fluid Dynamics Predictions of Hypersonic Flows

Minsavage, Kaitlyn Emily January 2020 (has links)
No description available.
16

Optimization of chemical process simulation: Application to the optimal rigorous design of natural gas liquefaction processes

Santos, Lucas F. 30 June 2023 (has links)
Designing products and processes is a fundamental aspect of engineering that significantly impacts society and the world. Chemical process design aims to create more efficient and sustainable production processes that consume fewer resources and emit less pollution. Mathematical models that accurately describe process behavior are necessary to make informed and responsible decisions. However, as processes become more complex, purely symbolic formulations may be inadequate, and simulations using tailored computer code become necessary. The decision-making process in optimal design requires a procedure for choosing the best option while complying with the system's constraints, a task for which optimization approaches are well suited. This doctoral thesis focuses on the black-box optimization problems that arise when using process simulators in optimal process design tasks and assesses the potential of derivative-free, metaheuristic, and surrogate-based optimization approaches. The optimal design of natural gas liquefaction processes is the case study of this research. To overcome the numerical issues of black-box problems, the first work of this doctoral thesis applied the globally convergent Nelder-Mead simplex method to the optimal process design problem. The second work introduced surrogate models to guide the search towards the global optimum of the black-box problem, together with an adaptive sampling scheme comprising the optimization of an acquisition function with metaheuristics. Kriging surrogate models are computationally cheap and effective predictors, well suited to global search in simulation-optimization problems. The third work aims to overcome the limitations of acquisition-function optimization and the use of metaheuristics. The proposed comprehensive mathematical notation of the surrogate optimization problem was readily implementable in algebraic modeling language software.
The presented framework includes kriging models of the objective and constraint functions, an adaptive sampling procedure, a heuristic for stopping criteria, and a surrogate optimization problem readily solvable with mathematical programming. The success of the surrogate-based optimization framework relies on the kriging models' prediction accuracy with respect to the underlying simulation-based functions. The fourth publication extends the previous work to multi-objective black-box optimization problems. It applies the ε-constraint method to transform the multi-objective surrogate optimization problem into a sequence of single-objective ones. The ε-constrained surrogate optimization problems are implemented automatically in algebraic modeling language software and solved using a gradient-based, state-of-the-art solver. The fifth publication is application-driven and focuses on identifying the most suitable mixed-refrigerant refrigeration technology for natural gas liquefaction in terms of energy consumption and costs. The study investigates five natural gas liquefaction processes using particle swarm optimization and concludes that there are flaws in the expected relationships between process complexity, energy consumption, and total annualized costs. In conclusion, the research conducted in this doctoral thesis demonstrates the importance and capabilities of applying optimization to process simulators. The work presented here highlights the potential of surrogate-based optimization approaches to significantly reduce the computational cost and guide the search in black-box optimization problems with embedded chemical process simulators. Overall, this doctoral thesis contributes to developing optimization strategies for complex chemical processes that are essential for addressing some of the most pressing current environmental and social challenges.
The methods and insights presented in this work can help engineers and scientists design more sustainable and efficient processes, contributing to a better future for all.
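A bare-bones sketch of surrogate-based optimization with adaptive sampling, in the spirit of the framework above but far simpler: an RBF interpolant stands in for kriging, a greedy minimize-the-surrogate rule stands in for an acquisition function, and the black box is the Forrester test function rather than a process simulator:

```python
import numpy as np

def expensive(x):
    """Black-box stand-in (the Forrester function, not a real simulator)."""
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

def rbf_fit(xs, ys):
    K = np.exp(-(((xs[:, None] - xs[None, :]) / 0.1) ** 2))
    return np.linalg.solve(K + 1e-10 * np.eye(len(xs)), ys)   # jittered solve

def rbf_eval(xs, w, x):
    return np.exp(-(((x[:, None] - xs[None, :]) / 0.1) ** 2)) @ w

xs = np.linspace(0, 1, 5)                 # initial space-filling design
ys = expensive(xs)
grid = np.linspace(0, 1, 501)
for _ in range(8):                        # adaptive infill loop
    w = rbf_fit(xs, ys)
    cand = grid[np.argmin(rbf_eval(xs, w, grid))]   # minimize the surrogate
    if np.min(np.abs(cand - xs)) < 1e-9:  # stop when the infill is a duplicate
        break
    xs = np.append(xs, cand)              # evaluate the truth only at the infill
    ys = np.append(ys, expensive(cand))

best = float(xs[np.argmin(ys)])
```

The expensive function is called only at the initial design and the infill points, which is the whole point of the approach; a proper acquisition function (e.g. expected improvement) would also balance exploration against this purely exploitative rule.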
17

Application and Evaluation of Full-Field Surrogate Models in Engineering Design Space Exploration

Thelin, Christopher Murray 01 July 2019 (has links)
When designing an engineering part, better decisions are made by exploring the entire space of design variations. This design space exploration (DSE) may be accomplished manually or via optimization. In engineering, evaluating a design during DSE often consists of running expensive simulations, such as finite element analysis (FEA), in order to understand the structural response to design changes. The computational cost of these simulations can make thorough DSE infeasible, so only a relatively small subset of the designs is explored. Surrogate models have been used to make cheap predictions of certain simulation results. Commonly, these models only predict single values (SV) that are meant to represent an entire part's response, such as a maximum stress or average displacement. However, single values cannot return a complete prediction of the detailed nodal results of these simulations. Recently, surrogate models have been developed that can predict the full field (FF) of nodal responses. These FF surrogate models have the potential to make thorough and detailed DSE much more feasible and introduce further design benefits. However, FF surrogate models have not yet been applied to real engineering activities or demonstrated in DSE contexts, nor have they been directly compared with SV surrogate models in terms of accuracy and benefits.

This thesis seeks to build confidence in FF surrogate models for engineering work by applying them to real DSE and engineering activities and exploring their benefits relative to SV surrogate models. A user experiment exploring the effects of FF surrogate models in simple DSE activities helps to validate previous claims that FF surrogate models can enable interactive DSE. FF surrogate models are used to create Goodman diagrams for fatigue analysis and are found to be more accurate than SV surrogate models in predicting fatigue risk. Mode shapes are also predicted; when the data are highly nonlinear, FF surrogate models are found to require more training samples than SV surrogate models to make accurate mode-comparison predictions. Finally, FF surrogate models enable spatially defined objectives and constraints in optimization routines that efficiently search a design space and improve designs.

The studies in this work demonstrate many unique FF-enabled design benefits for real engineering work. These include predicting a complete (rather than a summary) response, enabling interactive DSE of complex simulations, new three-dimensional visualizations of analysis results, and increased accuracy.
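One common way to build a full-field surrogate, not necessarily the one used in this thesis, is to compress the nodal fields with a POD/SVD basis and regress the modal coefficients on the design variables. The sketch below uses a synthetic two-mode field in place of FEA results:

```python
import numpy as np

# synthetic "nodal" results: 200-node response fields as a design variable t varies
nodes = np.linspace(0, 1, 200)
t_train = np.linspace(0.5, 1.5, 15)

def field(t):
    return t * np.sin(np.pi * nodes) + t ** 2 * np.sin(2 * np.pi * nodes)

snapshots = np.array([field(t) for t in t_train])   # (15 designs, 200 nodes)

# POD/SVD compression of the fields
mean_field = snapshots.mean(axis=0)
_, _, Vt = np.linalg.svd(snapshots - mean_field, full_matrices=False)
basis = Vt[:2]                                      # keep two dominant modes
coeffs = (snapshots - mean_field) @ basis.T         # (15, 2) modal coefficients

# regress each modal coefficient on the design variable (quadratic fit)
polys = [np.polyfit(t_train, coeffs[:, j], 2) for j in range(2)]

def predict_field(t):
    c = np.array([np.polyval(p, t) for p in polys])
    return mean_field + c @ basis

t_new = 1.13                                        # an unseen design
err = float(np.max(np.abs(predict_field(t_new) - field(t_new))))
```

The surrogate returns the entire nodal field for a new design, so spatially defined quantities (local stresses, mode shapes, contour plots) come for free, which is exactly what an SV surrogate cannot provide.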
18

Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification

Taheri, Mehdi 01 July 2016 (has links)
The application of stochastic modeling for learning the behavior of multibody dynamics models is investigated. The stochastic modeling technique is also known as Kriging or the random function approach. Post-processed data from a simulation run are used to train the stochastic model that estimates the relationship between model inputs, such as the suspension relative displacement and velocity, and the output, for example, the sum of suspension forces. The computational efficiency of Multibody Dynamics (MBD) models can be improved by replacing their computationally intensive subsystems with stochastic predictions. The stochastic modeling technique is able to learn the behavior of a physical system and integrate that behavior into MBD models, resulting in improved real-time simulations and reduced computational effort in models with repeated substructures (for example, modeling a train with a large number of rail vehicles). Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, various sampling plans are investigated, and a space-filling Latin hypercube sampling plan based on the traveling salesman problem (TSP) is suggested for efficiently representing the entire parameter space. The simulation results confirm the expected increase in modeling efficiency, although further research is needed to improve the accuracy of the predictions. The prediction accuracy is expected to improve by employing a sampling strategy that considers the discrete nature of the training data and uses infill criteria that consider the shape of the output function and detect sample spaces with high prediction errors. It is recommended that future efforts quantify the computational efficiency of the proposed approach by overcoming the inefficiencies associated with transferring data between multiple software packages, which proved to be a limiting factor in this study. These limitations can be overcome by using the user subroutine functionality of SIMPACK and adding the stochastic modeling technique to its force library. / Ph. D.
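The sampling plan above starts from a Latin hypercube design; a minimal NumPy implementation (without the TSP-based space-filling optimization mentioned in the abstract) looks like this:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0,1]^d with exactly one point in each of n bins per dimension."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # jitter inside the bins
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])                 # decouple the dimensions
    return u

rng = np.random.default_rng(3)
pts = latin_hypercube(10, 3, rng)
```

Every one-dimensional projection is stratified (one sample per bin), which is what makes the plan efficient for training surrogates; space-filling variants then reorder or select among such designs to also spread points out in the full d-dimensional space.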
19

Linear Parameter Uncertainty Quantification using Surrogate Gaussian Processes

Macatula, Romcholo Yulo 21 July 2020 (has links)
We consider uncertainty quantification using surrogate Gaussian processes. We take a previous sampling algorithm and provide a closed-form expression of the resulting posterior distribution. We extend the method to weighted least squares and to a Bayesian approach, both with closed-form expressions of the resulting posterior distributions. We test the methods on 1D deconvolution and 2D tomography. Our new methods improve on the previous algorithm, but fall short of a typical Bayesian inference method in some respects. / Master of Science / Parameter uncertainty quantification seeks to determine both estimates of model parameters and the uncertainty regarding those estimates. Examples of model parameters include physical properties such as density, growth rates, or even deblurred images. Previous work has shown that replacing data with a surrogate model can provide promising estimates with low uncertainty. We extend the previous methods in the specific setting of linear models. Theoretical results are tested on simulated computed tomography problems.
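For reference, the closed-form Gaussian process posterior that such surrogate methods build on can be written in a few lines of NumPy; the RBF kernel and its hyperparameters here are illustrative, not the thesis's choices:

```python
import numpy as np

def gp_posterior(x_tr, y_tr, x_te, ls=0.5, noise=1e-8):
    """Closed-form GP posterior (mean, covariance) with an RBF kernel."""
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)
    K = k(x_tr, x_tr) + noise * np.eye(len(x_tr))   # regularized Gram matrix
    Ks = k(x_te, x_tr)
    mean = Ks @ np.linalg.solve(K, y_tr)            # posterior mean
    cov = k(x_te, x_te) - Ks @ np.linalg.solve(K, Ks.T)   # posterior covariance
    return mean, cov

x_tr = np.array([-1.0, 0.0, 1.0])
y_tr = np.sin(x_tr)
mean, cov = gp_posterior(x_tr, y_tr, np.array([0.0, 0.5]))
```

At an observed point the posterior mean reproduces the data and the variance collapses to (near) zero; between observations the variance stays positive, which is the uncertainty the quantification methods propagate.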
20

High-fidelity multidisciplinary design optimization of a 3D composite material hydrofoil

Volpi, Silvia 01 May 2018 (has links)
Multidisciplinary design optimization (MDO) refers to the process of designing systems characterized by the interaction of multiple interconnected disciplines. High-fidelity MDO usually requires large computational resources due to the cost of achieving multidisciplinary consistent solutions by coupling high-fidelity physics-based solvers. Gradient-based minimization algorithms are generally applied to find local minima, due to their efficiency in solving problems with a large number of design variables. This limits the ability to perform global MDO and to integrate black-box analysis tools, which usually do not provide gradient information. These issues generally inhibit a wide use of MDO in complex industrial applications. An architecture named multi-criterion adaptive sampling MDO (MCAS-MDO) is presented in the current research for complex simulation-based applications. This research aims at building a global derivative-free optimization tool able to employ high-fidelity/expensive black-box solvers for the analysis of the disciplines. MCAS-MDO is a surrogate-based architecture featuring a variable level of coupling among the disciplines and is driven by a multi-criterion adaptive sampling (MCAS) procedure assessing coupling and sampling uncertainties. MCAS uses the dynamic radial basis function surrogate model to identify the optimal solution and explore the design space through parallel infill of new solutions. The MCAS-MDO is tested against a global derivative-free multidisciplinary feasible (MDF) approach, which solves fully coupled multidisciplinary analyses, on two analytical test problems. Evaluation metrics include the number of function evaluations required to achieve the optimal solution and the sample distribution. The MCAS-MDO outperforms the MDF, showing faster convergence by clustering refined function evaluations in the optimum region.
The architecture is applied to a steady fluid-structure interaction (FSI) problem, namely the design of a tapered three-dimensional carbon fiber-reinforced plastic hydrofoil for minimum drag. The objective is the design of the shape and composite material layout subject to hydrodynamic, structural, and geometrical constraints. Experimental data are available for the original configuration of the hydrofoil and allow validation of the FSI analysis, which couples computational fluid dynamics, solving the Reynolds-averaged Navier-Stokes equations, with finite elements, solving the structural equation of elastic motion. Hydrofoil forces, tip displacement, and tip twist are evaluated for several materials, providing qualitative agreement with the experiments and confirming the need for a two-way rather than one-way coupling approach in the case of significantly compliant structures. The free-form deformation method is applied to generate shape modifications of the hydrofoil geometry. To reduce the global computational expense of the optimization, a design space assessment and dimensionality reduction based on the Karhunen–Loève expansion (KLE) is performed off-line, i.e. without the need for high-fidelity simulations. It provides a selection of design variables for the problem at hand through basis rotation and re-parametrization. Using the KLE, an efficient design space is identified for the current problem and the number of design variables is reduced by 92%. A sensitivity analysis is performed prior to the optimization to assess the variability associated with the shape design variables and the composite material design variable, i.e. the fiber orientation. These simulations are used to initialize the surrogate model for the optimization, which is carried out for two models: one in aluminum and one in composite material.
The optimized designs are assessed by comparison with the original models through evaluation of the flow field, pressure distribution on the body, and deformation under the hydrodynamic load. The drag of the aluminum and composite material hydrofoils is reduced by 4 and 11%, respectively, increasing the hydrodynamic efficiency by 4 and 7%. The optimized designs are obtained by evaluating approximately 100 designs. The quality of the results indicates that global derivative-free MDO of complex engineering applications using expensive black-box solvers can be achieved at a feasible computational cost by minimizing the design space dimensionality and performing an intelligent sampling to train the surrogate-based optimization.
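The KLE-based dimensionality reduction described above can be sketched as an SVD of centered snapshots, keeping the modes that capture a chosen fraction of the variance. The snapshot data below are synthetic (four latent modes plus noise), not the hydrofoil geometry:

```python
import numpy as np

rng = np.random.default_rng(4)

# snapshot matrix: 50 candidate shapes x 300 surface points, generated here
# from 4 smooth latent modes of decreasing importance plus small noise
s = np.linspace(0, 1, 300)
latent = np.array([np.sin((k + 1) * np.pi * s) for k in range(4)])
weights = rng.normal(size=(50, 4)) * np.array([3.0, 2.0, 1.0, 0.5])
snapshots = weights @ latent + 1e-3 * rng.normal(size=(50, 300))

# Karhunen-Loeve expansion via SVD of the centered snapshots
centered = snapshots - snapshots.mean(axis=0)
_, sing, _ = np.linalg.svd(centered, full_matrices=False)
energy = np.cumsum(sing ** 2) / np.sum(sing ** 2)
k95 = int(np.searchsorted(energy, 0.95) + 1)   # modes capturing 95% of variance
reduction = 1 - k95 / centered.shape[1]        # fraction of variables removed
```

Optimizing over the retained modal coefficients instead of the raw geometric parameters is what produces the large dimensionality reduction reported above; the retained fraction of variance is a tunable trade-off between design-space coverage and optimization cost.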
