21 |
Novel computational methods for stochastic design optimization of high-dimensional complex systems
Ren, Xuchun 01 January 2015 (has links)
The primary objective of this study is to develop new computational methods for robust design optimization (RDO) and reliability-based design optimization (RBDO) of high-dimensional, complex engineering systems. Four major research directions, all anchored in polynomial dimensional decomposition (PDD), have been defined to meet the objective. They involve: (1) development of new sensitivity analysis methods for RDO and RBDO; (2) development of novel optimization methods for solving RDO problems; (3) development of novel optimization methods for solving RBDO problems; and (4) development of a novel scheme and formulation to solve stochastic design optimization problems with both distributional and structural design parameters.
The major achievements are as follows. Firstly, three new computational methods were developed for calculating design sensitivities of statistical moments and reliability of high-dimensional complex systems subject to random inputs. The first method represents a novel integration of PDD of a multivariate stochastic response function and score functions, leading to analytical expressions of design sensitivities of the first two moments. The second and third methods, relevant to probability distribution or reliability analysis, exploit two distinct combinations built on PDD: the PDD-SPA method, entailing the saddlepoint approximation (SPA) and score functions; and the PDD-MCS method, utilizing the embedded Monte Carlo simulation (MCS) of the PDD approximation and score functions. For all three methods, the statistical moments or failure probabilities and their design sensitivities are determined concurrently from a single stochastic analysis or simulation. Secondly, four new methods were developed for RDO of complex engineering systems. The methods involve PDD of a high-dimensional stochastic response for statistical moment analysis, a novel integration of PDD and score functions for calculating the second-moment sensitivities with respect to the design variables, and standard gradient-based optimization algorithms. The methods, depending on how statistical moment and sensitivity analyses are dovetailed with an optimization algorithm, encompass direct, single-step, sequential, and multi-point single-step design processes. Thirdly, two new methods were developed for RBDO of complex engineering systems.
The methods involve an adaptive-sparse polynomial dimensional decomposition (AS-PDD) of a high-dimensional stochastic response for reliability analysis, a novel integration of AS-PDD and score functions for calculating the sensitivities of the failure probability with respect to design variables, and standard gradient-based optimization algorithms, resulting in a multi-point, single-step design process. The two methods, depending on how the failure probability and its design sensitivities are evaluated, exploit two distinct combinations built on AS-PDD: the AS-PDD-SPA method, entailing SPA and score functions; and the AS-PDD-MCS method, utilizing the embedded MCS of the AS-PDD approximation and score functions. In addition, a new method, named the augmented PDD method, was developed for RDO and RBDO subject to mixed design variables, comprising both distributional and structural design variables. The method comprises a new augmented PDD of a high-dimensional stochastic response for statistical moment and reliability analyses; an integration of the augmented PDD, score functions, and finite-difference approximation for calculating the sensitivities of the first two moments and the failure probability with respect to distributional and structural design variables; and standard gradient-based optimization algorithms, leading to a multi-point, single-step design process.
The innovative formulations of statistical moment and reliability analysis, design sensitivity analysis, and optimization algorithms have achieved not only highly accurate but also computationally efficient design solutions. Therefore, these new methods are capable of performing industrial-scale design optimization with numerous design variables.
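The score-function idea behind these sensitivity methods can be illustrated with a minimal one-dimensional sketch: for a Gaussian input, a single sample set yields both a statistical moment and its derivative with respect to the distribution mean. The response function, parameters, and sample size below are illustrative assumptions, not taken from the thesis, and the full methods replace the raw response with its PDD approximation.

```python
import random

def moment_and_sensitivity(y, mu, sigma, n=200_000, seed=0):
    """Estimate E[Y(X)] and dE[Y]/dmu from ONE sample set via the score function.

    For X ~ Normal(mu, sigma), the score of the density w.r.t. mu is
    s(x) = (x - mu) / sigma**2, so dE[Y]/dmu = E[Y(X) * s(X)]."""
    rng = random.Random(seed)
    m = g = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        yx = y(x)
        m += yx                            # accumulate the moment estimate
        g += yx * (x - mu) / sigma**2      # accumulate the sensitivity estimate
    return m / n, g / n

# Check against a closed form: Y = X**2 with X ~ N(1.5, 1)
# => E[Y] = mu**2 + 1 = 3.25 and dE[Y]/dmu = 2*mu = 3.0
mean, sens = moment_and_sensitivity(lambda x: x * x, mu=1.5, sigma=1.0)
```

Both estimates come from the same 200,000 samples, which is the "single stochastic analysis" property the abstract highlights.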
|
22 |
Confidence-based model validation for reliability assessment and its integration with reliability-based design optimization
Moon, Min-Yeong 01 August 2017 (has links)
Conventional reliability analysis methods assume that a simulation model represents the real physics accurately. However, this assumption may not always hold, as the simulation model could be biased due to simplifications and idealizations. Simulation models are approximate mathematical representations of real-world systems and thus cannot exactly imitate them. The accuracy of a simulation model is especially critical when it is used for reliability calculation. Therefore, a simulation model should be validated using prototype testing results for reliability analysis. However, in practical engineering situations, experimental output data for model validation are limited because of the significant cost of extensive physical testing. Thus, model validation needs to account for the uncertainty induced by insufficient experimental output data, as well as for the inherent variability existing in the physical system and hence in the experimental test results. Therefore, in this study, a confidence-based model validation method that captures the variability and the uncertainty, and that corrects model bias at a user-specified target confidence level, has been developed. Reliability assessment using confidence-based model validation can provide a conservative estimate of the reliability of a system, with confidence, when only insufficient experimental output data are available.
Without confidence-based model validation, a product designed at the conventional reliability-based design optimization (RBDO) optimum could either fail to satisfy the target reliability or be overly conservative. Therefore, simulation model validation is necessary to obtain a reliable optimum product through the RBDO process. In this study, the developed confidence-based model validation is integrated into the RBDO process to provide a truly confident RBDO optimum design. The developed confidence-based model validation provides a conservative RBDO optimum design at the target confidence level. However, it is challenging to obtain steady convergence in the RBDO process with confidence-based model validation because the feasible domain changes as the design moves (i.e., a moving-target problem). To resolve this issue, a practical optimization procedure, which terminates the RBDO process once the target reliability is satisfied, is proposed. In addition, efficiency is achieved by carrying out deterministic design optimization (DDO) and RBDO without model validation, followed by RBDO with confidence-based model validation. Numerical examples are presented to demonstrate that the proposed RBDO approach obtains a conservative and practical optimum design that satisfies the target reliability of the designed product given a limited number of experimental output data.
Thus far, while the simulation model might be biased, it has been assumed that correct distribution models are available for the input variables and parameters. However, in real applications, only limited test data are available (parameter uncertainty) for modeling the input distributions of material properties, manufacturing tolerances, operational loads, etc. Also, as before, only a limited number of output test data is used. Therefore, reliability needs to be estimated by considering parameter uncertainty as well as the biased simulation model. Computational methods and a process are developed to obtain a confidence-based reliability assessment. Insufficient input and output test data induce uncertainties in the input distribution models and output distributions, respectively. These uncertainties, which arise from lack of knowledge (the insufficient test data), are different from the inherent input distributions and corresponding output variabilities, which are the natural randomness of the physical system.
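The general idea of a conservative, confidence-based reliability estimate from scarce output data can be sketched with a simple bootstrap lower bound. This is a hedged stand-in only: the thesis's actual method corrects model bias at a target confidence level, whereas the data, threshold, and percentile rule below are illustrative assumptions.

```python
import random

def reliability_lower_bound(samples, threshold, confidence=0.95, n_boot=2000, seed=0):
    """Bootstrap a conservative (lower) confidence bound on P(Y < threshold)
    from a limited set of experimental outputs."""
    rng = random.Random(seed)
    n = len(samples)
    estimates = []
    for _ in range(n_boot):
        resample = [samples[rng.randrange(n)] for _ in range(n)]
        estimates.append(sum(y < threshold for y in resample) / n)
    estimates.sort()
    # report a low percentile so the stated reliability is conservative
    return estimates[int((1.0 - confidence) * n_boot)]

# 30 hypothetical test outputs, a stand-in for scarce experimental data
data_rng = random.Random(1)
data = [data_rng.gauss(10.0, 1.0) for _ in range(30)]
point = sum(y < 12.0 for y in data) / len(data)     # point estimate of reliability
conservative = reliability_lower_bound(data, 12.0)  # confidence-based estimate
```

The conservative bound never exceeds the point estimate, mirroring the abstract's claim that confidence-based assessment yields a deliberately cautious reliability statement.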
|
23 |
Métamodèles adaptatifs pour l'optimisation fiable multi-prestations de la masse de véhicules / Adaptive surrogate models for the reliable lightweight design of automotive body structures
Moustapha, Maliki 27 January 2016 (has links)
This thesis is part of the work carried out by PSA Peugeot Citroën to reduce the weight of its vehicles. Multi-performance mass optimizations performed on the body structure contribute directly to this effort by seeking a minimum-mass allocation of sheet-metal thicknesses that satisfies physical specifications for various performance criteria (crash, vibro-acoustics, etc.). These specifications are generally evaluated with very high-fidelity numerical models whose run times are particularly long. Resorting to substitution functions, known as surrogate models, is then the only alternative for conducting an optimization study within project deadlines. However, the performance criterion of interest here, frontal crash, has particular features (high dimensionality, strong non-linearities, physical and numerical scatter) that make it difficult to metamodel. The objective of the thesis is therefore to propose an optimization approach based on adaptive surrogate models in order to unlock further mass savings. This requires accounting for the frontal crash, whose chaotic character is exacerbated by the presence of uncertainties. We thus propose a reliability-based optimization method that introduces quantiles as a measure of conservatism. The approach is based on Kriging models with adaptive enrichment so as to minimize the number of calls to the finite element models. An application to a full vehicle validates the method. / One of the most challenging tasks in modern engineering is keeping the cost of manufactured goods small. With the advent of computational design, prototyping, for instance, a major source of expense, is reduced to its bare essentials. In fact, through the use of high-fidelity models, engineers can predict the behavior of the systems they design quite faithfully.
To be fully realistic, such models must embed uncertainties that may affect the physical properties or operating conditions of the system. This PhD thesis deals with the constrained optimization of structures under uncertainties in the context of automotive design. The constraints are assessed through expensive finite element models. For practical purposes, such models are conveniently substituted by so-called surrogate models, which stand as cheap and easy-to-evaluate proxies. In this PhD thesis, Gaussian process modeling and support vector machines are considered. Upon reviewing state-of-the-art techniques for optimization under uncertainties, we propose a novel formulation for reliability-based design optimization which relies on quantiles. The formal equivalence of this formulation with the traditional ones is proved. This approach is then coupled with surrogate modeling. Kriging is considered thanks to its built-in error estimate, which makes it well suited to adaptive sampling strategies. Such an approach allows us to reduce the computational budget by running the true model only in regions that are of interest to the optimization. We therefore propose a two-stage enrichment scheme. The first stage is aimed at globally reducing the Kriging epistemic uncertainty in the vicinity of the limit-state surface. The second one is performed within iterations of optimization so as to locally improve the quantile accuracy. The efficiency of this approach is demonstrated through comparison with benchmark results. An industrial application featuring a car under frontal impact is considered. The crash behavior of a car is indeed particularly affected by uncertainties. The proposed approach therefore allows us to find a reliable solution within a reduced number of calls to the true finite element model.
For the extreme case where uncertainties trigger various crash scenarios of the car, it is proposed to rely on support vector machines for classification so as to predict the possible scenarios before metamodeling each of them separately.
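The quantile reformulation of the reliability constraint can be sketched by Monte Carlo: requiring a failure probability below pf is equivalent to requiring the corresponding quantile of the performance function to be non-positive. The toy limit state, distribution, and sample size below are assumptions for illustration; the thesis evaluates such quantiles on a Kriging surrogate rather than the true model.

```python
import random

def mc_quantile(g, sampler, level, n=100_000, seed=0):
    """Monte Carlo estimate of the `level`-quantile of a performance function g(X)."""
    rng = random.Random(seed)
    values = sorted(g(sampler(rng)) for _ in range(n))
    return values[min(int(level * n), n - 1)]

# Toy limit state: failure when g(X) = X - 3 > 0, with X ~ N(0, 1).
# The constraint P(g > 0) <= pf is equivalent to Q_{1-pf}(g) <= 0.
pf_target = 1e-2
q = mc_quantile(lambda x: x - 3.0, lambda rng: rng.gauss(0.0, 1.0), 1.0 - pf_target)
feasible = q <= 0.0   # the quantile form of the reliability constraint
```

For this example the true 0.99-quantile is about -0.674, so the design is feasible at the 1% failure-probability level.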
|
24 |
Reliability-Based Formulations for Simulation-Based Control Co-Design
Sherbaf Behtash, Mohammad 23 August 2022 (has links)
No description available.
|
25 |
On reliability-based design of rock tunnel support
Bjureland, William January 2017 (has links)
Tunneling involves large uncertainties. Since 2009, the design of rock tunnels in European countries must be performed in accordance with the Eurocodes. The main principle of the Eurocodes is that it must be shown, in all design situations, that no relevant limit state is exceeded. This can be achieved with a number of different methods, of which the most common is design by calculation. To account for uncertainties in design, the Eurocodes state that design by calculation should primarily be performed using limit state design methods, i.e. the partial factor method or reliability-based methods. The basic principle of the former is that it shall be assured, with a high enough probability, that a structure's resisting capacity is larger than the load acting on the structure. Even if this might seem straightforward, the practical application of limit state design to rock tunnel support has only been studied to a limited extent. The aim of this licentiate thesis is to review the practical applicability of reliability-based methods and the partial factor method in the design of rock tunnel support. The review and the following discussion are based on findings from the cases studied in the appended papers. The discussion focuses on the challenges of applying fixed partial factors, as suggested by Eurocode, in the design of rock tunnel support, and on some of the practical difficulties the engineer faces when applying reliability-based methods to design rock tunnel support. The main conclusions are that the partial factor method (as defined in Eurocode) is not suitable for the design of rock tunnel support, but that reliability-based methods have the potential to account for the uncertainties present in design, especially when used within the framework of the observational method. However, gathering of data for statistical quantification of input variables, along with clarification of the necessary reliability levels and a definition of "failure", is still needed.
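The contrast between the two limit state approaches discussed above can be sketched in a few lines: a partial factor check compares factored characteristic values, while a reliability-based check estimates the failure probability directly. All numbers below (partial factors, means, COVs, lognormal models) are illustrative assumptions, not values prescribed for rock tunnel support.

```python
import math, random

def partial_factor_check(r_k, e_k, gamma_m=1.5, gamma_e=1.35):
    """Eurocode-style verification: factored resistance must exceed the factored
    load effect. (Factor values here are illustrative, not code-prescribed.)"""
    return r_k / gamma_m >= e_k * gamma_e

def mc_failure_probability(r_mean, r_cov, e_mean, e_cov, n=200_000, seed=0):
    """Reliability-based alternative: P(R < E) by Monte Carlo, with lognormal R, E."""
    def ln_params(mean, cov):                     # mean/COV -> lognormal mu, sigma
        s2 = math.log(1.0 + cov**2)
        return math.log(mean) - 0.5 * s2, math.sqrt(s2)
    mr, sr = ln_params(r_mean, r_cov)
    me, se = ln_params(e_mean, e_cov)
    rng = random.Random(seed)
    fails = sum(rng.lognormvariate(mr, sr) < rng.lognormvariate(me, se) for _ in range(n))
    return fails / n

ok = partial_factor_check(r_k=900.0, e_k=400.0)        # 600 >= 540: passes
pf = mc_failure_probability(900.0, 0.25, 400.0, 0.10)  # large resistance scatter, as in rock
```

The same nominal design that passes the partial factor check still carries a computable failure probability; whether that probability is acceptable is exactly the reliability-level question the thesis raises.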
|
26 |
Risk Quantification and Reliability Based Design Optimization in Reusable Launch Vehicles
King, Jason Maxwell 01 December 2010 (has links)
No description available.
|
27 |
Analysis of steep sided landfill lining systems
Fowmes, Gary John January 2007 (has links)
The EC Landfill Directive (1999), which is enforced in England and Wales through the Landfill (England and Wales) Regulations (2002), has increased the technical challenge associated with the design and construction of landfill containment systems, in particular those on steep side slopes. Increased numbers of lining system components, varied configurations, and complex loading scenarios require advanced analysis tools to facilitate design. This project involved the development of advanced numerical modelling techniques based on the FLAC finite difference modelling code. The analysis toolbox can be used to predict the behaviour of multilayered geosynthetic and soil lining systems, during and after staged construction. The model can include non-linear interface and geosynthetic axial properties and represent complex loading, including downdrag from the waste mass, whilst retaining the flexibility to represent varied geometries and include engineered support structures. Whilst numerical modelling is becoming increasingly commonplace in commercial design, there is little evidence of the validation of numerical models against field or experimental data. Validation of the analysis toolbox described in this document was conducted by back analysis of published data, modelling of landfill failure mechanisms, and comparisons with large-scale laboratory testing. Design of field-scale instrumentation has also been carried out as part of this project. The influence of interface shear strength variability has been assessed through the compilation of a comprehensive database, and the effect of this variability on lining system behaviour assessed through reliability-based analyses. This has shown that probabilities of failure may be higher than proposed limiting values even when traditionally accepted factors of safety are adopted. A key area of interest identified during the project was the requirement for support, potentially through reinforcement, of the geological barrier.
The inclusion of randomly distributed reinforcement fibres in bentonite-enhanced soil has shown the potential for increased strength without adverse effects on hydraulic barrier performance. Additionally, the influence of geomembrane seams on lining system integrity has been investigated, showing that fusion-welded seams can result in stress concentrations and that extruded seams can cause significant stress concentrations.
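The finding that accepted factors of safety can hide high probabilities of failure can be sketched with a one-variable reliability model: a lognormal interface shear strength against a deterministic driving load. The FoS, COV, and distribution choice below are assumed illustrative values, not figures from the thesis database.

```python
import math, random

def pf_given_fos(fos, cov_strength, n=200_000, seed=0):
    """P(failure) for a nominal factor of safety when interface shear strength is
    lognormal with the given COV and the driving load is treated as deterministic."""
    s2 = math.log(1.0 + cov_strength**2)
    mu, sd = math.log(fos) - 0.5 * s2, math.sqrt(s2)   # mean strength = fos * load
    rng = random.Random(seed)
    return sum(rng.lognormvariate(mu, sd) < 1.0 for _ in range(n)) / n

# A nominally 'safe' FoS of 1.5 still carries a non-trivial failure probability
# once interface strength scatter (an assumed COV of 0.2) is included.
pf = pf_given_fos(1.5, 0.20)
```

Here the failure probability is on the order of a few percent, far above typical target levels, which is the kind of mismatch between deterministic and reliability-based verdicts the abstract reports.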
|
28 |
Entwurf von Textilbetonverstärkungen – computerorientierte Methoden mit verallgemeinerten Unschärfemodellen / Design of textile-reinforced concrete strengthening – computer-oriented methods with generalized uncertainty models
Sickert, Jan-Uwe, Graf, Wolfgang, Pannier, Stephan 03 June 2009 (has links) (PDF)
The paper presents three methods for the design and dimensioning of textile-reinforced concrete strengthening layers. For preliminary design, a variant study is applied, e.g., to determine the number of textile layers. For selecting realizations of several continuous design variables under different design objectives and design constraints, fuzzy optimization and the direct solution of the design problem are outlined. Fuzzy optimization yields compromise solutions for the multi-criteria design problem. The direct solution is based on exploratory data analysis of point sets obtained from a fuzzy structural analysis and delivers regions, so-called design subspaces, as a basis for selecting the design.
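The fuzzy structural analysis mentioned above is commonly carried out via alpha-cuts: each fuzzy input is reduced to an interval per membership level, and the model output is bounded on those intervals. The sketch below assumes triangular fuzzy numbers and a monotonically increasing model, both simplifying assumptions for illustration rather than the paper's general procedure.

```python
def alpha_cut_response(model, fuzzy_inputs, alphas=(0.0, 0.5, 1.0)):
    """Output interval of a monotonically increasing model for triangular fuzzy
    inputs, evaluated per alpha-cut (a crude sketch of fuzzy structural analysis)."""
    cuts = {}
    for a in alphas:
        lows = [lo + a * (peak - lo) for (lo, peak, hi) in fuzzy_inputs]
        highs = [hi - a * (hi - peak) for (lo, peak, hi) in fuzzy_inputs]
        # monotone model: output bounds are attained at the interval endpoints
        cuts[a] = (model(lows), model(highs))
    return cuts

# e.g. a capacity that grows with each fuzzy layer contribution (triangular numbers)
cuts = alpha_cut_response(sum, [(2.0, 3.0, 4.0), (1.0, 1.5, 2.5)])
```

At alpha = 1 the intervals collapse to the peak values, and at alpha = 0 they span the full supports; the resulting nested output intervals are the kind of point sets the design subspaces are extracted from.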
|
30 |
Performance Based Design of Deep Foundations in Spatially Varying Soils
Fan, Haijian January 2013 (links)
No description available.
|