1 |
Reliability methods in dynamic system analysis. Munoz, Brad Ernest, 26 April 2013.
The standard techniques used to analyze a system's response under uncertain system parameters or inputs are generally importance-sampling methods. Sampling methods require a large number of simulation runs before the system output statistics can be analyzed. As model fidelity increases, sampling techniques become computationally infeasible, and reliability methods have gained popularity as an analysis approach that requires significantly fewer simulation runs. Reliability analysis is an analytic technique that finds a particular point in the design space that can accurately be related to the probability of system failure. However, its application to dynamic systems has remained limited.
In this thesis the First Order Reliability Method (FORM) is used to determine the failure probability of a dynamic system subject to system and input uncertainties. A pendulum-cart system is used as a case study to demonstrate FORM on a dynamic system. Three failure modes are discussed, corresponding to the maximum pendulum angle, the maximum system velocity, and a combined requirement that neither the maximum pendulum angle nor the maximum system velocity be exceeded. An explicit formulation is generated from the implicit formulation using a response surface methodology, and FORM is performed on the explicit estimate. Although the analysis converges with few simulation evaluations, attempts to verify the FORM results illuminate current limitations of the methodology. This initial study concludes that sampling techniques are currently necessary to verify the FORM results, which restricts the potential applications of the FORM methodology. Suggested future work focuses on result verification without the use of importance sampling, which would give reliability methods widespread applicability.
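The core of FORM is a search for the most probable point (MPP) on the limit-state surface in standard normal space; the reliability index beta is the distance from the origin to that point, and the failure probability is approximated as Phi(-beta). The sketch below is a minimal illustration of the Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for a hypothetical explicit limit state standing in for the response-surface estimate described above; the limit-state function, variable statistics, and tolerances are assumptions for illustration, not values from the thesis.

```python
import numpy as np
from scipy.stats import norm

def form_hlrf(g, grad_g, n_vars, tol=1e-6, max_iter=100):
    """HL-RF iteration: find the most probable point of g(u) = 0 in
    standard normal space and return the reliability index beta."""
    u = np.zeros(n_vars)                      # start at the mean point
    for _ in range(max_iter):
        gu, dgu = g(u), grad_g(u)
        # Standard HL-RF update: project onto the linearized limit state
        u_new = (dgu @ u - gu) * dgu / (dgu @ dgu)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, u

# Hypothetical limit state: failure when the maximum pendulum angle exceeds a limit.
# g < 0 denotes failure; u are standardized versions of two uncertain inputs.
g = lambda u: 3.0 - 0.8 * u[0] - 0.5 * u[1] - 0.1 * u[0] * u[1]
grad_g = lambda u: np.array([-0.8 - 0.1 * u[1], -0.5 - 0.1 * u[0]])

beta, u_star = form_hlrf(g, grad_g, n_vars=2)
print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.2e}")
```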
|
2 |
Reliability-Based Topology Optimization with Analytic Sensitivities. Clark, Patrick Ryan, 03 August 2017.
It is common practice when designing a system to apply safety factors to the critical failure load or event. These safety factors provide a buffer against failure due to random or un-modeled behavior that may push the system beyond its limits. However, safety factors are not directly related to the likelihood of a failure event occurring: if they are poorly chosen, the system may fail unexpectedly or the design may be overly conservative. Reliability-Based Design Optimization (RBDO) is an alternative approach that directly considers the likelihood of failure by incorporating a reliability analysis step such as the First-Order Reliability Method (FORM). The FORM analysis itself requires the solution of an optimization problem, however, so implementing this approach in an RBDO routine creates a double-loop optimization structure. For large problems such as Reliability-Based Topology Optimization (RBTO), numeric sensitivity analysis then becomes computationally intractable. In this thesis, a general approach to the sensitivity analysis of nested functions is developed from the Lagrange Multiplier Theorem and applied to several Reliability-Based Design Optimization problems, including topology optimization. The proposed approach is computationally efficient, requiring only a single solution of the FORM problem per design iteration. / Master of Science
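A standard way to avoid numeric (finite-difference) sensitivities of the inner loop is the envelope-theorem result that follows from the Lagrange multiplier conditions of the FORM problem: once the most probable point has been found for the current design, d(beta)/d(d) = (dg/dd) / ||grad_u g|| evaluated at that point, so no additional FORM solutions are needed. The sketch below illustrates this for a hypothetical limit state g(u, d); the function, design variable, and numbers are assumptions for illustration, not the topology-optimization problems from the thesis.

```python
import numpy as np

# Hypothetical limit state g(u, d): u are standard-normal variables,
# d is a single design variable (e.g., a member thickness).
def g(u, d):
    return 2.0 * d - 1.2 * u[0] - 0.7 * u[1]

def grad_u(u, d):
    return np.array([-1.2, -0.7])

def dg_dd(u, d):
    return 2.0

def solve_form(d, tol=1e-8, max_iter=50):
    """HL-RF search for the most probable point at fixed design d."""
    u = np.zeros(2)
    for _ in range(max_iter):
        gu, dgu = g(u, d), grad_u(u, d)
        u_new = (dgu @ u - gu) * dgu / (dgu @ dgu)
        if np.linalg.norm(u_new - u) < tol:
            return u_new
        u = u_new
    return u

d0 = 1.0
u_star = solve_form(d0)
beta = np.linalg.norm(u_star)

# Analytic sensitivity from the Lagrange multiplier (envelope) result:
# one FORM solution suffices for the design-sensitivity evaluation.
dbeta_dd = dg_dd(u_star, d0) / np.linalg.norm(grad_u(u_star, d0))

# Finite-difference check (this is what the analytic result replaces in RBDO).
h = 1e-5
dbeta_fd = (np.linalg.norm(solve_form(d0 + h)) - beta) / h
print(f"beta = {beta:.4f}, analytic dbeta/dd = {dbeta_dd:.4f}, FD = {dbeta_fd:.4f}")
```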
|
3 |
Metamodel-Based Probabilistic Design for Dynamic Systems with Degrading Components. Seecharan, Turuna Saraswati, January 2012.
The probabilistic design of dynamic systems with degrading components is difficult. Design of dynamic systems typically involves the optimization of a time-invariant performance measure, such as energy, that is estimated from a dynamic response, such as angular speed. The mechanistic models developed to approximate this performance measure are too complicated to be used in simple design calculations and lead to lengthy simulations. When component degradation is assumed, the failure probability over the product lifetime must be estimated in order to determine suitable service times. Again, complex mechanistic models lead to lengthy lifetime simulations when the Monte Carlo method is used to evaluate probability.
To address these problems, an efficient methodology is presented for the probabilistic design of dynamic systems and for estimating the cumulative distribution function of the time to failure of a performance measure when component degradation is assumed. The four main steps are: 1) transforming the dynamic response into a set of static responses at discrete cycle-time steps and using Singular Value Decomposition to efficiently estimate a time-invariant performance measure that is based upon a dynamic response; 2) replacing the mechanistic model with an approximating function known as a “metamodel”; 3) searching for the best design parameters using fast integration methods such as the First Order Reliability Method; and 4) building the cumulative distribution function over the planned lifetime as the sum of the incremental failure probabilities estimated using the set-theory method.
The first step of the methodology uses design of experiments or sampling techniques to select a sample of training sets of the design variables. These training sets are then input to the computer-based simulation of the mechanistic model to produce a matrix of corresponding responses at discrete cycle-times. Although metamodels can be built at each time-specific column of this matrix, this method is slow, especially if the number of time steps is large. An efficient alternative uses Singular Value Decomposition to split the response matrix into two matrices containing only design-variable-specific and time-specific information. The second step of the methodology fits metamodels only to the significant columns of the matrix containing the design-variable-specific information. Using the time-specific matrix, a metamodel is quickly developed at any cycle-time step or for any time-invariant performance measure such as energy consumed over the cycle-lifetime. In the third step, design variables are treated as random variables and the First Order Reliability Method is used to search for the best design parameters. Finally, the components most likely to degrade are modelled using either a degradation path or a marginal distribution model and, using the First Order Reliability Method or a Monte Carlo Simulation to estimate probability, the cumulative failure probability is plotted. The speed and accuracy of the methodology are investigated using three metamodels: the Regression Model, Kriging, and the Radial Basis Function.
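A minimal sketch of the SVD split described above, under invented data: a response matrix (training designs by cycle-time steps) is factored so that the left singular vectors carry design-variable-specific coefficients and the right singular vectors carry time-specific shapes; cheap metamodels are then fit only to the few significant coefficient columns, and the response at any cycle-time (or a time-invariant measure such as energy) is reconstructed from them. The synthetic response matrix and the polynomial regression used as a stand-in metamodel are assumptions, not the implementation from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed training data: n_train designs (2 design variables), n_t time steps.
n_train, n_t = 50, 200
X = rng.uniform(0.5, 1.5, size=(n_train, 2))              # design variables
t = np.linspace(0.0, 1.0, n_t)
# Stand-in for the mechanistic simulation: response matrix Y (n_train x n_t).
Y = np.outer(X[:, 0], np.sin(2 * np.pi * t)) + np.outer(X[:, 1] ** 2, t)

# Split Y into design-variable-specific (U*S) and time-specific (Vt) parts.
U, S, Vt = np.linalg.svd(Y, full_matrices=False)
k = int(np.searchsorted(np.cumsum(S**2) / np.sum(S**2), 0.999)) + 1  # modes kept
coeffs = U[:, :k] * S[:k]                                  # design-specific coefficients

# Simple regression metamodel (stand-in for regression/Kriging/RBF): one fit
# per significant coefficient column, using a linear-plus-quadratic basis.
basis = lambda x: np.column_stack([np.ones(len(x)), x, x**2])
betas = [np.linalg.lstsq(basis(X), coeffs[:, i], rcond=None)[0] for i in range(k)]

def predict_response(x_new):
    """Reconstruct the full time history for a new design from the k metamodels."""
    b = basis(x_new.reshape(1, -1))
    c = np.array([float(b @ beta) for beta in betas])
    return c @ Vt[:k, :]

y_hat = predict_response(np.array([1.0, 1.2]))
energy_hat = np.sum(y_hat**2) * (t[1] - t[0])   # example time-invariant measure
print(k, energy_hat)
```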
This thesis shows that the metamodel offers a significantly faster, yet still accurate, alternative to using mechanistic models for both probabilistic design optimization and for estimating the cumulative distribution function. For design using the First-Order Reliability Method to estimate probability, the Regression Model is the fastest and the Radial Basis Function is the slowest. Kriging is shown to be accurate and faster than the Radial Basis Function, but it is still slower than the Regression Model. When estimating the cumulative distribution function, metamodels are more than 100 times faster than the mechanistic model, with an error of less than ten percent relative to the mechanistic model. Kriging and the Radial Basis Function are more accurate than the Regression Model, and computation is faster using Monte Carlo Simulation to estimate probability than using the First-Order Reliability Method.
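For the lifetime-reliability step, the sketch below illustrates one way a cumulative distribution function of the time to failure can be assembled. It uses plain Monte Carlo counting of first-passage failures rather than the set-theory incremental-probability method described above, and the degradation model, distributions, and planned lifetime are assumptions for illustration only (a closed-form limit state stands in for the metamodel prediction).

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed degradation model: component strength R degrades linearly with cycle
# time, R(t) = R0 * (1 - a * t), and must exceed a random demand S.
n_mc, n_t = 50_000, 100
t = np.linspace(0.0, 10.0, n_t)                                # planned lifetime (years)

R0 = rng.lognormal(mean=np.log(5.0), sigma=0.10, size=n_mc)    # initial strength
a = rng.normal(0.03, 0.005, size=n_mc)                         # degradation rate
S = rng.lognormal(mean=np.log(3.0), sigma=0.15, size=n_mc)     # demand

# Limit state at each time step: g(t) = R(t) - S; failure once g <= 0.
G = R0[:, None] * (1.0 - a[:, None] * t[None, :]) - S[:, None]
failed_by_t = np.minimum.accumulate(G, axis=1) <= 0.0

# Cumulative distribution function of the time to failure over the lifetime.
F = failed_by_t.mean(axis=0)
print(f"P(failure within {t[-1]:.0f} years) = {F[-1]:.3f}")
```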
|
4 |
Probabilistic Post-Liquefaction Residual Shear Strength Analyses of Cohesionless Soil Deposits: Application to the Kocaeli (1999) and Duzce (1999) Earthquakes. Lumbantoruan, Partahi Mamora Halomoan, 31 October 2005.
Liquefaction of granular soil deposits can have extremely detrimental effects on the stability of embankment dams, natural soil slopes, and mine tailings. The residual, or liquefied, shear strength of the liquefiable soils is a very important parameter when evaluating the stability and deformation of level and sloping ground. Current procedures for estimating the liquefied shear strength are based on extensive laboratory testing programs or on the back-analysis of failures where liquefaction was involved and in-situ testing data were available. All available procedures utilize deterministic methods for estimation and selection of the liquefied shear strength. Over the past decade, there has been an increasing trend towards analyzing geotechnical problems using probability and reliability. This study presents procedures for assessing the liquefied shear strength of cohesionless soil deposits within a risk-based framework. Probabilistic slope stability procedures using reliability methods and Monte Carlo Simulations are developed to incorporate uncertainties associated with geometrical and material parameters. The probabilistic methods are applied to flow liquefaction case histories from the 1999 Kocaeli and Duzce, Turkey earthquakes, where extensive liquefaction was observed. The methods presented in this thesis should aid in making better decisions about the design and rehabilitation of structures constructed of or atop liquefiable soil deposits. / Master of Science
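As a rough illustration of the risk-based framework, the sketch below estimates a probability of flow failure using a simple factor of safety in which the post-liquefaction residual shear strength and the static driving shear stress are treated as random variables; the lognormal parameters and the simplified stability model are invented for illustration and are not values or procedures from the case histories.

```python
import numpy as np

rng = np.random.default_rng(7)
n_mc = 1_000_000

# Assumed random variables (illustrative lognormal parameters, not site data):
# post-liquefaction residual shear strength Sr and static driving shear stress tau.
Sr = rng.lognormal(mean=np.log(12.0), sigma=0.40, size=n_mc)    # kPa
tau = rng.lognormal(mean=np.log(9.0), sigma=0.25, size=n_mc)    # kPa

# Factor of safety against flow sliding; flow failure is predicted when FS < 1.
FS = Sr / tau
p_failure = np.mean(FS < 1.0)
print(f"Estimated probability of flow failure, P(FS < 1) ~ {p_failure:.3f}")
```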
|
5 |
Experimental Testing and Reliability Analysis of Repaired SMA and Steel Reinforced Shear Walls. Zaidi, Mohammed, January 2016.
Superelastic Shape Memory Alloys (SMAs) are being explored as an alternative to traditional deformed steel reinforcement for seismic applications. Their main advantage is the ability of the SMA to recover large nonlinear strains, which promotes self-centering. The primary objective of this research is to present the performance, before and after repair, of two slender reinforced concrete shear walls: one reinforced internally with SMAs in the boundary zones within the plastic hinge region, and the other a control wall reinforced with conventional steel only. The repair procedure included removal of damaged concrete within the plastic hinge region and replacement of fractured and buckled reinforcement, followed by shortening of the SMA reinforcement in the boundary zones of the SMA wall. The removed concrete was replaced with self-consolidating concrete, while the concrete above the plastic hinge region remained intact.
The SMA reinforced concrete shear wall, both before and after repair, exhibited a stable hysteretic response with significant strength, displacement, and energy dissipation capacities. The wall also exhibited pinching in the hysteretic response, a consequence of the small residual displacements produced by the restoring capacity of the SMA reinforcement. The results demonstrate that SMA reinforced components are self-centering, which permits repair of damaged areas. Furthermore, the SMA reinforcement is re-usable given its capacity to reset to its original state. The length of the SMA bars in the original and repaired walls, together with the presence of starter bars in the original wall, were significant factors in the location of failure.
The conventional steel wall was unstable prior to repair because of the large residual displacements experienced during the original test. After repair, the wall exhibited ratcheting in its hysteretic response but retained significant strength. The conventional wall, before and after repair, dissipated more energy than the SMA wall; this was the result of wider hysteretic loops with reduced pinching, but it came at the cost of large residual displacements. The starter bars in the conventional wall before repair controlled the location of failure, while the presence of couplers in the plastic hinge region was the main factor determining the failure location in the repaired conventional wall.
|
6 |
A Probabilistic Decision Support System for a Performance-Based Design of Infrastructures. Shahtaheri, Yasaman, 20 August 2018.
Infrastructures are the most fundamental facilities and systems serving society. Because infrastructures exist in economic, social, and environmental contexts, all lifecycle phases of such facilities should maximize utility for the designers, occupants, and society. With respect to the nature of the decision problem, two main types of uncertainty may exist: 1) the aleatory uncertainty associated with the nature of the built environment (i.e., the economic, social, and environmental impacts of infrastructures must be described probabilistically); and 2) the epistemic uncertainty associated with the lack of knowledge of decision maker utilities. Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment, they do not provide a systematic framework for including aleatory and epistemic uncertainties and decision maker utilities in the decision analysis process. To address this knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology (i.e., utility functions/indifference curves) for assessing decision maker utility functions with respect to a range of alternative design configurations. Module two utilizes the First Order Reliability Method (FORM) in a systems reliability approach for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and the decision-maker-defined utility functions (indifference curves), and provides a meaningful feedback loop for improving the reliability of the alternative design configurations. Module three provides a systematic framework for incorporating both aleatory and epistemic uncertainties in the decision analysis methodology (i.e., uncertain utility functions and group decision making). The multi-criteria, probabilistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations and four decision maker utility functions under aleatory and epistemic uncertainties are utilized. The decision analysis methodology revealed that high-performing design configurations, with an initial cost of up to $3.2M (in a cost range between $1.7M and $3.2M), a building damage and business interruption cost as low as $303K (in a range between $303K and $6.2M), a casualty cost as low as $43K (in a range between $43K and $1.2M), and a CO2 emission cost as low as $146K (in a range between $133K and $150K), can be identified as having a higher probability (up to 80%) of meeting the decision makers' preferences. The modular, holistic decision analysis framework allows decision makers to make more informed performance-based design decisions, and allows designers to better incorporate the preferences of the decision makers, during the early design process. / Ph.D. / Infrastructures, including buildings, roads, and bridges, are the most fundamental facilities and systems serving society. Because infrastructures exist in economic, social, and environmental contexts, the design, construction, operations, and maintenance phases of such facilities should maximize value and usability for the designers, occupants, and society.
Identifying infrastructure configurations that maximize value and usability is challenged by two sources of uncertainty: 1) the nature of the built environment is variable (i.e., whether or not a natural hazard will occur during the infrastructure lifetime, or how costs might change over time); and 2) there is a lack of knowledge of decision maker preferences and values (e.g., design cost versus social impact tradeoffs). Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment (e.g., natural hazard events), they do not provide a systematic framework for including the uncertainties associated with the decision analysis process (e.g., lack of knowledge about decision maker preferences) and decision maker requirements in the decision analysis process. To address this knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology for assessing decision maker values with respect to a range of alternative design configurations. Module two utilizes an algorithm for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and decision maker requirements, and provides a meaningful feedback loop for understanding the decision analysis results (i.e., improving the value and usability of the alternative design configurations). Module three provides a systematic framework for incorporating both the random uncertainty associated with the built environment and the knowledge uncertainty associated with decision maker preferences, and tests the reliability of the decision analysis results under random and knowledge uncertainties (i.e., uncertain decision maker preferences and group decision making). The holistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations, four decision makers, and random and knowledge sources of uncertainty are considered in the decision analysis methodology. Results indicate that the modular, holistic decision analysis framework allows decision makers to make more informed design decisions, and allows designers to better incorporate the preferences of the decision makers, during the early design process.
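As a rough sketch of what a "probability of meeting the decision makers' preferences" can mean in such a framework, the example below uses plain Monte Carlo simulation (rather than the FORM-based systems reliability analysis of module two) to estimate the probability that one design configuration keeps all three probabilistic decision criteria below assumed preference thresholds; the distributions and thresholds are invented for illustration and are not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(11)
n_mc = 500_000

# Assumed lifetime-cost distributions for one design configuration (in $K).
damage_interruption = rng.lognormal(np.log(900.0), 0.8, n_mc)
casualty = rng.lognormal(np.log(150.0), 0.9, n_mc)
co2 = rng.normal(146.0, 4.0, n_mc)

# Assumed decision-maker preference thresholds (indifference levels, in $K).
thresholds = {"damage_interruption": 2000.0, "casualty": 400.0, "co2": 150.0}

# The configuration "meets preferences" only if every criterion is acceptable,
# i.e., all three events must hold simultaneously.
meets_all = (
    (damage_interruption <= thresholds["damage_interruption"])
    & (casualty <= thresholds["casualty"])
    & (co2 <= thresholds["co2"])
)
print(f"P(meets all preference thresholds) ~ {meets_all.mean():.2f}")
```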
|
7 |
Limit and shakedown analysis of plates and shells including uncertainties. Trần, Thanh Ngọc, 15 April 2008.
The reliability analysis of plates and shells with respect to plastic collapse or to inadaptation is formulated on the basis of limit and shakedown theorems. The loading, the material strength, and the shell thickness are considered as random variables. Based on a direct definition of the limit state function, the nonlinear problems may be efficiently solved using the First and Second Order Reliability Methods (FORM/SORM). The sensitivity analyses in FORM/SORM can be based on the sensitivities of the deterministic shakedown problem. The reliability of structural systems is also handled by the application of a special barrier technique, which permits finding all the design points corresponding to all the failure modes. The direct plasticity approach considerably reduces the required knowledge of uncertain input data, the computing cost, and the numerical error.
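Where FORM linearizes the limit state at the design point, SORM adds a curvature correction. A minimal sketch of one common second-order correction (Breitung's asymptotic formula) is given below for a hypothetical design point with assumed principal curvatures; the numbers are illustrative and unrelated to the shell problems analyzed in the thesis.

```python
import numpy as np
from scipy.stats import norm

def form_pf(beta):
    """First-order estimate of the failure probability."""
    return norm.cdf(-beta)

def sorm_pf_breitung(beta, kappas):
    """Breitung's SORM correction: Pf ~ Phi(-beta) * prod(1 + beta*k_i)^(-1/2),
    where k_i are the principal curvatures of the limit state at the design point."""
    kappas = np.asarray(kappas, dtype=float)
    return norm.cdf(-beta) * np.prod(1.0 / np.sqrt(1.0 + beta * kappas))

# Assumed values: reliability index and principal curvatures at the design point.
beta = 2.5
kappas = [0.15, -0.05, 0.10]   # positive curvature: surface bends away from the origin

print(f"FORM  Pf ~ {form_pf(beta):.3e}")
print(f"SORM  Pf ~ {sorm_pf_breitung(beta, kappas):.3e}")
```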
|