About
The Global ETD Search service is a free service, provided by the Networked Digital Library of Theses and Dissertations (NDLTD), that helps researchers find electronic theses and dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Probabilistic Analysis of a Thin-walled Beam with a Crack

Kunaporn, Chalitphan 18 February 2011 (has links)
It is reasonable to assume that an aircraft might experience in-flight discrete-source damage caused by various incidents, so it is necessary to evaluate the impact of such damage on the performance of the aircraft. This study focuses on evaluating the effect of simple discrete damage in an aircraft wing on its static and dynamic response. The damaged wing is modeled by a thin-walled beam with a longitudinal crack, whose response can be obtained analytically. As uncertainties are present in the location and size of the crack as well as in the applied loads, their effects are incorporated into a framework consisting of structural response, crack propagation, and aeroelasticity. The first objective of this study is to examine the effect of the crack on the wing flexibility, which influences its deformation and aeroelastic divergence characteristics. To this end, the thin-walled beam is modeled by Benscoter thin-walled beam theory combined with Gunnlaugsson and Pedersen compatibility conditions to accurately account for the discontinuity at the interface of the cracked and uncracked beam segments. Instead of conducting a detailed finite element analysis, the solution is obtained in an exact sense for general distributed loads representing the wind pressure effects. This analytical approach is shown to provide very accurate values for the global beam response compared with a detailed finite element shell analysis. The analytical solution is then used to study the beam response probabilistically. The crack location and size are assumed to be uncertain and are therefore characterized as random variables. For a specified limit state, the probability of failure can be conveniently calculated by first-order second-moment analysis using the safety index approach. The same analytical solution is also used to study the aeroelastic divergence characteristics of a wing whose inner structure is represented by a thin-walled beam with a crack of uncertain size and position along the beam. The second objective of this study is to examine the time growth of a crack under the dynamic gust-type loading to which a wing is likely to be exposed during flight. Damage propagating during operation further deteriorates the safety of the aircraft, and it is necessary to study its time growth so that its impact on performance can be evaluated before the damage reaches its unstable state. The proposed framework for the crack growth analysis is based on classical fracture mechanics, and the remaining flight time is obtained by Monte Carlo simulation in which various uncertainties are taken into account. To obtain the equivalent cyclic loading required for crack growth analysis, random vibration analysis of the thin-walled beam is conducted for a stochastic wind load defined by a gust load spectral density function. Both the probability that the crack reaches the critical size within the flight duration and the remaining flight time before the crack reaches its limiting value are obtained. This study, with its simple representation of a wing and damage, is anticipated to provide initial guidance for future studies examining the impact of discrete-source damage on in-flight performance, with the ultimate goal of minimizing the adverse effects and enhancing the safety of aircraft experiencing damage. / Ph. D.
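To make the crack-growth step concrete, here is a minimal Monte Carlo sketch in the spirit of this abstract. It assumes Paris-law growth under an equivalent constant-amplitude stress range; the constants C and m, the geometry factor Y, the distributions, and the cycle rate are all hypothetical stand-ins, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical parameters, for illustration only (not values from the thesis):
C, m, Y = 1.0e-11, 3.0, 1.12      # Paris law da/dN = C*dK^m, dK in MPa*sqrt(m)
a_crit = 0.05                     # limiting (critical) crack size [m]
cycles_per_hour = 50.0            # equivalent gust-load cycles per flight hour

a0 = rng.lognormal(np.log(0.002), 0.3, n)   # uncertain initial crack size [m]
ds = rng.normal(80.0, 10.0, n)              # uncertain equivalent stress range [MPa]

# Closed-form integration of the Paris law (valid for m != 2):
#   N = (a_crit**(1-m/2) - a0**(1-m/2)) / ((1-m/2) * C * (Y*ds*sqrt(pi))**m)
k = C * (Y * ds * np.sqrt(np.pi)) ** m
N = (a_crit ** (1.0 - m / 2.0) - a0 ** (1.0 - m / 2.0)) / ((1.0 - m / 2.0) * k)
N = np.maximum(N, 0.0)            # guard: sample crack already at critical size

hours = N / cycles_per_hour       # remaining flight time per sample
T = 8_000.0                       # flight duration of interest [h], illustrative
print(f"median remaining life ≈ {np.median(hours):.0f} h")
print(f"P(crack reaches critical size within {T:.0f} h) ≈ {np.mean(hours < T):.4f}")
```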
2

Reliability-Based Topology Optimization with Analytic Sensitivities

Clark, Patrick Ryan 03 August 2017 (has links)
It is common practice when designing a system to apply safety factors to the critical failure load or event. These safety factors provide a buffer against failure due to random or un-modeled behavior that may drive the system beyond its limits. However, these safety factors are not directly related to the likelihood of a failure event occurring. If they are poorly chosen, the system may fail unexpectedly, or the design may be too conservative. Reliability-Based Design Optimization (RBDO) is an alternative approach that directly considers the likelihood of failure by incorporating a reliability analysis step such as the First-Order Reliability Method (FORM). The FORM analysis itself requires the solution of an optimization problem, however, so embedding it in an RBDO routine creates a double-loop optimization structure. For large problems such as Reliability-Based Topology Optimization (RBTO), numeric sensitivity analysis becomes computationally intractable. In this thesis, a general approach to the sensitivity analysis of nested functions is developed from the Lagrange Multiplier Theorem and then applied to several Reliability-Based Design Optimization problems, including topology optimization. The proposed approach is computationally efficient, requiring only a single solution of the FORM problem at each iteration. / Master of Science
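The inner loop of that double-loop structure can be sketched with the classical Hasofer-Lind/Rackwitz-Fiessler (HL-RF) iteration for the FORM problem. The limit-state function below is a hypothetical example in standard normal space, and a finite-difference gradient stands in for the analytic sensitivities the thesis derives.

```python
import numpy as np
from scipy.stats import norm

def g(u):
    # Hypothetical limit state in standard normal space (failure when g <= 0).
    return 5.0 - u[0] ** 2 - 2.0 * u[1]

def grad(f, u, h=1e-6):
    # Central finite differences; the thesis uses analytic sensitivities instead.
    return np.array([(f(u + h * e) - f(u - h * e)) / (2.0 * h)
                     for e in np.eye(len(u))])

def form_hlrf(g, u0, tol=1e-8, max_iter=100):
    """Hasofer-Lind/Rackwitz-Fiessler iteration for the Most Probable Point."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        dg = grad(g, u)
        u_new = (dg @ u - g(u)) / (dg @ dg) * dg   # project onto linearized g = 0
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)                       # reliability index
    return beta, norm.cdf(-beta), u

beta, pf, mpp = form_hlrf(g, u0=[0.1, 0.1])
print(f"beta ≈ {beta:.3f}, Pf ≈ {pf:.3e}, MPP ≈ {mpp}")
```

In a double-loop RBDO routine, this entire iteration runs inside every step of the outer design optimization, which is why cheap, analytic sensitivities matter.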
3

Metamodel-Based Probabilistic Design for Dynamic Systems with Degrading Components

Seecharan, Turuna Saraswati January 2012 (has links)
The probabilistic design of dynamic systems with degrading components is difficult. Design of dynamic systems typically involves the optimization of a time-invariant performance measure, such as energy, that is estimated using a dynamic response, such as angular speed. The mechanistic models developed to approximate this performance measure are too complicated to be used in simple design calculations and lead to lengthy simulations. When degradation of the components is assumed, estimating the failure probability over the product lifetime is required in order to determine suitable service times. Again, complex mechanistic models lead to lengthy lifetime simulations when the Monte Carlo method is used to evaluate probability. To address these problems, an efficient methodology is presented for the probabilistic design of dynamic systems and for estimating the cumulative distribution function of the time to failure of a performance measure when degradation of the components is assumed. The four main steps are: 1) transforming the dynamic response into a set of static responses at discrete cycle-time steps and using Singular Value Decomposition to efficiently estimate a time-invariant performance measure that is based upon a dynamic response; 2) replacing the mechanistic model with an approximating function known as a “metamodel”; 3) searching for the best design parameters using fast integration methods such as the First-Order Reliability Method; and 4) building the cumulative distribution function by summing, over the planned lifetime, the incremental failure probabilities estimated using the set-theory method. The first step of the methodology uses design of experiments or sampling techniques to select a sample of training sets of the design variables. These training sets are then input to the computer-based simulation of the mechanistic model to produce a matrix of corresponding responses at discrete cycle-times. Although metamodels can be built at each time-specific column of this matrix, this approach is slow, especially if the number of time steps is large. An efficient alternative uses Singular Value Decomposition to split the response matrix into two matrices containing only design-variable-specific and time-specific information. The second step of the methodology fits metamodels only for the significant columns of the matrix containing the design-variable-specific information. Using the time-specific matrix, a metamodel is quickly developed at any cycle-time step or for any time-invariant performance measure such as the energy consumed over the cycle lifetime. In the third step, design variables are treated as random variables and the First-Order Reliability Method is used to search for the best design parameters. Finally, the components most likely to degrade are modelled using either a degradation path or a marginal distribution model and, using the First-Order Reliability Method or Monte Carlo simulation to estimate probability, the cumulative failure probability is plotted. The speed and accuracy of the methodology are investigated using three metamodels: the Regression model, Kriging, and the Radial Basis Function. This thesis shows that the metamodel offers a significantly faster and comparably accurate alternative to the mechanistic model for both probabilistic design optimization and estimation of the cumulative distribution function.
For design using the First-Order Reliability Method to estimate probability, the Regression model is the fastest and the Radial Basis Function is the slowest. Kriging is shown to be accurate and faster than the Radial Basis Function, though still slower than the Regression model. When estimating the cumulative distribution function, the metamodels are more than 100 times faster than the mechanistic model, with errors of less than ten percent relative to the mechanistic model. Kriging and the Radial Basis Function are more accurate than the Regression model, and computation is faster when probability is estimated by Monte Carlo simulation than by the First-Order Reliability Method.
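The SVD step in the first two stages can be illustrated with a short numpy sketch. The response matrix, rank threshold, and least-squares fit below are toy stand-ins; the thesis fits Regression, Kriging, and Radial Basis Function metamodels to the design-variable-specific columns.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-ins: 50 training designs (3 design variables) and a "dynamic
# response" sampled at 200 cycle-time steps (a hypothetical model, not the
# thesis's simulation).
X = rng.uniform(0.5, 1.5, (50, 3))
t = np.linspace(0.0, 1.0, 200)
Y = (X[:, :1] * np.sin(2.0 * np.pi * t) + X[:, 1:2] * t
     + 0.01 * rng.standard_normal((50, 200)))

# Split the response matrix: U*s carries design-variable-specific information,
# Vt carries time-specific information.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
r = int(np.sum(s > 0.01 * s[0]))          # keep only significant singular values
A = U[:, :r] * s[:r]                      # design-specific coefficients (50 x r)

# Fit one cheap metamodel per significant column of A; ordinary least squares
# stands in here for the Regression/Kriging/RBF metamodels of the thesis.
Q = np.column_stack([np.ones(len(X)), X])
coef = np.linalg.lstsq(Q, A, rcond=None)[0]

# Predict the full response, and any derived measure, at an untried design.
x_new = np.array([1.2, 0.8, 1.0])
a_new = np.concatenate([[1.0], x_new]) @ coef       # predicted coefficients
y_new = a_new @ Vt[:r]                              # response over the cycle
energy = float(np.sum(y_new ** 2) * (t[1] - t[0]))  # example invariant measure
print(f"kept rank r = {r}, predicted 'energy' ≈ {energy:.3f}")
```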
4

Probabilistic Post-Liquefaction Residual Shear Strength Analyses of Cohesionless Soil Deposits: Application to the Kocaeli (1999) and Duzce (1999) Earthquakes

Lumbantoruan, Partahi Mamora Halomoan 31 October 2005 (has links)
Liquefaction of granular soil deposits can have extremely detrimental effects on the stability of embankment dams, natural soil slopes, and mine tailings. The residual, or liquefied, shear strength of liquefiable soils is a very important parameter when evaluating the stability and deformation of level and sloping ground. Current procedures for estimating the liquefied shear strength are based on extensive laboratory testing programs or on the back-analysis of failures in which liquefaction was involved and in-situ testing data were available. All available procedures use deterministic methods for estimating and selecting the liquefied shear strength. Over the past decade, there has been an increasing trend toward analyzing geotechnical problems using probability and reliability. This study presents procedures for assessing the liquefied shear strength of cohesionless soil deposits within a risk-based framework. Probabilistic slope stability procedures using reliability methods and Monte Carlo simulation are developed to incorporate uncertainties associated with geometric and material parameters. The probabilistic methods are applied to flow liquefaction case histories from the 1999 Kocaeli and Duzce, Turkey, earthquakes, where extensive liquefaction was observed. The methods presented in this study should aid in making better decisions about the design and rehabilitation of structures constructed of, or atop, liquefiable soil deposits. / Master of Science
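The risk-based idea can be sketched with a Monte Carlo analysis of an infinite-slope model driven by the liquefied shear strength. The limit-state model and all distributions below are illustrative assumptions, not values calibrated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Illustrative infinite-slope model using the liquefied (residual) strength.
# All parameter values and distributions are hypothetical, for demonstration.
s_u   = rng.lognormal(np.log(40.0), 0.40, n)    # liquefied shear strength [kPa]
gamma = rng.normal(18.0, 0.9, n)                # total unit weight [kN/m^3]
H     = rng.normal(6.0, 0.5, n)                 # depth of sliding surface [m]
beta  = np.radians(12.0)                        # slope inclination

tau = gamma * H * np.sin(beta) * np.cos(beta)   # static driving shear stress [kPa]
FS = s_u / tau                                  # post-liquefaction factor of safety
pf = np.mean(FS < 1.0)                          # probability of flow failure
se = np.sqrt(pf * (1.0 - pf) / n)               # Monte Carlo standard error
print(f"P(flow failure) ≈ {pf:.4f} ± {se:.4f}")
```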
5

Limit and shakedown analysis of plates and shells including uncertainties

Trần, Thanh Ngọc 15 April 2008 (has links)
The reliability analysis of plates and shells with respect to plastic collapse or to inadaptation is formulated on the basis of limit and shakedown theorems. The loading, the material strength, and the shell thickness are treated as random variables. Based on a direct definition of the limit state function, the nonlinear problems may be solved efficiently using the First- and Second-Order Reliability Methods (FORM/SORM). The sensitivity analyses in FORM/SORM can be based on the sensitivities of the deterministic shakedown problem. The reliability of structural systems is also handled by applying a special barrier technique that permits finding all the design points corresponding to all the failure modes. The direct plasticity approach considerably reduces the required knowledge of uncertain input data, the computational cost, and the numerical error.
6

Experimental Testing and Reliability Analysis of Repaired SMA and Steel Reinforced Shear Walls

Zaidi, Mohammed January 2016 (has links)
Superelastic Shape Memory Alloys (SMAs) are being explored as alternative reinforcing materials to traditional deformed steel reinforcement for seismic applications. Their main advantage is the ability of the SMA to recover large nonlinear strains, which promotes self-centering. The primary objective of this research is to present the performance, before and after repair, of two slender reinforced concrete shear walls: one reinforced internally with SMAs in the boundary zones of the plastic hinge region, and a control wall reinforced with conventional steel only. The repair procedure included removal of damaged concrete within the plastic hinge region and replacement of fractured and buckled reinforcement, followed by shortening of the SMA reinforcement in the boundary zones of the SMA wall. The removed concrete was replaced with self-consolidating concrete, while the concrete above the plastic hinge region remained intact. The SMA reinforced concrete shear wall (before and after repair) exhibited a stable hysteretic response with significant strength, displacement, and energy dissipation capacities. In addition, the walls exhibited pinching in the hysteretic response as a result of the reduced residual displacements provided by the restoring capacity of the SMA reinforcement. The results demonstrate that SMA reinforced components are self-centering, permitting repair of damaged areas. Furthermore, the SMA reinforcement is reusable given its capacity to reset to its original state. The length of the SMA bars in the original and repaired walls, in addition to the presence of starter bars in the original wall, were significant factors in the location of failure of the walls. The conventional steel wall prior to repair was unstable due to the large residual displacements experienced during the original test. After repair, the wall exhibited ratcheting in its hysteretic response but retained significant strength. The conventional wall, before and after repair, dissipated more energy than the SMA wall; this was the result of wider hysteretic loops with reduced pinching, but at the cost of large residual displacements. The starter bars in the conventional wall before repair controlled the location of failure, while the presence of couplers in the plastic hinge region was the main factor determining the failure location in the repaired conventional wall.
7

Reliability Assessment and Probabilistic Optimization in Structural Design

Mansour, Rami January 2016 (has links)
Research in the field of reliability-based design is mainly focused on two sub-areas: the computation of the probability of failure and its integration in the reliability-based design optimization (RBDO) loop. Four papers are presented in this work, representing contributions to both sub-areas. In the first paper, a new Second-Order Reliability Method (SORM) is presented. As opposed to the most commonly used SORMs, the presented approach is not limited to a hyper-parabolic approximation of the performance function at the Most Probable Point (MPP) of failure. Instead, a full quadratic fit is used, leading to a better approximation of the real performance function and therefore more accurate values of the probability of failure. The second paper focuses on integrating the expression for the probability of failure for a general quadratic function, presented in the first paper, into RBDO. One important feature of the proposed approach is that it does not involve locating the MPP. In the third paper, the expressions for the probability of failure based on general quadratic limit-state functions presented in the first paper are applied to the special case of a hyper-parabola. The expression is reformulated and simplified so that the probability of failure is a function of only three statistical measures: the Cornell reliability index, and the skewness and kurtosis of the hyper-parabola. These statistical measures are themselves functions of the first-order reliability index and the curvatures at the MPP. In the last paper, an approximate and efficient reliability method is proposed, with a focus on computational efficiency as well as intuitiveness for practicing engineers, especially for probabilistic fatigue problems where volume methods are used. In the proposed method, the number of function evaluations needed to compute the probability of failure of the design under different types of uncertainties is known a priori to be 3n + 2, where n is the number of stochastic design variables.
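For reference, the classical hyper-parabolic SORM estimate that the first paper generalizes is Breitung's asymptotic formula, Pf ≈ Φ(−β) Π_i (1 + βκ_i)^(−1/2). A minimal sketch, assuming the reliability index β and MPP curvatures κ_i are already available from a FORM solution (the values below are hypothetical):

```python
import numpy as np
from scipy.stats import norm

def sorm_breitung(beta, kappas):
    """Classical hyper-parabolic SORM estimate (Breitung's asymptotic formula):
    Pf ≈ Phi(-beta) * prod_i (1 + beta*kappa_i)^(-1/2), for beta*kappa_i > -1.
    The thesis's first paper generalizes beyond this hyper-parabolic fit."""
    kappas = np.asarray(kappas, dtype=float)
    if np.any(beta * kappas <= -1.0):
        raise ValueError("Breitung's formula requires beta*kappa_i > -1")
    return norm.cdf(-beta) / np.sqrt(np.prod(1.0 + beta * kappas))

# Hypothetical FORM results: reliability index and principal MPP curvatures.
beta, kappas = 3.0, [0.15, -0.05, 0.10]
print(f"FORM  Pf ≈ {norm.cdf(-beta):.3e}")
print(f"SORM  Pf ≈ {sorm_breitung(beta, kappas):.3e}")
```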
8

A Probabilistic Decision Support System for a Performance-Based Design of Infrastructures

Shahtaheri, Yasaman 20 August 2018 (has links)
Infrastructures are the most fundamental facilities and systems serving society. Because infrastructures exist in economic, social, and environmental contexts, all lifecycle phases of such fundamental facilities should maximize utility for the designers, occupants, and society. With respect to the nature of the decision problem, two main types of uncertainty may exist: 1) the aleatory uncertainty associated with the nature of the built environment (i.e., the economic, social, and environmental impacts of infrastructures must be described probabilistically); and 2) the epistemic uncertainty associated with the lack of knowledge of decision maker utilities. Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment, they do not provide a systematic framework for including aleatory and epistemic uncertainties, together with decision maker utilities, in the decision analysis process. To address this knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology (i.e., utility functions/indifference curves) for assessing decision maker utility functions with respect to a range of alternative design configurations. Module two applies the First-Order Reliability Method (FORM) in a systems reliability approach for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and the decision-maker-defined utility functions (indifference curves), and provides a meaningful feedback loop for improving the reliability of the alternative design configurations. Module three provides a systematic framework for incorporating both aleatory and epistemic uncertainties in the decision analysis methodology (i.e., uncertain utility functions and group decision making). The multi-criteria, probabilistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations and four decision maker utility functions under aleatory and epistemic uncertainties are utilized. The results revealed that high-performing design configurations with an initial cost of up to $3.2M (in a cost range between $1.7M and $3.2M), a building damage and business interruption cost as low as $303K (in a cost range between $303K and $6.2M), a casualty cost as low as $43K (in a cost range between $43K and $1.2M), and a CO2 emission cost as low as $146K (in a cost range between $133K and $150K) can be identified by having a higher probability (i.e., up to 80%) of meeting the decision makers' preferences. The modular, holistic decision analysis framework allows decision makers to make more informed performance-based design decisions, and allows designers to better incorporate the preferences of the decision makers, during the early design process. / PHD / Infrastructures, including buildings, roads, and bridges, are the most fundamental facilities and systems serving society. Because infrastructures exist in economic, social, and environmental contexts, the design, construction, operations, and maintenance phases of such fundamental facilities should maximize value and usability for designers, occupants, and society.
Identifying infrastructure configurations that maximize value and usability is challenged by two sources of uncertainty: 1) the nature of the built environment is variable (i.e., whether or not a natural hazard will occur during the infrastructure lifetime, or how costs might change over time); and 2) there is a lack of knowledge of decision maker preferences and values (e.g., design cost versus social impact tradeoffs). Although a number of decision analysis models exist that consider the uncertainty associated with the nature of the built environment (e.g., natural hazard events), they do not provide a systematic framework for including the uncertainties associated with the decision analysis process (e.g., lack of knowledge about decision maker preferences), together with decision maker requirements, in the decision analysis process. To address this knowledge gap, a three-phase modular decision analysis methodology is proposed. Module one uses a formal preference assessment methodology for assessing decision maker values with respect to a range of alternative design configurations. Module two applies an algorithm for assessing the reliability of alternative infrastructure design configurations with respect to the probabilistic decision criteria and decision maker requirements, and provides a meaningful feedback loop for understanding the decision analysis results (i.e., improving the value and usability of the alternative design configurations). Module three provides a systematic framework for incorporating both the random uncertainty associated with the built environment and the knowledge uncertainty associated with decision maker preferences, and tests the reliability of the decision analysis results under both (i.e., uncertain decision maker preferences and group decision making). The holistic decision analysis framework is tested on a nine-story office building in a seismic zone with the probabilistic decision criteria of building damage and business interruption costs, casualty costs, and CO2 emission costs. Twelve alternative design configurations, four decision makers, and random and knowledge sources of uncertainty are considered in the decision analysis methodology. Results indicate that the modular, holistic decision analysis framework allows decision makers to make more informed design decisions, and allows designers to better incorporate the preferences of the decision makers, during the early design process.
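As a rough illustration of module two's reliability idea for a single design alternative, the following sketch estimates the probability that sampled lifecycle outcomes meet a decision maker's acceptability threshold. The weighted-cost acceptability rule, the weights, the budget, and all cost distributions are hypothetical stand-ins for the assessed utility functions and indifference curves.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical lifecycle-cost distributions for one design alternative [$K];
# in the framework these come from seismic loss and lifecycle analyses.
damage   = rng.lognormal(np.log(900.0), 0.8, n)   # damage + business interruption
casualty = rng.lognormal(np.log(150.0), 1.0, n)   # casualty cost
co2      = rng.normal(146.0, 4.0, n)              # CO2 emission cost

# Hypothetical stand-in for an assessed indifference curve: the decision maker
# accepts outcomes whose weighted total cost stays below an equivalent budget.
w = np.array([1.0, 2.0, 1.0])      # assumed criteria weights (casualties weighted up)
total = w[0] * damage + w[1] * casualty + w[2] * co2
budget = 2_500.0                   # assumed acceptability threshold [$K]

p_meet = np.mean(total <= budget)
print(f"P(design meets the decision maker's preferences) ≈ {p_meet:.2f}")
```

Repeating this for each design configuration and each decision maker's threshold yields the kind of ranking by probability of preference satisfaction that the abstract reports.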
