About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Decision Making Strategies for Probabilistic Aerospace Systems Design

Borer, Nicholas Keith 24 March 2006
Modern aerospace systems design problems are often characterized by the necessity to identify and enable multiple tradeoffs. This can be accomplished by transformation of the design problem to a multiple objective optimization formulation. However, existing multiple criteria techniques can lead to unattractive solutions due to their basic assumptions; namely that of monotonically increasing utility and independent decision criteria. Further, it can be difficult to quantify the relative importance of each decision metric, and it is very difficult to view the pertinent tradeoffs for large-scale problems. This thesis presents a discussion and application of Multiple Criteria Decision Making (MCDM) to aerospace systems design and quantifies the complications associated with switching from single to multiple objectives. It then presents a procedure to tackle these problems by utilizing a two-part relative importance model for each criterion. This model contains a static and dynamic portion with respect to the current value of the decision metric. The static portion is selected based on an entropy analogy of each metric within the decision space to alleviate the problems associated with quantifying basic (monotonic) relative importance. This static value is further modified by examination of the interdependence of the decision metrics. The dynamic contribution uses a penalty function approach for any constraints and further reduces the importance of any metric approaching a user-specified threshold level. This reduces the impact of the assumption of monotonically increasing utility by constantly updating the relative importance of a given metric based on its current value. A method is also developed to determine a linearly independent subset of the original requirements, resulting in compact visualization techniques for large-scale problems.
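The "entropy analogy" for the static portion of the relative-importance model is not spelled out in the abstract. A common entropy-weight construction from the MCDM literature is sketched below in Python with a made-up decision matrix; criteria whose values vary widely across the candidate designs receive larger weights:

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Entropy-based weights for decision criteria (columns).

    Criteria whose values vary widely across alternatives carry
    more information and therefore receive larger weights.
    """
    X = np.asarray(decision_matrix, dtype=float)
    m, n = X.shape
    # Normalize each criterion column to a probability distribution.
    P = X / X.sum(axis=0)
    # Shannon entropy per criterion, scaled to [0, 1] by k = 1/ln(m).
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log(P), 0.0)
    e = -terms.sum(axis=0) / np.log(m)
    # Degree of divergence: low entropy means high information content.
    d = 1.0 - e
    return d / d.sum()

# Three candidate designs scored on four hypothetical metrics.
scores = [[1200, 0.92, 3.1, 45],
          [1100, 0.88, 2.7, 52],
          [1350, 0.95, 3.4, 40]]
print(entropy_weights(scores))
```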
2

Risk-based design of structures for fire

Al-Remal, Ahmad Mejbas January 2013
Techniques of performance-based design in fire safety have developed notably in the past two decades. One of the reasons for departing from prescriptive methods is the ability of performance-based methods to form a scientific basis for the cost-risk-benefit analysis of different fire safety alternatives. Apart from a few exceptions, observation of past fires has shown that the structure’s contribution to the overall fire resistance was considerably underestimated. The purpose of this research is to outline a risk-based design approach for structures in fire. Probabilistic methods are employed to ascertain uniform reliability indices in line with the classical trend in code development. Modern design codes for complex phenomena such as fire have been structured to facilitate design computations. Prescriptive design methods specify fire protection methods for structural systems based on laboratory-controlled and highly restrictive testing regimes. Those methods inherently assume that the tested elements behave similarly in real structures irrespective of their loading, location or boundary conditions. This approach is contested by many researchers, and analyses following fire incidents have indicated an alarming discrepancy between anticipated and actual structural behaviour during real fires. In formulating design and construction codes, code writers deal with the inherent uncertainties by setting a ceiling on the potential risk of failure. The latter process is implemented by specifying safety parameters that are derived via probabilistic techniques aimed at harmonising the risks ensuing from different load scenarios. The code structure addresses the probability of failure with adequate detail and accuracy. The other component of the risk metric, namely the consequence of failure, is a subjective field that assumes a multitude of variables depending on the context of the problem. In codified structural design, the severity of failure is implicitly embodied in the different magnitudes of safety indices applied to different modes of structural response. This project introduces a risk-based method for the design of structures in fire. It provides a coherent approach to a quantified treatment of risk elements that meets the demands of performance-based fire safety methods. A number of proposals are made for rational acceptable-risk and reliability parameters, in addition to a damage index with applications in structural fire safety design. Although the example application of the proposed damage index is a structure subjected to fire effects, the same rationale can easily be applied to the assessment of structural damage due to other effects.
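For context on the reliability indices mentioned above: in code calibration the reliability index β and the failure probability are linked by P_f = Φ(-β) under the usual Gaussian safety-margin assumption, and risk is the product of failure probability and consequence. A minimal sketch; the β value and the consequence cost are illustrative, not taken from the thesis:

```python
from scipy.stats import norm

# Reliability index beta and failure probability are linked by
# P_f = Phi(-beta) under the usual Gaussian safety-margin assumption.
def failure_probability(beta):
    return norm.cdf(-beta)

def reliability_index(p_f):
    return -norm.ppf(p_f)

# A target reliability index of 3.8 (a typical ultimate-limit-state
# value in structural codes) corresponds to P_f of about 7.2e-5.
beta = 3.8
p_f = failure_probability(beta)
risk = p_f * 1.0e6   # illustrative consequence cost -> expected loss
print(f"beta={beta}  P_f={p_f:.2e}  expected loss={risk:.1f}")
```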
3

Probabilistic Design and Reliability Analysis with Kriging and Envelope Methods

Hao Wu 26 April 2022
In the mechanical design stage, engineers always meet with uncertainty, such as random variables, stochastic processes, and random processes. Due to this uncertainty, products may behave randomly with respect to time and space, which may result in a high probability of failure, low lifetime, and low robustness. Although extensive research has been conducted on component reliability methods, time- and space-dependent system reliability methods are still limited. This dissertation is motivated by the need for efficient and accurate methods for addressing time- and space-dependent system reliability and probabilistic design problems. The objective of this dissertation is to develop efficient and accurate methods for reliability analysis and design. There are five research tasks toward this objective. The first research task develops a surrogate model with an active learning method to predict time- and space-independent system reliability. In the second research task, time- and space-independent system reliability is estimated by the second-order saddlepoint approximation method. In the third research task, time-dependent system reliability is addressed by an envelope method with efficient global optimization. In the fourth research task, a general time- and space-dependent problem is investigated: the envelope method converts the time- and space-dependent problem into a time- and space-independent one, and the second-order approximation is used to predict results. The last task proposes a new sequential reliability-based design with the envelope method for time- and space-dependent reliability. The accuracy and efficiency of the proposed methods are demonstrated through a wide range of mathematics problems and engineering problems.
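As background to the time-dependent tasks above: a system fails over a horizon [0, T] if its limit-state function drops below zero at any instant, i.e. P_f = P(min over t of g(X, t) < 0). The envelope method in the dissertation exists precisely to avoid the brute-force time discretization shown below; this is only a hypothetical Monte Carlo baseline with an invented limit state:

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x, t):
    """Hypothetical time-dependent limit state: capacity minus a
    degrading, fluctuating demand; g < 0 means failure at time t."""
    capacity, demand0 = x
    return capacity - demand0 * (1.0 + 0.05 * t + 0.1 * np.sin(2 * t))

# Time-dependent failure over [0, T]: the system fails if g dips
# below zero at ANY instant, i.e. if min_t g(X, t) < 0.
t_grid = np.linspace(0.0, 5.0, 200)
n = 100_000
capacity = rng.normal(10.0, 1.0, n)
demand0 = rng.normal(6.0, 0.8, n)

g_min = np.min(g((capacity[:, None], demand0[:, None]), t_grid), axis=1)
p_f = np.mean(g_min < 0.0)
print(f"time-dependent P_f over [0, 5] ~ {p_f:.4f}")
```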
4

Metamodel-Based Probabilistic Design for Dynamic Systems with Degrading Components

Seecharan, Turuna Saraswati January 2012
The probabilistic design of dynamic systems with degrading components is difficult. Design of dynamic systems typically involves the optimization of a time-invariant performance measure, such as energy, that is estimated using a dynamic response, such as angular speed. The mechanistic models developed to approximate this performance measure are too complicated to be used with simple design calculations and lead to lengthy simulations. When degradation of the components is assumed, estimation of the failure probability over the product lifetime is required in order to determine suitable service times. Again, complex mechanistic models lead to lengthy lifetime simulations when the Monte Carlo method is used to evaluate probability. To address these problems, an efficient methodology is presented for the probabilistic design of dynamic systems and for estimating the cumulative distribution function of the time to failure of a performance measure when degradation of the components is assumed. The four main steps are: 1) transforming the dynamic response into a set of static responses at discrete cycle-time steps and using Singular Value Decomposition to efficiently estimate a time-invariant performance measure that is based upon a dynamic response; 2) replacing the mechanistic model with an approximating function, known as a “metamodel”; 3) searching for the best design parameters using fast integration methods such as the First Order Reliability Method; and 4) building the cumulative distribution function by summing the incremental failure probabilities, estimated using the set-theory method, over the planned lifetime. The first step of the methodology uses design of experiments or sampling techniques to select a sample of training sets of the design variables. These training sets are then input to the computer-based simulation of the mechanistic model to produce a matrix of corresponding responses at discrete cycle-times. Although metamodels can be built at each time-specific column of this matrix, this method is slow, especially if the number of time steps is large. An efficient alternative uses Singular Value Decomposition to split the response matrix into two matrices containing only design-variable-specific and time-specific information. The second step of the methodology fits metamodels only for the significant columns of the matrix containing the design-variable-specific information. Using the time-specific matrix, a metamodel is quickly developed at any cycle-time step or for any time-invariant performance measure such as energy consumed over the cycle-lifetime. In the third step, design variables are treated as random variables and the First Order Reliability Method is used to search for the best design parameters. Finally, the components most likely to degrade are modelled using either a degradation path or a marginal distribution model and, using the First Order Reliability Method or Monte Carlo Simulation to estimate probability, the cumulative failure probability is plotted. The speed and accuracy of the methodology using three metamodels, the Regression model, Kriging and the Radial Basis Function, is investigated. This thesis shows that the metamodel offers a significantly faster yet accurate alternative to using mechanistic models, both for probabilistic design optimization and for estimating the cumulative distribution function.
For design using the First Order Reliability Method to estimate probability, the Regression Model is the fastest and the Radial Basis Function is the slowest. Kriging is shown to be accurate and faster than the Radial Basis Function, but its computation time is still slower than the Regression Model's. When estimating the cumulative distribution function, metamodels are more than 100 times faster than the mechanistic model, and the error is less than ten percent compared with the mechanistic model. Kriging and the Radial Basis Function are more accurate than the Regression Model, and computation time is faster using Monte Carlo Simulation to estimate probability than using the First Order Reliability Method.
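The SVD step in the abstract can be sketched compactly in numpy: factor the response matrix, keep the significant modes, and fit one small metamodel per retained mode. Here a toy analytical response stands in for the mechanistic simulation and plain least squares stands in for the metamodel; both are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Training responses: each row is one design's dynamic response
# sampled at discrete cycle-time steps (here a toy speed profile).
n_designs, n_times = 50, 400
X = rng.uniform([0.5, 1.0], [2.0, 3.0], size=(n_designs, 2))  # design vars
t = np.linspace(0, 10, n_times)
R = X[:, [0]] * np.exp(-0.2 * t) + X[:, [1]] * np.sin(t)       # response matrix

# SVD splits R into design-specific (U*S) and time-specific (Vt) parts.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2                                   # significant modes only
coeffs = U[:, :k] * s[:k]               # one small metamodel target per mode

# Fit a metamodel (here plain least squares) per retained mode.
A = np.column_stack([np.ones(n_designs), X])
betas = [np.linalg.lstsq(A, coeffs[:, j], rcond=None)[0] for j in range(k)]

# Predict the full time history of a new design from k tiny models.
x_new = np.array([1.0, 2.0])
c_new = np.array([np.array([1.0, *x_new]) @ b for b in betas])
r_new = c_new @ Vt[:k]
truth = x_new[0] * np.exp(-0.2 * t) + x_new[1] * np.sin(t)
print("max prediction error:", np.abs(r_new - truth).max())
```

Fitting k small models instead of one model per time step is what makes the lifetime simulation tractable when the number of cycle-time steps is large.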
5

A Study On The Reliability Analysis During Preliminary Design - A Rocket Motor Example

Bozkaya, Kenan 01 September 2006
To be competitive in the market, it is very important to design cost-effective and reliable products. For this purpose, it is necessary to consider reliability as an integral part of the design procedure. Reliability, a design parameter that affects the cost and safety of a system, should therefore be taken into consideration in the early phases, since it is very difficult to change the design in later phases. The reliability of a rocket motor can be evaluated by reliability testing, but these tests are very expensive and difficult, since the tests are destructive and the test sample size is determined by the binomial law. Because of the difficulties in reliability testing, reliability in the early design phases can be evaluated using reliability prediction results. This thesis applies a probabilistic approach to a solid rocket motor design to evaluate its reliability in the preliminary design phase. The study aims to assess the solid rocket motor's ballistic performance reliability and casing structural reliability, determine the important parameters affecting the solid rocket motor's reliability, and find a new design point to improve the reliability. Variations in dimensions and material properties are considered as the sources of failure, and the limit states for acceleration, total impulse and maximum stress in the casing are approximated with the response surface method by considering these variations. With the response surface functions, Monte Carlo simulation is used to assess the failure probability and the distributions of the rocket motor performance. Besides the assessment of reliability, the capability of the response surface functions to estimate the rocket motor performance, and the effects of the input parameters on the rocket motor performance and performance variation, are also examined. By considering the effects of the input parameters, a new design point is proposed to decrease the total probability of failure.
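The workflow in this abstract, fitting a response surface to a few expensive runs and then Monte Carlo sampling the cheap surface, follows a standard pattern that can be sketched as below. The stress function, tolerances and allowable stress are invented stand-ins, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(2)

def max_casing_stress(thickness, strength_scale):
    """Stand-in for the expensive structural model of the motor casing."""
    return 480.0 * strength_scale / thickness**1.5

# 1) Small designed sample of the inputs (the expensive evaluations).
t_s = rng.uniform(0.9, 1.1, 30)
k_s = rng.uniform(0.9, 1.1, 30)
y_s = max_casing_stress(t_s, k_s)

# 2) Fit a quadratic response surface to the sampled stresses.
def features(t, k):
    return np.column_stack([np.ones_like(t), t, k, t * k, t**2, k**2])

beta, *_ = np.linalg.lstsq(features(t_s, k_s), y_s, rcond=None)

# 3) Monte Carlo on the cheap surface: P(stress > allowable).
n = 200_000
t_mc = rng.normal(1.0, 0.02, n)       # dimensional tolerance
k_mc = rng.normal(1.0, 0.03, n)       # material-property scatter
stress = features(t_mc, k_mc) @ beta
print("P_f ~", np.mean(stress > 520.0))
```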
6

Parametric study of manifolds using finite element analysis

Bäckström, Kristoffer January 2008
Volvo Aero Corporation takes part in a development project called the Future Launchers Preparatory Programme (FLPP), which aims to develop Next Generation Launchers (NGL) for future space flights. FLPP involves several projects, and one of these is focused on the development of the next generation of rocket engines for the NGL.

The environment of a rocket engine is extremely hostile, characterized by high pressure levels and rapid thermal transients. Even though the components are manufactured from superalloys, the life of these components is measured in seconds. In the light of these facts, it is obvious that all components have to be optimized to the last detail. This thesis work is part of the optimization procedure, with the objective of performing a parametric study of manifolds that will be particularly useful during the concept work on the turbines for the FLPP programme.

The methods of probabilistic analysis have been employed in this study. This approach involves Ishikawa (cause-and-effect) analysis as well as deriving transfer functions by defining and performing simulations in a structured manner according to a Design of Experiments model. Transfer functions, which are derived through a series of Finite Element Analyses, describe the relation between design parameters and stress levels. A transfer function can be considered a simplified physical model that is applicable only within the range of the design parameters used. The use of transfer functions is especially powerful when performing Monte Carlo simulations to determine the likelihood of plasticity.

One shortcoming of transfer functions is that only the parameters included from the beginning can be altered and assessed. One also has to consider the simplifications introduced through the modelling; for example, transfer functions derived using linear-elastic simulations cannot be used for the assessment of plastic deformations. The method developed in this thesis will be further developed in following studies. This report is therefore meant to serve as a guide for the next investigator at Volvo Aero Corporation.
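A hedged sketch of the transfer-function workflow described above: a two-level full-factorial DOE over the design parameters, a linear transfer function fitted to the (stand-in) FE results, and a Monte Carlo estimate of the likelihood of plasticity. The thin-shell stress formula, parameter ranges and yield limit are invented for illustration:

```python
import itertools
import numpy as np

rng = np.random.default_rng(3)

def fe_stress(radius, thickness, pressure):
    """Stand-in for one linear-elastic FE run of the manifold (MPa)."""
    return pressure * radius / thickness   # thin-shell hoop stress

# Two-level full-factorial DOE over the three design parameters.
levels = {"radius": (90.0, 110.0), "thickness": (4.0, 6.0), "pressure": (18.0, 22.0)}
runs = np.array(list(itertools.product(*levels.values())))
stresses = fe_stress(runs[:, 0], runs[:, 1], runs[:, 2])

# Linear transfer function stress ~ b0 + b.x fitted to the DOE runs;
# valid only inside the parameter ranges spanned by the DOE.
A = np.column_stack([np.ones(len(runs)), runs])
b, *_ = np.linalg.lstsq(A, stresses, rcond=None)

# Monte Carlo on the transfer function: likelihood of plasticity.
n = 100_000
samples = np.column_stack([
    rng.normal(100.0, 2.0, n),   # radius scatter
    rng.normal(5.0, 0.15, n),    # thickness scatter
    rng.normal(20.0, 1.0, n),    # pressure scatter
])
sigma = np.column_stack([np.ones(n), samples]) @ b
print("P(stress > yield) ~", np.mean(sigma > 450.0))
```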
7

Probabilistic safety analysis of dams

Kassa, Negede Abate 04 October 2010
Successful dam design involves generating technical solutions that can meet the intended functional objectives and choosing the best among the alternative technical solutions. The process of choosing the best among the alternative technical solutions depends on evaluating design conformance with technical specifications and reliability standards (such as capacity, environmental, safety, social and political specifications). The process also involves evaluating whether an optimal balance has been set between safety and economy. Evaluating alternative design solutions requires generating a quantitative expression for lifetime performance and safety. An objective and numerical evaluation of the lifetime performance and safety of dams is an essential but complex undertaking. Its domain involves much uncertainty (uncertainty in loads, hazards, strength parameters, boundary conditions, models and dam failure consequences), all of which should be characterized. Arguably, uncertainty models and risk analysis provide the most complete characterization of dam performance and safety issues. Risk is a combined measure of the probability and severity of an adverse effect (functional and/or structural failure), and is often estimated by the product of the probability of the adverse event occurring and the expected consequences. Thus, risk analysis requires (1) determination of failure probabilities and (2) probabilistic estimation of consequences. Nonetheless, there is no adequately demonstrated, satisfactorily comprehensive and precise method for the explicit treatment and integration of all uncertainties in the variables of dam design and risk analysis. Therefore, there is a need to evaluate existing uncertainty models for their applicability, to identify knowledge and realization gaps, to derive or adopt new approaches and tools, and to adequately demonstrate their practicability using real-life case studies. This is required not only for improving the accuracy of the performance and safety evaluation process but also for gaining better acceptance of probabilistic approaches among those who have made deterministic-design-based research and engineering practice their lifelong career. These problems motivated this research. In this research the following have been accomplished: (1) Identified various ways of analyzing and representing uncertainty in dam design parameters pertinent to three dominant dam failure causes (sliding, overtopping and seepage), and tested a suite of stochastic models capable of capturing design parameter uncertainty to better facilitate the evaluation of failure probabilities; (2) Studied three classical stochastic models, the Monte Carlo Simulation Method (MCSM), First Order Second Moment (FOSM) and Second Order Second Moment (SOSM), and applied them to modeling dam performance and evaluating failure probabilities for the above-mentioned dominant dam failure causes; (3) Presented a new, exact, purpose-built analytical method for transforming design parameter distributions into a distribution representing dam performance (the Analytical Solution for finding Derived Distributions (ASDD) method), laid out proofs of its basic principles, prepared a generic implementation architecture, and demonstrated its applicability for the three failure modes using real-life case study data; (4) Presented a multitude of tailor-made reliability equations and solution procedures that enable the implementation of the above stochastic and analytical methods for failure probability evaluation; (5) Implemented the stochastic and analytical methods using real-life data pertinent to the three failure mechanisms from Tendaho Dam, Ethiopia, and compared the performance of the various stochastic and analytical methods with each other and with the classical deterministic design approach; and (6) Provided solution procedures, implementation architectures, and Mathematica 5.2, Crystal Ball 7 and spreadsheet-based tools for doing the above-mentioned analysis. The results indicate that: (1) The proposed approaches provide a valid set of procedures and internally consistent logic, and produce more realistic solutions; using these approaches, engineers can design dams to meet a quantified level of performance (volume of failure) and can set a balance between safety and economy; (2) The research should bridge the gap between the available probability theories on the one hand and the distribution problems that beset dam safety evaluation on the other; (3) Of the suite of stochastic approaches studied, the ASDD method outperforms the classical methods (MCSM, FOSM and SOSM) in its theoretical foundation, accuracy and reproducibility; compared with the deterministic approach, each of the stochastic approaches provides a valid set of procedures and consistent logic and gives a more realistic solution, although it remains good practice to compare results across the proposed probabilistic approaches; (4) The different tailor-made reliability equations and solution approaches followed are proven to work for the stochastic safety evaluation of dams; and (5) The research has drawn some important conclusions and lessons in relation to the stochastic safety analysis of dams against the three dominant failure mechanisms. The end result of the study should provide dam engineers and decision makers with perspectives, methodologies, techniques and tools that help them better understand dam safety issues and enable them to conduct quantitative safety analysis and thus make intelligent dam design, upgrading and rehabilitation decisions.
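Of the classical stochastic models named above, FOSM is the easiest to show: propagate the input means and standard deviations through a first-order Taylor expansion of the limit-state function, then β = μ_g/σ_g and P_f = Φ(-β). The sliding limit state and its numbers below are hypothetical, not Tendaho Dam data:

```python
import numpy as np
from scipy.stats import norm

def fosm(g, mu, sigma, h=1e-6):
    """First Order Second Moment: propagate input means/std-devs
    through a first-order Taylor expansion of the limit state g."""
    mu = np.asarray(mu, float)
    g0 = g(mu)
    grad = np.array([
        (g(mu + h * np.eye(len(mu))[i]) - g0) / h for i in range(len(mu))
    ])
    var_g = np.sum((grad * np.asarray(sigma)) ** 2)   # independent inputs
    beta = g0 / np.sqrt(var_g)
    return beta, norm.cdf(-beta)

# Hypothetical sliding limit state: resisting minus driving force.
def g_slide(x):
    friction_coeff, weight, horizontal_load = x
    return friction_coeff * weight - horizontal_load

beta, p_f = fosm(g_slide, mu=[0.6, 5000.0, 2400.0], sigma=[0.08, 250.0, 300.0])
print(f"FOSM: beta={beta:.2f}, P_f={p_f:.2e}")
```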
8

Evaluation and design of optimum support systems in South African collieries using the probabilistic design approach

Canbulat, Ismet 28 July 2008
This thesis addresses the problem of designing roof support systems in coal mines. When designing roof support, it is necessary to account for the uncertainties that inherently exist within the rock mass and the support elements. The performance of a support system is affected by these uncertainties, which are not taken into account in the design methodologies currently used in South Africa. This study sets out to develop a method that takes all uncertainties into account and quantitatively provides a risk-based design. Despite the fact that roof bolting is probably one of the most researched aspects of coal mine ground control, falls of ground still remain the single major cause of fatalities and injuries in South African collieries. Five main support design methodologies have been used: analytical modelling, numerical modelling, physical modelling, design based on geotechnical rating systems, and field testing. As part of this study, it is shown that there are many elements of a support system that can affect the support and roof behaviour in a coal mine, and that the characteristics of these elements, as well as the interactions between them, are complex and can vary significantly within a short distance. These variations account for the uncertainties in coal mine roof support, and they are usually not taken into account in the above design methodologies, resulting in falls of ground and/or over-design of support systems. The roof and support behaviour were monitored at 29 sites at five collieries. It was found that there was no evidence of a dramatic increase in the stable elevations as experienced in some overseas collieries. A roadway-widening experiment was carried out to establish the critical roof displacements. The maximum width attained was 12 m, at which stage 5 mm of displacement was measured. During the monitoring period no roof falls occurred at any of the 29 sites or at the roadway-widening experiment site, even where 12 mm displacements were measured. The in situ monitoring programme was continued at an additional 26 monitoring stations in 13 sites with the aim of establishing the effect of unsupported cut-out distance on roof and support performance. The results showed that the lithological composition of the roof strata plays a major role in the amount of deflection that was recorded. Bedding separation was seen to occur at the contacts between different strata types. It is concluded that the roof behaved like a set of composite beams with different characteristics. It was also found that the amounts of deflection corresponded with the deflection that would be expected from gravity-loaded beams. During this monitoring programme the variable nature of roof and support systems was also demonstrated. As many mines use different geotechnical rating systems, an evaluation of the currently used classification techniques was conducted to determine their effectiveness in the design of roof support strategies. It was found that the currently used systems cannot quantitatively determine the required support system in a given geotechnical environment. Impact splitting tests were found to be the appropriate system for South African conditions. It is, however, concluded that the roof lithology, stress regime and roof characteristics can change within metres in a production section. Therefore, in order to predict these changing conditions, many boreholes would be required for a section, which would be costly and time consuming.
An in-depth study of the roof support elements was conducted for the purpose of understanding the fundamental mechanisms of roof support systems and developing guidelines for their improvement. All of the currently available roof bolt support elements and related machinery were evaluated using in situ short-encapsulated pull tests. The results showed that, on average, the bond strengths obtained from roof bolts supplied by different manufacturers can vary by as much as 28 per cent. Tests conducted on different resins showed that the strength of the resin currently being used in South Africa is adequate. Differences between commonly used bit types were established: the 2-prong bit outperforms the spade bit in sandstone and shale rock types. In addition, the effect of hole annulus was investigated as part of this study; the results show that an annulus between 2.5 mm and 3.8 mm resulted in the most effective bond strengths. The effect of wet and dry drilling was also noted: bond strengths and overall support stiffnesses are greater with the use of wet drilling for all resin types. The results from the tests in different rock types highlighted very distinct differences between bolt system performances. Quality control procedures for compliance with the design, the support elements and the quality of installation are presented, together with recommendations for improving quality control measures and for developing testing procedures for bolt system components, installation quality and resin performance. Finally, a roof support design methodology has been developed and presented that takes into account all the natural variations that exist within the rock mass and the mining process. This was achieved by adapting a probabilistic design approach using the well-established stochastic modelling technique. The methodology enables rock engineers to design roof support systems with greater confidence and should result in the safer and more economic extraction of coal reserves.
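The probabilistic design approach in the final paragraph reduces, at its core, to a capacity-versus-demand simulation: sample the scattered support and rock-mass parameters and count the fraction of trials in which demand exceeds capacity. All numbers below (bolt pattern, bond-strength scatter, suspended strata) are invented for illustration, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(4)

# Probabilistic check of a roof bolt pattern: the support fails when
# the suspended dead-weight load exceeds the combined bolt capacity.
n = 500_000
bond_strength = rng.normal(180.0, 25.0, (n, 4))      # kN per bolt, with wide
bond_strength = bond_strength.clip(min=0.0)          # scatter as in pull tests
capacity = bond_strength.sum(axis=1)                 # 4 bolts per row

roof_thickness = rng.normal(3.0, 0.8, n).clip(min=0.0)   # m of loose strata
load = roof_thickness * 6.0 * 1.0 * 25.0             # 6 m roadway x 1 m row
p_f = np.mean(capacity < load)                       # spacing x 25 kN/m^3
print(f"P(support demand exceeds capacity) ~ {p_f:.1e}")
```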
