1 |
Comparing Probabilistic and Fuzzy Set Approaches for Designing in the Presence of Uncertainty. Chen, Qinghong, 18 September 2000
Probabilistic models and fuzzy set models describe different aspects of uncertainty. Probabilistic models primarily describe random variability in parameters. In engineering system safety, examples are variability in material properties, geometrical dimensions, or wind loads. In contrast, fuzzy set models of uncertainty primarily describe vagueness, such as vagueness in the definition of safety.
When there is only limited information about variability, it is possible to use probabilistic models by making suitable assumptions about the statistics of the variability. However, it has been repeatedly shown that this can entail serious errors. Fuzzy set models, which require little data, appear well suited to design under uncertainty when little is known about the uncertainty. Several studies have compared fuzzy set and probabilistic methods in analyzing the safety of systems under uncertainty. However, no study has compared the two approaches systematically as a function of the amount of available information. Such a comparison, in the context of design against failure, is the objective of this dissertation.
First, the theoretical foundations of probability and possibility theories are compared. We show that a major difference between probability and possibility is in the axioms about the union of events. Because of this difference, probability and possibility calculi are fundamentally different and one cannot simulate possibility calculus using probabilistic models. We also show that possibility-based methods tend to be more conservative than probability-based methods in systems that fail only if many unfavorable events occur simultaneously.
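As a hedged illustration of the axiomatic difference noted above (illustrative notation, not reproduced from the dissertation), probability is additive over unions while possibility is maxitive, and for a failure that requires several unfavorable events to occur together the two calculi diverge sharply: probabilities of independent events multiply and shrink, whereas possibilities of non-interactive events combine through a minimum and stay large, which is why possibility-based assessments tend to be the more conservative.

```latex
P(A \cup B) = P(A) + P(B) - P(A \cap B)
\qquad\text{vs.}\qquad
\Pi(A \cup B) = \max\{\Pi(A),\, \Pi(B)\}

P\!\left(\textstyle\bigcap_{i=1}^{n} E_i\right) = \prod_{i=1}^{n} P(E_i)
\;\;(\text{independent } E_i)
\qquad\text{vs.}\qquad
\Pi\!\left(\textstyle\bigcap_{i=1}^{n} E_i\right) = \min_{i} \Pi(E_i)
\;\;(\text{non-interactive } E_i)
```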
Based on these theoretical observations, two design problems are formulated to demonstrate the strengths and weaknesses of probabilistic and fuzzy set methods. We consider the design of a tuned damper system and the design and construction of domino stacks. These problems contain narrow failure zones in their uncertain variables and are tailored to demonstrate the pitfalls of probabilistic methods when little information is available about the uncertain variables.
Using these design problems, we demonstrate that probabilistic methods are better than possibility-based methods when sufficient information is available. Just as importantly, we show that possibility-based methods can be better when little information is available. Our conclusion is that when there is little information available about uncertainties, a hybrid method should be used to ensure a safe design. / Ph. D.
|
2 |
Systematic Design of Bulk Recycling Systems under Uncertainty. Wei, Jing, 13 May 2004
The fast-growing waste stream of electronic and other complex consumer products is making the bulk recycling problem an important environmental protection issue. These products must be recycled because they contain hazardous materials such as lead and mercury. The focus of this thesis is the development of systematic methods for designing systems to recover mixed plastics from electronic products such as computers and televisions.
Bulk recycling systems are similar to other chemical engineering process systems, so they can be synthesized and designed using existing techniques that have been applied to distillation and reaction systems. However, uncertainties from various sources, such as variation in component fractions and product prices, make it crucial to design a flexible and sustainable system; accounting for them is a major challenge in this research. Another challenge is that plastics can be separated by different mechanisms exploiting different properties, yet separating a plastics mixture often requires combining several methods because their differentiating properties can overlap. Many decisions must therefore be made, including which methods to choose and how to connect them.
To address the problem systematically, the design-under-uncertainty problem was formulated as a stochastic Mixed Integer Nonlinear Program (sMINLP). A Sample Average Approximation (SAA) method wrapped around the Outer Approximation method was developed in this thesis to solve such problems efficiently, so that large design-under-uncertainty problems can be solved without intractable computational difficulty. To allow choosing among separation methods based on different mechanisms, this research modeled various plastics separation methods, taking into account the distribution of particle properties, and unified them using a canonical partition curve representation. Finally, an overall design method was proposed to incorporate the design of size reduction units into the separation system.
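As a hedged sketch of the formulation (illustrative notation, not taken from the thesis), sample average approximation replaces the expected objective of the stochastic MINLP with an average over N sampled realizations ξ_i of the uncertain parameters, yielding a deterministic MINLP that an outer-approximation algorithm can solve; here x denotes continuous design variables and y the binary unit-selection and connection decisions:

```latex
\min_{x,\; y \in \{0,1\}^m} \; \mathbb{E}_{\xi}\bigl[f(x, y, \xi)\bigr]
\;\;\approx\;\;
\min_{x,\; y \in \{0,1\}^m} \; \frac{1}{N} \sum_{i=1}^{N} f(x, y, \xi_i)
\qquad \text{s.t.}\;\; g(x, y, \xi_i) \le 0,\;\; i = 1, \dots, N
```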
This research is the first formal development of a systematic method in this area to account for uncertainties and interactions between process steps.
|
3 |
Efficient Nonlinear Optimization with Rigorous Models for Large Scale Industrial Chemical Processes. Zhu, Yu, May 2011
Large scale nonlinear programming (NLP) has proven to be an effective framework for obtaining profit gains through optimal process design and operations in chemical engineering. While the classical SQP and Interior Point methods have been successfully applied to solve many optimization problems, the focus of both academia and industry on larger and more complicated problems requires further development of numerical algorithms which can provide improved computational efficiency.
The primary purpose of this dissertation is to develop effective problem formulations and advanced numerical algorithms for the efficient solution of these challenging problems. As problem sizes increase, there is a need for tailored algorithms that can exploit problem-specific structure. Furthermore, computer chip manufacturers are no longer focusing on increased clock speeds, but rather on hyperthreading and multi-core architectures. Therefore, to see continued performance improvement, we must focus on algorithms that can exploit emerging parallel computing architectures. In this dissertation, we develop an advanced parallel solution strategy for nonlinear programming problems with block-angular structure. The effectiveness of this strategy and of modern off-the-shelf tools is demonstrated on a wide range of problem classes.
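A hedged sketch of the structure such a parallel strategy exploits (illustrative notation, not the dissertation's): in a multiscenario or multiperiod formulation, each scenario s has its own variables x_s and constraints, and the scenarios are coupled only through shared design variables d, so the constraint Jacobian is bordered block-diagonal and the scenario blocks can be factorized in parallel:

```latex
\min_{d,\; x_1, \dots, x_S} \;\; \sum_{s=1}^{S} w_s\, f_s(x_s, d)
\qquad \text{s.t.}\;\; c_s(x_s, d) = 0, \;\; s = 1, \dots, S
\qquad\Longrightarrow\qquad
\nabla c =
\begin{bmatrix}
A_1 &        &     & B_1 \\
    & \ddots &     & \vdots \\
    &        & A_S & B_S
\end{bmatrix}
```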
Here, we treat optimal design, optimal operation, dynamic optimization, and parameter estimation. Two case studies (air separation units and heat-integrated columns) are investigated to deal with design under uncertainty with rigorous models.
For optimal operation, this dissertation takes cryogenic air separation units as a primary case study and focuses on formulations for handling uncertain product demands, contractual constraints on customer satisfaction levels, and variable power pricing. Multiperiod formulations provide operating plans that consider inventory to meet customer demands and improve profits.
In the area of dynamic optimization, optimal reference trajectories are determined for load changes in an air separation process. A multiscenario programming formulation is again used, this time with large-scale discretized dynamic models.
Finally, to emphasize a different decomposition approach, we address a problem with significant spatial complexity: unknown water demands within a large-scale city-wide distribution network are estimated. This problem provides a different decomposition mechanism than the multiscenario or multiperiod problems; nevertheless, our parallel approach provides effective speedup.
|
4 |
Development and Use of a Spatially Accurate Polynomial Chaos Method for Aerospace Applications. Schaefer, John Anthony, 24 January 2023
Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computing is used to support these tasks, sources of uncertainty may include the freestream flight conditions of a vehicle, physical modeling parameters, geometric fidelity, numerical error, and model-form uncertainty, among others. Moreover, while some uncertainties may be treated as probabilistic, aleatory sources, other uncertainties are non-probabilistic and epistemic due to a lack of knowledge, and cannot be rigorously treated using classical statistics or Bayesian approaches. An additional complication for propagating uncertainty is that many aerospace scientific computing tools may be computationally expensive; for example, a single high-fidelity computational fluid dynamics solution may require several days or even weeks to complete. It is therefore necessary to employ uncertainty propagation strategies that require as few solutions as possible.
The Non-Intrusive Polynomial Chaos (NIPC) method has grown in popularity in recent decades due to its ability to propagate both aleatory and epistemic parametric sources of uncertainty in a computationally efficient manner. While traditional Monte Carlo methods might require thousands to millions of function evaluations to achieve statistical convergence, NIPC typically requires tens to hundreds for problems with similar numbers of uncertain dimensions. Despite this efficiency, NIPC is limited in one important aspect: it can only propagate uncertainty at a particular point in a design space or flight envelope. For optimization or aerodynamic database problems that require uncertainty estimates at many more than one point, the use of NIPC quickly becomes computationally intractable.
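A hedged sketch of the expansions involved (illustrative notation, not reproduced from the dissertation): a non-intrusive polynomial chaos expansion approximates a response Q with a series of orthogonal polynomials Ψ_i in the standardized uncertain inputs ξ, valid at a single flight or design condition; the spatially accurate extension described below can be viewed as letting the coefficients vary with the flight-envelope or design-space variables s, so one regression covers the whole envelope:

```latex
Q(\xi) \;\approx\; \sum_{i=0}^{P} \alpha_i\, \Psi_i(\xi)
\qquad\longrightarrow\qquad
Q(s, \xi) \;\approx\; \sum_{i=0}^{P} \alpha_i(s)\, \Psi_i(\xi)
```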
This dissertation introduces a new method entitled Spatially Accurate Polynomial Chaos (SAPC) that extends the original NIPC approach for the spatial regression of aleatory and epistemic parametric sources of uncertainty. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively, the results suggest that SAPC may be useful for a large variety of engineering applications. / Doctor of Philosophy / Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computer simulations are used to support these tasks, sources of uncertainty may include the speed of an aerospace vehicle, the direction of the wind, physical modeling constants or assumptions, and the vehicle shape, among others. As a result of these sources of uncertainty, assessments of vehicle performance are also uncertain. For example, if the speed of a vehicle is not known precisely, then computer simulations will predict a lift force which is also imprecisely known.
A challenge when assessing the uncertainty in aerospace vehicle performance is that the computer simulations which predict performance may take a long time to run, even on state-of-the-art supercomputers. Traditional statistical methods may require thousands or millions of simulations for the prediction of uncertainty, which does not fit within the computational budget of most aerospace analyses. A newer method called Non-Intrusive Polynomial Chaos (NIPC) is more efficient, typically requiring only tens to hundreds of simulations; however, NIPC only provides uncertainty estimates at a single point in an aircraft flight envelope or design condition.
In this dissertation, a new method called Spatially Accurate Polynomial Chaos (SAPC) is developed. The SAPC method combines desirable features of NIPC with regression methods for an efficient estimation of uncertainty throughout a vehicle flight envelope or design space. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively the results suggest that SAPC may be useful for a large variety of engineering applications.
|
5 |
Computational and Machine Learning-Reinforced Modeling and Design of Materials under Uncertainty. Hasan, Md Mahmudul, 5 July 2023
The component-level performance of materials is fundamentally determined by the underlying microstructural features. Therefore, designing high-performance materials using multi-scale models plays a significant role in improving the predictability, reliability, proper functioning, and longevity of components for a wide range of applications in the fields of aerospace, electronics, energy, and structural engineering. This thesis aims to develop new methodologies to design microstructures under inherent material uncertainty by incorporating machine learning techniques. To achieve this objective, the study addresses gradient-based and machine learning-driven design optimization methods to enhance homogenized linear and non-linear properties of polycrystalline microstructures. However, variations arising from the thermo-mechanical processing of materials affect microstructural features and properties by propagating over multiple length scales. To quantify this inherent microstructural uncertainty, this study introduces a linear programming-based analytical method. When this analytical uncertainty quantification formulation is not applicable (e.g., for uncertainty propagation on non-linear properties), a machine learning-based inverse design approach is presented to quantify the microstructural uncertainty. Example design problems are discussed for different polycrystalline systems (e.g., Titanium, Aluminum, and Galfenol). Though conventional machine learning performs well when used for designing microstructures or modeling material properties, its predictions may still fail to satisfy design constraints associated with the physics of the system. Therefore, a physics-informed neural network (PINN) is developed to incorporate the problem physics in the machine learning formulation. In this study, a PINN model is built and integrated into materials design to study the deformation processes of Copper and a Titanium-Aluminum alloy. / Doctor of Philosophy / Microstructure-sensitive design is a high-throughput computational approach for materials design, in which material performance is improved through the control and design of microstructures. It enhances component performance and, subsequently, the overall system's performance at the application level. This thesis aims to design microstructures for polycrystalline materials such as Galfenol, Titanium-Aluminum alloys, and Copper to obtain the mechanical properties desired for certain applications. An advantage of the microstructure-sensitive design approach is that multiple microstructures can be suggested that provide similar values of the design parameters, so manufacturers can follow any of these microstructure designs to fabricate materials with the desired properties. Moreover, the microstructure uncertainty arising from variations in thermo-mechanical processing and in the measurement of experimental data is quantified. It is necessary to address the resultant randomness of the microstructure because it can alter the expected mechanical properties. To check the manufacturability of proposed microstructure designs, a physics-informed machine learning model is developed to build a relation between the process, microstructure, and material properties. This model can be used to solve the process design problem of identifying the processing parameters that achieve a given or desired microstructure.
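As a hedged illustration of the physics-informed neural network idea mentioned above (a generic formulation, not the specific model developed in the thesis), the network u_θ is trained on a composite loss that penalizes both mismatch with data and the residual R of the governing physics, so predictions are driven toward satisfying the problem physics as well as the measurements:

```latex
\mathcal{L}(\theta) \;=\;
\frac{1}{N_d}\sum_{i=1}^{N_d} \bigl\| u_\theta(x_i) - u_i \bigr\|^2
\;+\;
\lambda\, \frac{1}{N_r}\sum_{j=1}^{N_r} \bigl\| \mathcal{R}\bigl[u_\theta\bigr](x_j) \bigr\|^2
```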
|
6 |
Optimization Techniques for Pharmaceutical Manufacturing and Design Space Analysis. Laky, Daniel Joseph, 21 July 2022
In this dissertation, numerical analysis frameworks and software tools for digital design of process systems are developed, with a focus on the pharmaceutical manufacturing space. Batch processing represents the traditional and still predominant pathway to manufacture pharmaceuticals in both the drug substance and drug product spaces. Drug substance processes start with raw materials or precursors to produce an active pharmaceutical ingredient (API) through synthesis and purification. Drug product processes take this pure API in powder form, add excipients, and process the powder into consumer doses such as capsules or tablets. Continuous manufacturing has allowed many other chemical industries to take advantage of real-time process management through process control, process optimization, and real-time detection of off-spec material. The possibility of reducing total cleaning time of units and encouraging green chemistry through solvent reduction or recycling also makes continuous manufacturing an attractive alternative to batch manufacturing. However, to fully understand and take advantage of real-time process management, digital tools are required, both as soft sensors during process control and during process design and optimization.

Since the shift from batch to continuous manufacturing will proceed in stages, processes will likely combine continuous and batch unit operations, which we call hybrid pharmaceutical manufacturing routes. Even though these processes will soon become common in the industry, digital tools that support comparison of batch, hybrid, and continuous manufacturing routes in the pharmaceutical space are lacking, especially for hybrid routes. For this reason, PharmaPy, an open-source tool for pharmaceutical process development, was created to enable rapid in-silico design of hybrid pharmaceutical processes. Throughout this work, the focus is on analyzing alternative operating modes within the drug substance manufacturing context. First, the mathematical models for PharmaPy's synthesis, crystallization, and filtration units are discussed. Then, the simulation capabilities of PharmaPy are highlighted, showcasing dynamic simulation of both fully continuous and hybrid processes.

The technical focus of the work as a whole, however, is on optimization techniques for pharmaceutical process design. Many derivative-free optimization frameworks for simulation-optimization were constructed and used, with PharmaPy performing simulations of pharmaceutical processes. The timeline of work originally began with derivative-based methods to solve mixed-integer programs (MIPs) for water network sampling and security, as well as nonlinear programs (NLPs) and some mixed-integer nonlinear programs (MINLPs) for design space and feasibility analysis. Therefore, a method for process design was implemented that combines the ease of implementation of a process simulator (PharmaPy) with the computational performance of derivative-based optimization. Recent developments in Pyomo through the PyNumero package allow callbacks to an input-output or black-box model while using Ipopt as a derivative-based solver through the cyipopt interface. Using this approach, it was found that embedding a PharmaPy simulation as a black box within a derivative-based solver resulted in quicker solve times than traditional derivative-free optimization strategies, and offers a much quicker implementation path than a simultaneous, equation-oriented algebraic definition of the problem.

Uncertainty exists in virtually all process systems. Traditionally, uncertainty is analyzed through sampling approaches such as Monte Carlo simulation, but these approaches quickly become computational obstacles as problem scale increases. In the 1980s, chemical plant design under uncertainty through flexibility analysis became an option for explicitly considering model uncertainty using mathematical programming. Such formulations present computational obstacles of their own, however, as most process models produce challenging MINLPs under the flexibility analysis framework. For pharmaceutical processes specifically, recent initiatives by the FDA have piqued interest in flexibility analysis because of the so-called design space: the region over which critical quality attributes (CQAs) can be guaranteed for a set of interactions between the inputs and process parameters. Since uncertainty is intrinsic to such operations, industry is interested in guaranteeing that CQAs hold with a set confidence level over a given operating region. In this work, the probabilistic design space defined by these confidence levels is presented to demonstrate the computational advantages of a fully model-based flexibility analysis framework over a Monte Carlo sampling approach. The results show that the flexibility analysis framework decreased design space identification time by more than two orders of magnitude.

Given the difficulty of adopting new digital tools for both students and professionals, educational material was developed for PharmaPy and presented as part of a pharmaceutical API process development course at Purdue. Surveyed afterward, many of the students found the framework approachable through the use of Jupyter notebooks and would consider using PharmaPy for pharmaceutical modeling and Python for data analysis in the future.

Through software development and the development of numerical analysis frameworks, digital design of pharmaceutical processes has expanded and become more approachable. The incorporation of rigorous simulations under process uncertainty promotes the use of digital tools in regulatory filings and reduces unnecessary process development costs through model-based design. Examples of these improvements are evident in PharmaPy, a simulation-optimization framework using PharmaPy, and the flexibility analysis tools. These tools yielded computational benefits of one to two orders of magnitude compared with methods used in practice and, in some cases, reduce the modeling time required to determine optimal operating conditions or the design space of a pharmaceutical manufacturing process.
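As a hedged sketch of the kind of formulation flexibility analysis uses (this follows the classic flexibility-test statement from the chemical engineering literature, not necessarily the dissertation's exact model), one asks whether, for every uncertain-parameter realization θ in a set T, the operating variables z can be adjusted so that every constraint g_j defining the critical quality attributes remains satisfied for design d; the design space is then the region where the test passes:

```latex
\chi(d) \;=\; \max_{\theta \in T}\; \min_{z}\; \max_{j \in J}\; g_j(d, z, \theta),
\qquad
\chi(d) \le 0 \;\iff\; d \text{ is feasible for all } \theta \in T
```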
|
7 |
Multidisciplinary Design Under Uncertainty Framework of a Spacecraft and Trajectory for an Interplanetary Mission. Naidu, Siddhesh Ajay, 28 April 2024
Design under uncertainty (DUU) for spacecraft is crucial to ensuring mission success, especially given the criticality of their failure. To obtain a more realistic understanding of space systems, it is beneficial to couple the modeling of the spacecraft and its trajectory holistically as a multidisciplinary analysis (MDA). In this work, an MDA model is developed for an Earth-Mars mission by employing the General Mission Analysis Tool (GMAT) to model the mission trajectory and Rocket Propulsion Analysis (RPA) to design the engines. Using this direct MDA model, the deterministic optimization (DO) of the system is performed first and yields a design that completes the mission in 307 days while requiring 475 kg of fuel. The direct MDA model is also integrated into a Monte Carlo simulation (MCS) to investigate the uncertainty quantification (UQ) of the spacecraft and trajectory system. When considering the combined uncertainty in the launch date over a 20-day window and in the specific impulses, the time of flight ranges from 275 to 330 days and the total fuel consumption ranges from 475 to 950 kg. The spacecraft velocity exhibits deviations ranging from 2 to 4 km/s at any given instance in the Earth inertial frame. The amount of fuel consumed during the trajectory correction maneuver (TCM) ranges from 1 to 250 kg, while during the Mars orbit insertion (MOI) it ranges from 350 to 810 kg. Using the direct MDA model for optimization and uncertainty quantification of the system can be computationally prohibitive for DUU. To address this challenge, the effectiveness of surrogate-based approaches for performing UQ is demonstrated, resulting in significantly lower computational costs. Gaussian process (GP) models trained on data from the MDA model were implemented into the UQ framework and their results were compared to those of the direct MDA method. When considering the combined uncertainty from both sources, the surrogate-based method had a mean error of 1.67% and required only 29% of the computational time. Compared to the direct MDA, the time of flight range matched well, while the TCM and MOI fuel consumption ranges were smaller by 5 kg. These GP models were integrated into the DUU framework to perform reliability-based design optimization (RBDO) feasibly for the spacecraft and trajectory system. For the combined uncertainty, the DO design yielded a poor reliability of 54%, underscoring the necessity of performing RBDO. The DUU framework obtained a design with a significantly improved reliability of 99%, which required an additional 39.19 kg of fuel and also reduced the time of flight by 0.55 days.
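The following is a hedged, self-contained sketch of the surrogate-based UQ workflow described above, not the author's code: the MDA model, input bounds, and response are stand-ins, but the pattern (fit a Gaussian process to a few expensive model runs, then run Monte Carlo on the cheap surrogate) matches the approach.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

rng = np.random.default_rng(seed=0)

def mda_time_of_flight(x):
    """Stand-in for an expensive MDA evaluation (e.g., a GMAT + RPA coupling).
    x = [launch-day offset within a 20-day window, specific impulse in seconds]."""
    day, isp = x
    return 300.0 + 0.8 * day - 0.05 * (isp - 320.0)

# 1. Run the expensive model at a small number of sampled input points.
lower, upper = np.array([0.0, 310.0]), np.array([20.0, 330.0])
X_train = lower + (upper - lower) * rng.random((30, 2))
y_train = np.array([mda_time_of_flight(x) for x in X_train])

# 2. Train a Gaussian process surrogate on those samples.
kernel = ConstantKernel(1.0) * RBF(length_scale=[5.0, 5.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# 3. Monte Carlo on the cheap surrogate instead of the direct MDA model.
X_mc = lower + (upper - lower) * rng.random((100_000, 2))
tof = gp.predict(X_mc)
print(f"time of flight: {tof.min():.1f} to {tof.max():.1f} days "
      f"(mean {tof.mean():.1f} days)")
```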
|
8 |
Methodology for the conceptual design of a robust and opportunistic system-of-systems. Talley, Diana Noonan, 18 November 2008
Systems are becoming more complicated, complex, and interrelated. Designers have recognized the need to develop systems from a holistic perspective and design them as Systems-of-Systems (SoS). The design of an SoS, especially in the conceptual design phase, is generally characterized by significant uncertainty. As a result, it is possible for all three types of uncertainty (aleatory, epistemic, and error) and the associated factors of uncertainty (randomness, sampling, confusion, conflict, inaccuracy, ambiguity, vagueness, coarseness, and simplification) to affect the design process. While there are a number of existing SoS design methods, several gaps have been identified: the ability to model all of the factors of uncertainty at varying levels of knowledge; the ability to consider both the pernicious and propitious aspects of uncertainty; and the ability to determine the value of reducing the uncertainty in the design process.
While there are numerous uncertainty modeling theories, no one theory can effectively model every kind of uncertainty. This research presents a Hybrid Uncertainty Modeling Method (HUMM) that integrates techniques from the following theories: Probability Theory, Evidence Theory, Fuzzy Set Theory, and Info-Gap theory. The HUMM is capable of modeling all of the different factors of uncertainty and can model the uncertainty for multiple levels of knowledge.
In the design process, there are both pernicious and propitious characteristics associated with the uncertainty. Existing design methods typically focus on developing robust designs that are insensitive to the associated uncertainty. These methods do not capitalize on the possibility of maximizing the potential benefit associated with the uncertainty. This research demonstrates how these deficiencies can be overcome by identifying the most robust and opportunistic design.
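As a hedged illustration of how robustness and opportunism can both be given formal expression (this follows standard info-gap definitions, one of the theories integrated into the HUMM, rather than the dissertation's own formulation): for a design d with performance R, requirement r_c, windfall aspiration r_w, and nested uncertainty sets U(α) of growing horizon α, robustness is the largest horizon of uncertainty that still guarantees the requirement, while opportuneness is the smallest horizon at which the windfall becomes possible:

```latex
\hat{\alpha}(d, r_c) = \max\Bigl\{\, \alpha : \min_{u \in \mathcal{U}(\alpha)} R(d, u) \ge r_c \Bigr\}
\qquad
\hat{\beta}(d, r_w) = \min\Bigl\{\, \alpha : \max_{u \in \mathcal{U}(\alpha)} R(d, u) \ge r_w \Bigr\}
```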
In a design process it is possible that the most robust and opportunistic design will not be selected from the set of potential design alternatives due to the related uncertainty. This research presents a process called the Value of Reducing Uncertainty Method (VRUM) that can determine the value associated with reducing the uncertainty in the design problem before a final decision is made by utilizing two concepts: the Expected Value of Reducing Uncertainty (EVRU) and the Expected Cost to Reducing Uncertainty (ECRU).
|
9 |
Robust design methodology for common core gas turbine engines. Sands, Jonathan Stephen, 8 June 2015
A gas turbine engine design process was developed for the design of a common core engine family. The process considers initial and projected variant engine applications, likely technology maturation, and various sources of uncertainty when making initial core design considerations. A physics based modeling and simulation environment was developed to enforce geometric core commonality between the core defining design engine and a common core variant engine. The environment also allows for upgrade options and technology to be infused into the variant engine design. The relationships established in the model enable commonality to be implicitly enforced when performing simultaneous design space explorations of the common core design and all corresponding variant engine designs. A robust design simulation process was also developed, enabling probabilistic surrogate model representations of the common core engine family design space to be produced. The probabilistic models provide confidence interval performance estimates with a single function call for an inputted set of core and variant design settings and the uncertainty distribution shape parameter settings representative of an uncertainty scenario of interest. The unique form of the probabilistic surrogate models enables large numbers of common core engine family applications to be considered simultaneously, each being simulated under a unique uncertainty scenario. Implications of core design options can be instantaneously predicted for all engine applications considered, allowing for favorable common core design regions to be identified in a highly efficient manner.
|
10 |
Shape optimization of lightweight structures under blast loading. Israel, Joshua James
Indiana University-Purdue University Indianapolis (IUPUI) / Structural optimization of vehicle components for blast mitigation seeks to counteract the damaging effects of an impulsive threat on occupants and critical components. The strong and urgent need for improved protection from blast events has made blast mitigating component design an active research subject. Standard up-armoring of ground vehicles can significantly increase the mass of the vehicle. Without concurrent modifications to the power train, suspension, braking and steering components, the up-armored vehicles suffer from degraded stability and mobility. For these reasons, there is a critical need for effective methods to generate lightweight components for blast mitigation.
The overall objective of this research is to make advances in structural design methods for the optimization of lightweight blast-mitigating systems. This thesis investigates the automated design of isotropic plates to mitigate the effects of blast loading, addressing the design of blast-protective structures from a design optimization perspective. The general design problem is stated as finding the optimum shape of a protective shell of minimum mass satisfying deformation and envelope constraints.
This research was conducted in terms of three primary research projects. The first project investigated the design of lightweight structures under deterministic loading conditions, subject to the same objective function and constraints, in order to compare feasible design methodologies by expanding the problem dimension to reach the limits of performance. The second project investigated the application of recently developed uncertainty quantification methods, the univariate dimensional reduction method and the performance moment integration method, to structures under stochastic loading conditions. The third project applied these uncertainty quantification methods to problems of design optimization under uncertainty, in order to develop a methodology for the generation of lightweight, reliable structures.
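As a hedged illustration of the first uncertainty quantification method named above (the standard univariate dimension-reduction approximation, not reproduced from the thesis), the response Y of n random inputs is approximated by a sum of one-dimensional functions along each input axis, with the remaining inputs fixed at their means μ_i, so that statistical moments of Y reduce to one-dimensional integrals that are cheap to evaluate:

```latex
Y(X_1, \dots, X_n) \;\approx\;
\sum_{i=1}^{n} Y(\mu_1, \dots, \mu_{i-1}, X_i, \mu_{i+1}, \dots, \mu_n)
\;-\; (n - 1)\, Y(\mu_1, \dots, \mu_n)
```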
This research has resulted in the construction of a computational framework, incorporating uncertainty quantification methods and various optimization techniques, which can be used for the generation of lightweight structures for blast mitigation under uncertainty. Applied to practical structural design problems, the results demonstrate that the methodologies provide a practical tool to aid the design engineer in generating design concepts for blast-mitigating structures. These methods can be used to advance research into the generation of reliable structures under uncertain loading conditions inherent to blast events.
|