1

Mission-Integrated Synthesis/Design Optimization of Aerospace Subsystems under Transient Conditions

Weise, Peter Carl 10 October 2012 (has links)
The equations governing the thermodynamic behavior of a military aircraft have been implemented by the Air Force Research Lab (AFRL) and other Integrated Vehicle Energy Technology Demonstration (INVENT) contributors into a cohesive, adaptable, dynamic aircraft simulation program in Mathworks' Simulink®. The resulting model, known as the "Tip-to-tail" model, meets the design specifications set forth by the INVENT program. The system consists of six intimately linked subsystems: a propulsion subsystem (PS), air vehicle subsystem (AVS), robust electrical power subsystem (REPS), high power electric actuation subsystem (HPEAS), advanced power and thermal management subsystem (APTMS), and fuel thermal management subsystem (FTMS). The model's governing equations are augmented with experimental data and supported by defined physical parameters. In order to address the problems associated with the additional power and thermal loads in more electric aircraft (MEA), this research uses exergy analysis and mission-integrated synthesis/design optimization to investigate the potential for improvement in tip-to-tail design and performance. Additionally, this thesis describes the development and integration of higher fidelity transient heat exchanger models for use in the tip-to-tail, and presents the change in performance due to the integration of these new heat exchanger models. Finally, the thesis discusses the results obtained by performing mission-integrated synthesis/design optimization on the tip-to-tail using heat exchanger design parameters as decision variables. These results show that the performance of the tip-to-tail's thermal management subsystems improves significantly due to the integration of the heat exchanger models, and that vehicle performance improves due to the mission-integrated optimization. / Master of Science
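
A minimal sketch of the kind of mission-integrated optimization loop described above, assuming a placeholder transient mission simulation and made-up heat-exchanger parameters (core length, fin pitch, frontal area) rather than the INVENT tip-to-tail model; the objective, thermal limit, and all numbers are illustrative only (Python):

    # Toy mission-integrated loop: optimize heat-exchanger design parameters
    # against a stand-in transient mission simulation. simulate_mission() and
    # its parameters are illustrative placeholders, not the tip-to-tail model.
    from scipy.optimize import differential_evolution

    def simulate_mission(hx_params):
        """Placeholder mission simulation; returns total fuel burned (kg)
        and peak fuel temperature (K) for the given heat-exchanger design."""
        length, pitch, area = hx_params
        fuel_burn = 4200.0 + 80.0 * area + 30.0 * length / (pitch * 1000.0)
        peak_fuel_temp = 345.0 - 60.0 * area - 20.0 * length
        return fuel_burn, peak_fuel_temp

    def objective(hx_params):
        fuel_burn, peak_temp = simulate_mission(hx_params)
        penalty = 1e4 * max(0.0, peak_temp - 330.0)   # thermal limit as a penalty
        return fuel_burn + penalty

    bounds = [(0.1, 0.6), (0.001, 0.01), (0.05, 0.5)]  # length (m), fin pitch (m), area (m^2)
    result = differential_evolution(objective, bounds, seed=0, maxiter=50)
    print("best heat-exchanger parameters:", result.x, "objective:", result.fun)
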
2

On the Pareto-Following Variation Operator for fast converging Multiobjective Evolutionary Algorithms

Talukder, A. K. M. K. A. January 2008 (has links)
The focus of this research is to provide an efficient approach to deal with computationally expensive Multiobjective Optimization Problems (MOPs). Typically, approximation or surrogate based techniques are adopted to reduce the computational cost. In such cases, the original expensive objective function is replaced by a cheaper mathematical model that mimics the input-output (i.e. design variable to objective value) relationship. However, it is difficult to model an exact substitute of the targeted objective function. Furthermore, if this kind of approach is used in an evolutionary search, the number of function evaluations is not actually reduced (each original function evaluation is simply replaced by a surrogate/approximate function evaluation). Moreover, if a large number of individuals are considered, the surrogate model fails to offer a smaller computational cost. / To tackle this problem, we have reformulated the concept of surrogate modeling in a different way, one more suitable for the Multiobjective Evolutionary Algorithm (MOEA) paradigm. In our approach, we do not approximate the objective function; rather, we model the input-output behavior of the underlying MOEA itself. The model attempts to identify the search path (in both design-variable and objective spaces), and from this trajectory the model is guaranteed to generate non-dominated solutions (with respect to the current solutions, especially during the initial iterations of the underlying MOEA) for the next iterations of the MOEA. Therefore, the MOEA can avoid re-evaluating dominated solutions and thus save a large amount of computation otherwise spent on expensive function evaluations. We have designed our approximation model as a variation operator that follows the trajectory of the fronts and can be "plugged in" to any MOEA in which non-domination-based selection is used. Hence it is termed the "Pareto-Following Variation Operator (PFVO)". This approach also provides the added advantage that we can still use the original objective function, so the search procedure remains robust and suitable, especially for dynamic problems. / We have integrated the model into three baseline MOEAs: the "Non-dominated Sorting Genetic Algorithm II (NSGA-II)", the "Strength Pareto Evolutionary Algorithm II (SPEA-II)", and the recently proposed "Regularity Model Based Estimation of Distribution Algorithm (RM-MEDA)". We have also conducted an exhaustive simulation study using several benchmark MOPs. Detailed performance and statistical analysis reveals promising results. As an extension, we have implemented our idea for dynamic MOPs. We have also integrated PFVO into a diffusion-based/cellular MOEA in a distributed/Grid environment. Most experimental results and analysis reveal that PFVO can be used as a performance enhancement tool for any kind of MOEA.
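
A simplified sketch of the trajectory-following idea, assuming the non-dominated sets of two consecutive generations can be matched by index; the linear extrapolation below is only an illustration of the concept, not the regression model used in the thesis (Python):

    # Toy Pareto-following step: estimate how the non-dominated set moved
    # between two consecutive generations and extrapolate that motion to
    # propose new candidates, so dominated points need not be re-evaluated.
    import numpy as np

    def pareto_following_offspring(front_prev, front_curr, step=1.0, rng=None):
        """front_prev, front_curr: (n, d) design-variable arrays of the
        non-dominated sets at two consecutive generations (matched by index)."""
        if rng is None:
            rng = np.random.default_rng()
        motion = front_curr - front_prev            # estimated search trajectory
        mean_step = motion.mean(axis=0)             # average direction of the front
        jitter = 0.05 * rng.standard_normal(front_curr.shape)
        return front_curr + step * mean_step + jitter

    # Usage inside any non-domination-based MOEA: every few generations, replace
    # or augment ordinary crossover/mutation with the extrapolated candidates.
    rng = np.random.default_rng(0)
    prev = rng.random((20, 5))
    curr = prev + 0.02                              # stand-in for one generation of progress
    children = pareto_following_offspring(prev, curr, rng=rng)
    print(children.shape)                           # (20, 5)
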
3

Non-Deterministic Metamodeling for Multidisciplinary Design Optimization of Aircraft Systems Under Uncertainty

Clark, Daniel L., Jr. 18 December 2019 (has links)
No description available.
4

A data-driven framework to support resilient and sustainable early design

Zaker Esteghamati, Mohsen 05 August 2021 (has links)
Early design is the most critical stage for improving the resiliency and sustainability of buildings. An unaided early design follows the designer's accustomed domain of knowledge and cognitive biases. Given the inherent limitations of human decision-making, such a design process will only explore a small set of alternatives using limited criteria and, most likely, miss high-performing alternatives. Performance-based engineering (PBE) is a probabilistic approach to quantify building performance against natural hazards in terms of decision metrics such as repair cost and functionality loss. Therefore, PBE can remarkably improve early design by informing the designer regarding the possible consequences of different decisions. Incorporating PBE in early design is obstructed by several challenges, such as the time- and effort-intensiveness of rigorous PBE assessments, the need for a specific skillset that might not be available, and the accrual of aleatoric (associated with the innate randomness of physical system properties and surrounding environmental conditions) and epistemic (associated with an incomplete state of knowledge) uncertainties. In addition, a successful early design requires exploring a large number of alternatives, which, when compounded by PBE assessments, will significantly exhaust computational resources and pressure the project timeline. This dissertation proposes a framework to integrate prior knowledge and PBE assessments in early design. The primary workflow in the proposed framework develops a performance inventory to train statistical surrogate models using supervised learning algorithms. This performance inventory comprises PBE assessments consistent with building taxonomy and site, and is supported by a knowledge-based module. The knowledge-based module organizes prior published PBE assessments as a relational database to supplement the performance inventory and aid early design exploration through knowledge-based surrogate models. Lastly, the developed knowledge-based and data-driven surrogate models are implemented in a sequential design exploration scheme to estimate the performance range for a given topology and building system. The proposed framework is then applied to mid-rise concrete office buildings in Charleston, South Carolina, where seismic vulnerability and environmental performance are linked to topology and design parameters. / Doctor of Philosophy / Recent advances in structural engineering aspire to achieve broader societal objectives than safety alone. Two main current objectives are resiliency (i.e., the built environment's ability to rapidly and equitably recover after an external shock, among other definitions) and sustainability (i.e., the ability to meet current needs without preventing future generations from meeting theirs, among other definitions). Therefore, holistic design approaches are needed that can include and explicitly evaluate these objectives at different steps, particularly the earlier stages. The importance of earlier stages stems from the greater freedom to make critical decisions – such as material and building system selection – without incurring higher costs and effort for the designer. Performance-based engineering (PBE) is a quantitative approach to calculating the impact of natural hazards on the built environment. The calculated impacts from PBE can then be communicated through a more easily understood language such as monetary values. However, several challenges must first be addressed to apply PBE in early design.
First, PBE assessments are time- and effort-intensive and require expertise that might not be available to the designer. Second, a typical early design exploration evaluates many alternatives, significantly increasing the already high computational and time cost. Third, PBE requires detailed design and building information that is not available at the preliminary stages. This lack of knowledge is coupled with additional uncertainties due to the random nature of natural hazards and building system characteristics (e.g., material strength or other mechanical properties). This dissertation proposes a framework to incorporate PBE in early design, and tests it for concrete mid-rise offices in Charleston, South Carolina. The centerpiece of this framework is to use data-driven modeling to learn directly from assessments. The data-driven modeling treats PBE as a pre-configured data inventory and develops statistical surrogate models (i.e., simplified mathematical models). These models can then relate early design parameters to building seismic and environmental performance. The inventory is also supported by prior knowledge, structured as a database of published literature on PBE assessments. Lastly, the knowledge-based and data-driven models are applied in a specific order to narrow the performance range for a given building layout and system.
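
A minimal sketch of the data-driven surrogate step, assuming a synthetic performance inventory with placeholder columns (number of stories, bay width, normalized base shear) and a generic regressor in place of the framework's trained models (Python):

    # Toy surrogate trained on a PBE performance inventory: early design
    # parameters are mapped to a decision metric such as expected repair cost.
    # All column names, values, and the model choice are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([
        rng.integers(4, 12, n),          # number of stories
        rng.uniform(6.0, 9.0, n),        # bay width (m)
        rng.uniform(0.5, 1.5, n),        # normalized design base shear
    ])
    y = 1e5 * (0.3 * X[:, 0] + 0.5 / X[:, 2]) + rng.normal(0, 1e4, n)  # toy repair cost

    surrogate = RandomForestRegressor(n_estimators=300, random_state=0)
    print("cross-validated R^2:", cross_val_score(surrogate, X, y, cv=5).mean())
    surrogate.fit(X, y)

    # Early design exploration: sweep one parameter while holding the others fixed.
    candidates = np.column_stack([np.arange(4, 12), np.full(8, 7.5), np.full(8, 1.0)])
    print(surrogate.predict(candidates))
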
5

Efficient Incorporation of Fatigue Damage Constraints in Wind Turbine Blade Optimization

Ingersoll, Bryce Taylor 01 August 2018 (has links)
Improving the wind turbine blade design has a significant effect on the efficiency of the wind turbine. This is a challenging multi-disciplinary optimization problem. During the blade design process, the aerodynamic shape, sizing of the structural members, and material composition must all be determined and optimized. Some previous blade design methods incorporate the wind turbine's static response with an added safety factor to account for neglected dynamic effects. Others incorporate the dynamic response, but are generally limited to a few design cases. By not fully incorporating the dynamic response of the wind turbine, the final blade design is either too conservative, overemphasizing the dynamic effects, or infeasible, failing to adequately account for them. In this work, we propose two methods which efficiently incorporate the dynamic response into the optimization routine. The first method involves iteratively calculating damage-equivalent fatigue loads that are held fixed during the optimization process. We also demonstrate the training and use of a surrogate model to efficiently estimate the fatigue damage and extreme events in the design process. This surrogate model has been generalized to turbines of different ratings and can predict the fatigue damage of a wind turbine with less than 5% error. In general, these alternative, more efficient methods have been shown to be an adequate replacement for the more computationally expensive approach of calculating the dynamic response of the turbine within the optimization routine.
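
A hedged sketch of the surrogate-constrained approach, assuming synthetic training data, made-up blade design variables (length, chord), and an illustrative damage-equivalent-load limit; a Gaussian process stands in for the thesis's surrogate (Python):

    # Toy surrogate-constrained blade optimization: a Gaussian process trained
    # on synthetic data predicts damage-equivalent loads (DELs) from two assumed
    # design variables; the DEL limit of 90 and all numbers are illustrative.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(1)
    X_train = rng.uniform([40.0, 2.0], [80.0, 5.0], size=(60, 2))   # length (m), chord (m)
    # Toy DELs: longer and slimmer blades accumulate more fatigue damage.
    del_train = 1.0 * X_train[:, 0] + 120.0 / X_train[:, 1] + rng.normal(0, 1.0, 60)

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), normalize_y=True)
    gp.fit(X_train, del_train)

    def mass(x):                      # toy mass objective to minimize
        return 0.5 * x[0] * x[1] ** 2

    def fatigue_margin(x):            # must stay >= 0: predicted DEL below the limit
        return 90.0 - gp.predict(np.atleast_2d(x))[0]

    res = minimize(mass, x0=[60.0, 3.5], bounds=[(40.0, 80.0), (2.0, 5.0)],
                   constraints=[{"type": "ineq", "fun": fatigue_margin}])
    print("design:", res.x, "predicted DEL:", gp.predict(np.atleast_2d(res.x))[0])
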
6

Real-Time Visualization of Finite Element Models Using Surrogate Modeling Methods

Heap, Ryan C. 01 August 2013 (has links)
Finite element analysis (FEA) software is used to obtain linear and non-linear solutions to one, two, and three-dimensional (3-D) geometric problems that will see a particular load and constraint case when put into service. Parametric FEA models are commonly used in iterative design processes in order to obtain an optimum model given a set of loads, constraints, objectives, and design parameters to vary. In some instances it is desirable for a designer to obtain some intuition about how changes in design parameters affect the FEA solution of interest before simply sending the model through the optimization loop. This could be accomplished by running the FEA on the parametric model for a set of part family members, but this can be very time-consuming and only gives snapshots of the model's real behavior. The purpose of this thesis is to investigate a method of visualizing the FEA solution of the parametric model as design parameters are changed in real time, by approximating the FEA solution using surrogate modeling methods. The tools this research utilizes are parametric FEA modeling, surrogate modeling methods, and visualization methods. A parametric FEA model can be developed that includes mesh morphing algorithms that allow the mesh to change parametrically along with the model geometry. This allows the surrogate model assigned to each individual node to use the nodal solutions of multiple finite element analyses as regression points to approximate the FEA solution. The surrogate models can then be mapped to their respective geometric locations in real time. Solution contours display the results of the FEA calculations and are updated in real time as the parameters of the design model change.
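
A minimal sketch of the per-node surrogate idea, assuming synthetic nodal results from a handful of parametric FEA runs and a quadratic response surface per node; the "real-time" step is simply a batched prediction over all nodes (Python):

    # Toy per-node surrogate: nodal stresses from a few parametric FEA runs
    # (synthetic here) are regressed against the design parameters, one
    # quadratic response surface per node, so a contour can be re-evaluated
    # interactively as the parameters change.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import Ridge
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(2)
    n_runs, n_nodes = 12, 500
    params = rng.uniform([1.0, 10.0], [5.0, 50.0], size=(n_runs, 2))   # e.g. thickness, load
    nodal_stress = (3.0 * params[:, [0]] + 0.2 * params[:, [1]]
                    + rng.normal(0, 0.1, (n_runs, n_nodes)))           # (runs, nodes)

    # Multi-output Ridge fits all per-node response surfaces in one call.
    model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3))
    model.fit(params, nodal_stress)

    # "Real-time" update: evaluate every node's surrogate for the current sliders.
    current_design = np.array([[2.5, 30.0]])
    stress_field = model.predict(current_design)[0]    # length n_nodes, ready to contour
    print(stress_field[:5])
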
7

Calibration of Flush Air Data Sensing Systems Using Surrogate Modeling Techniques

January 2011 (has links)
In this work the problem of calibrating Flush Air Data Sensing (FADS) systems has been addressed. The inverse problem of extracting freestream wind speed and angle of attack from pressure measurements has been solved. The aim of this work was to develop machine learning and statistical tools to optimize the design and calibration of FADS systems. Experimental and Computational Fluid Dynamics (EFD and CFD) solve the forward problem of determining the pressure distribution given the wind velocity profile and bluff body geometry. In this work, three ways are presented in which machine learning techniques can improve the calibration of FADS systems. First, a scattered data approximation scheme called Sequential Function Approximation (SFA), which successfully solved the current inverse problem, was developed. The proposed scheme is a greedy and self-adaptive technique that constructs reliable and robust estimates without any user interaction. Wind speed and direction prediction algorithms were developed for two FADS problems: one where pressure sensors are installed on a surface vessel and another where sensors are installed on the Runway Assisted Landing Site (RALS) control tower. Second, a Tikhonov regularization based data-model fusion technique with SFA was developed to fuse low fidelity CFD solutions with noisy and sparse wind tunnel data. The purpose of this data-model fusion approach was to obtain high fidelity, smooth and noiseless flow field solutions by using only a few discrete experimental measurements and a low fidelity numerical solution. This physics based regularization technique gave better flow field solutions than smoothness based regularization when the wind tunnel data were sparse and incomplete. Third, a sequential design strategy was developed with SFA using Active Learning techniques from machine learning and Optimal Design of Experiments from statistics for regression and classification problems. Uncertainty Sampling was used with SFA to demonstrate the effectiveness of active learning versus passive learning on a cavity flow classification problem. A sequential G-optimal design procedure was also developed with SFA for regression problems. The effectiveness of this approach was demonstrated on a simulated problem and on the above-mentioned FADS problem.
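
A simplified sketch of the inverse calibration problem, assuming a toy pressure model over a few hypothetical ports and a generic neural-network regressor in place of SFA (Python):

    # Toy FADS inverse calibration: learn a map from a vector of surface
    # pressure readings to freestream speed and angle of attack. The pressure
    # model, port layout, and the MLP regressor are illustrative stand-ins.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    n, n_ports = 1000, 8
    speed = rng.uniform(5.0, 40.0, n)                       # m/s
    alpha = rng.uniform(-15.0, 15.0, n)                     # degrees
    port_angles = np.linspace(-60.0, 60.0, n_ports)
    # Toy forward model: dynamic pressure scaled by the flow angle at each port.
    q = 0.5 * 1.225 * speed[:, None] ** 2
    pressures = q * np.cos(np.radians(port_angles[None, :] - alpha[:, None])) ** 2
    pressures += rng.normal(0.0, 5.0, pressures.shape)      # sensor noise

    model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
    model.fit(pressures, np.column_stack([speed, alpha]))   # solve the inverse problem

    print("predicted [speed, alpha]:", model.predict(pressures[:3]))
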
8

3D Model of Fuel Tank for System Simulation : A methodology for combining CAD models with simulation tools

Wikström, Jonas January 2011 (has links)
Engineering aircraft systems is a complex task. Models and computer simulations are therefore needed to test the functions and behaviors of systems that do not yet exist, reduce testing time and cost, reduce the risk involved, and detect problems early, which reduces the number of implementation errors. At the Vehicle Simulation and Thermal Analysis section at Saab Aeronautics in Linköping, every basic aircraft system is designed and simulated, for example the fuel system. Currently, 2-dimensional rectangular blocks are used in the simulation model to represent the fuel tanks. However, this is too simplistic to allow a more detailed analysis. The model needs to be extended with a more complex description of the tank geometry in order to become more accurate. This report explains the different steps in the developed methodology for combining 3-dimensional geometry models of any fuel tank created in CATIA with dynamic simulation of the fuel system in Dymola. The new 3-dimensional representation of the tank in Dymola should be able to calculate the fuel surface location during simulation of a maneuvering aircraft. The first step of the methodology is to create a solid model of the fuel contents in the tank. Then the area of validity for the model is specified, and all possible orientations of the fuel acceleration vector within the area of validity are generated. All these orientations are used in the automated volume analysis in CATIA. For each orientation, CATIA splits the fuel body into a specified number of volumes and records the volume, the location of the fuel surface, and the location of the center of gravity. The recorded data are then approximated using radial basis functions implemented in MATLAB, where a surrogate model is created and subsequently implemented in Dymola. In this way, any fuel surface location and center of gravity can be calculated efficiently based on the orientation of the fuel acceleration vector and the amount of fuel. The new 3-dimensional tank model is simulated in Dymola and the results are compared with measurements from the model in CATIA and with the results from the simulation of the old 2-dimensional tank model. The results show that the 3-dimensional tank gives a better approximation of reality and that it is a big improvement compared with the 2-dimensional tank model. The downside is that it takes approximately 24 hours to develop this model. / Developing a new aircraft system is a very complex task. Models and simulations are therefore used to test systems that do not yet exist, shorten development time and cost, limit the risks, and detect problems early, thereby reducing the number of implemented errors. At the Vehicle Simulation and Thermal Analysis section at Saab Aeronautics in Linköping, every basic aircraft system is designed and simulated; one of these systems is the fuel system. Currently, 2-dimensional rectangular blocks are used in the simulation model to represent the fuel tanks, which is a very coarse approximation. To enable more detailed analyses, the models need to be extended with a better geometric description of the fuel tanks. This report walks through the steps of the developed methodology for combining 3-dimensional tank models created in CATIA with dynamic simulation of the fuel system in Dymola. The new 3-dimensional representation of a tank in Dymola should be able to compute the location of the fuel surface during simulation of a maneuvering aircraft.
The first step of the methodology is to create a solid model of the fuel contained in the tank. The model's area of validity is then specified, and all possible orientations of the acceleration vector acting on the fuel are generated; these are then used in the automated volume analysis in CATIA. For each orientation, CATIA splits the fuel model into a specified number of parts and records the volume, the location of the fuel surface, and the position of the center of gravity for each part. These data are approximated with radial basis functions implemented in MATLAB, yielding a surrogate model that is then implemented in Dymola. In this way, the locations of the fuel surface and the center of gravity can be computed efficiently, based on the orientation of the fuel acceleration vector and the amount of fuel in the tank. The new 3-dimensional tank model is simulated in Dymola and the results are compared with measurements made in CATIA and with the results from the old simulation model. The results show that the 3-dimensional tank model gives a much better representation of reality and that it is a large improvement over the 2-dimensional representation. The drawback is that it takes approximately 24 hours to produce this 3-dimensional representation.
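
A minimal sketch of the surrogate step, with SciPy's RBFInterpolator standing in for the MATLAB radial-basis-function implementation and synthetic placeholder data in place of the CATIA volume analysis (Python):

    # Toy version of the surrogate step: interpolate fuel-surface height and
    # center-of-gravity offsets from tabulated volume analyses, indexed by the
    # orientation of the fuel acceleration vector and the fill fraction.
    # The table below is synthetic, not CATIA output.
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(4)
    n = 400
    inputs = np.column_stack([
        rng.uniform(-30.0, 30.0, n),   # pitch of the acceleration vector (deg)
        rng.uniform(-30.0, 30.0, n),   # roll of the acceleration vector (deg)
        rng.uniform(0.05, 0.95, n),    # fill fraction
    ])
    outputs = np.column_stack([
        0.8 * inputs[:, 2] + 0.002 * inputs[:, 0],   # fuel surface height (m)
        0.01 * inputs[:, 0],                         # CG x-offset (m)
        0.01 * inputs[:, 1],                         # CG y-offset (m)
    ])

    surrogate = RBFInterpolator(inputs, outputs, kernel="thin_plate_spline")

    # During flight simulation: query with the current attitude and fuel state.
    print(surrogate(np.array([[5.0, -2.0, 0.6]])))   # [[surface height, CG x, CG y]]
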
9

Active Machine Learning for Computational Design and Analysis under Uncertainties

Lacaze, Sylvain January 2015 (has links)
Computational design has become a predominant element of various engineering tasks. However, the ever-increasing complexity of numerical models creates the need for efficient methodologies. Specifically, computational design under uncertainties remains sparsely used in engineering settings due to its computational cost. This dissertation proposes a coherent framework for various branches of computational design under uncertainties, including model update, reliability assessment, and reliability-based design optimization. Through the use of machine learning techniques, computationally inexpensive approximations of the constraints, limit states, and objective functions are constructed. Specifically, a novel adaptive sampling strategy, referred to as generalized max-min, allows any approximation to be refined only in relevant regions. This technique presents various computational advantages such as ease of parallelization and applicability to any metamodel. Three approaches tailored to computational design under uncertainties are derived from this approximation technique. An algorithm for reliability assessment is proposed and its efficiency is demonstrated for different probabilistic settings, including dependent variables modeled using copulas. Additionally, the notion of a fidelity map is introduced for model update settings with a large number of dependent responses to be matched. Finally, a new reliability-based design optimization method with local refinement has been developed. A derivation of sampling-based probability-of-failure derivatives is also provided, along with a discussion of numerical estimates. This derivation brings additional flexibility to the field of computational design. The knowledge acquired and techniques developed during this Ph.D. have been synthesized in an object-oriented MATLAB toolbox, whose help documentation and ergonomics have been designed to be accessible to a large audience.
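
A simplified sketch of a max-min style adaptive sampling loop, assuming a toy limit-state function and a stock Gaussian-process metamodel; the generalized max-min criterion in the dissertation differs in its details (Python):

    # Toy max-min refinement loop: among candidate points that the current
    # metamodel places near the limit state, add the one farthest from all
    # existing samples.
    import numpy as np
    from scipy.spatial.distance import cdist
    from sklearn.gaussian_process import GaussianProcessRegressor

    def limit_state(x):                        # placeholder expensive response
        return x[:, 0] ** 2 + x[:, 1] - 1.5

    rng = np.random.default_rng(5)
    X = rng.uniform(-2.0, 2.0, (10, 2))        # initial design of experiments
    y = limit_state(X)

    for _ in range(15):                        # adaptive refinement loop
        gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
        cand = rng.uniform(-2.0, 2.0, (2000, 2))
        near_limit = cand[np.abs(gp.predict(cand)) < 0.1]   # region of interest
        if len(near_limit) == 0:
            near_limit = cand                  # fall back to the whole domain
        dists = cdist(near_limit, X).min(axis=1)   # distance to existing samples
        x_new = near_limit[np.argmax(dists)]       # max-min selection
        X = np.vstack([X, x_new])
        y = np.append(y, limit_state(x_new[None, :]))

    print("samples after refinement:", len(X))
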
10

A representation method for large and complex engineering design datasets with sequential outputs

Iwata, Curtis 13 January 2014 (has links)
This research addresses the problem of creating surrogate models of high-level operations and sustainment (O&S) simulations with time-sequential (TS) outputs. O&S is the continuous process of using and maintaining assets such as a fleet of aircraft, and the infrastructure to support this process is the O&S system. To track the performance of the O&S system, metrics such as operational availability are recorded and reported as time histories. Modeling and simulation (M&S) is often used as a preliminary tool to study the impact of implementing changes to O&S systems, such as investing in new technologies or changing inventory policies. A visual analytics (VA) interface is useful for navigating the data from the M&S process so that these options can be compared, and surrogate modeling enables some key features of the VA interface such as interpolation and interactivity. Fitting a surrogate model to TS data is difficult because of its size and nonlinear behavior. The Surrogate Modeling and Regression of Time Sequences (SMARTS) methodology was proposed to address this problem. An intermediate domain Z was calculated from the simulation output data such that a point in Z corresponds to a unique TS shape or pattern. One regression was then fit to capture the entire range of possible TS shapes using Z as the inputs, and a separate regression was fit to transform the simulation inputs into Z. The method was tested on output data from an O&S simulation model and compared against other regression methods for statistical accuracy and visual consistency. The proposed methodology was shown to be conditionally better than the other methodologies.
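
A minimal sketch of the two-stage idea, with PCA standing in for the thesis's construction of the intermediate domain Z and synthetic availability histories as placeholders (Python):

    # Toy two-stage surrogate for time-sequential outputs: compress the output
    # histories into a low-dimensional shape space Z, regress the simulation
    # inputs onto Z, then decode Z back into a full time series.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(6)
    n_runs, n_steps = 150, 120
    inputs = rng.uniform(0.0, 1.0, (n_runs, 4))              # O&S policy settings
    t = np.linspace(0.0, 1.0, n_steps)
    series = (0.7 + 0.2 * inputs[:, [0]] * np.sin(6.0 * t)[None, :]
              - 0.1 * inputs[:, [1]] * t[None, :])           # toy availability histories

    pca = PCA(n_components=3)
    Z = pca.fit_transform(series)            # each run becomes a point in Z
    to_Z = Ridge().fit(inputs, Z)            # inputs -> shape space

    new_policy = np.array([[0.5, 0.2, 0.8, 0.1]])
    predicted_series = pca.inverse_transform(to_Z.predict(new_policy))
    print(predicted_series.shape)            # (1, n_steps) time history for the new policy
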
