  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Mechanical and Electromagnetic Optimization of Structurally Embedded Waveguide Antennas

Albertson, Nicholas James 29 January 2018 (has links)
Use of Slotted Waveguide Antenna Stiffened Structures (SWASS) in future commercial and military aircraft calls for the development of an airworthiness certification procedure. The first step of this procedure is to provide a computationally low-cost method for modeling waveguide antenna arrays on the scale of an aircraft skin panel using a multi-fidelity model. Weather detection radar for the Northrop Grumman X-47 unmanned air system is considered as a case study. COMSOL Multiphysics is used for creating high-fidelity waveguide models that are imported into the MATLAB Phased Array Toolbox for large-scale array calculations using a superposition method. Verification test cases show that this method is viable for relatively accurate modeling of large SWASS arrays with low computational effort. Additionally, realistic material properties for carbon fiber reinforced plastic (CFRP) are used to create a more accurate model. Optimization is performed on a 12-slot CFRP waveguide to determine the waveguide dimensions for the maximum far-field gain and separately for the maximum critical buckling load. Using the two separate optima as utopia points, a multi-objective optimization for the peak far-field gain and critical buckling load is performed, to obtain a balance between EM performance and structural strength. This optimized waveguide is then used to create a SWASS array of approximately the same size as an aircraft wing panel using the multi-fidelity modeling method that is proposed. This model is compared to a typical conventional weather radar system, and found to be well above the minimum mission requirements. / Master of Science
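The superposition step described in this abstract, combining per-element far-field contributions into a full array response, can be illustrated with a generic uniform linear array factor calculation. This is an illustrative sketch only: the slot count, half-wavelength spacing, and isotropic elements are assumptions, not the thesis's COMSOL/MATLAB models.

```python
import numpy as np

def array_factor(n_elements, spacing_wl, theta_deg, steer_deg=0.0):
    """Normalized far-field array factor of a uniform linear array.

    n_elements : number of radiating slots/elements
    spacing_wl : element spacing in wavelengths
    theta_deg  : observation angles in degrees from broadside
    steer_deg  : electronic steering angle in degrees
    """
    theta = np.radians(np.asarray(theta_deg, dtype=float))
    steer = np.radians(steer_deg)
    k_d = 2.0 * np.pi * spacing_wl        # wavenumber times element spacing
    n = np.arange(n_elements)[:, None]    # element index, one row per element
    # Superpose the per-element phase contributions at every angle
    phase = n * k_d * (np.sin(theta) - np.sin(steer))
    af = np.abs(np.exp(1j * phase).sum(axis=0)) / n_elements
    return af

angles = np.linspace(-90, 90, 361)
af = array_factor(12, 0.5, angles)        # hypothetical 12-slot waveguide
peak_angle = angles[np.argmax(af)]        # broadside beam for zero steering
```

The same superposition idea scales to panel-sized arrays: each waveguide's high-fidelity element pattern replaces the isotropic element assumed here.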
2

Utilisation de simulateurs multi-fidélité pour les études d'incertitudes dans les codes de calcul / Assessment of uncertainty in computer experiments when working with multifidelity simulators.

Zertuche, Federico 08 October 2015 (has links)
Computer simulations are a very important tool used by applied mathematicians and engineers to model the behavior of a system. They have become increasingly precise but also more complicated, so much so that they are very slow to run and thus difficult to sample; as a result, many aspects of these simulations are not well understood. For example, in many cases they depend on parameters whose value is unknown.
A metamodel is a reconstruction of the simulation. It requires much less time to produce an output close to what the simulation would give. By using it, some aspects of the original simulation can be studied. It is built from very few samples, and its purpose is to replace the simulation.
This thesis is concerned with the construction of a metamodel in a particular context called multi-fidelity. In multi-fidelity, the metamodel is constructed using data from the target simulation along with related approximate samples. These approximate samples can come from a degraded version of the simulation, from an old version that has been studied extensively, or from another simulation in which part of the description is simplified.
By learning the difference between the samples, it is possible to incorporate the information carried by the approximate data, which may lead to an enhanced metamodel. Two approaches that do this are studied in this manuscript: one based on Gaussian process modeling and another based on a coarse-to-fine wavelet decomposition. The first method shows how, by estimating the relationship between two data sets, it is possible to incorporate data that would otherwise be useless. In the second method, an adaptive procedure for systematically adding data to enhance the metamodel is proposed.
The object of this work is to better our comprehension of how to incorporate approximate data to enhance a metamodel. Working with a multi-fidelity metamodel helps us understand in detail the data that nourish it. In the end, a global picture of the elements that compose it emerges: the relationships and differences between all the data sets become clearer.
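The Gaussian-process idea sketched in this abstract, learning the difference between cheap and expensive runs, can be illustrated with a minimal two-fidelity surrogate. Everything here is an illustrative assumption: the test functions, kernel, and length scale are invented, and a simple Gaussian-kernel interpolant stands in for a full Gaussian process.

```python
import numpy as np

def kernel_model(x_train, y_train, length=0.2, nugget=1e-6):
    """Gaussian-kernel interpolant: a lightweight stand-in for a
    Gaussian-process metamodel (mean prediction only)."""
    def kern(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    K = kern(x_train, x_train) + nugget * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return lambda x: kern(np.atleast_1d(x), x_train) @ alpha

# Illustrative fidelity pair (not the thesis's simulators): the cheap
# code is a scaled, biased version of the expensive one.
f_hi = lambda x: np.sin(8 * x) * x
f_lo = lambda x: 0.5 * f_hi(x) + 0.3 * x

x_lo = np.linspace(0, 1, 15)          # many cheap runs
x_hi = np.linspace(0, 1, 6)           # few expensive runs

m_lo = kernel_model(x_lo, f_lo(x_lo))             # metamodel of the cheap code
rho = np.polyfit(m_lo(x_hi), f_hi(x_hi), 1)[0]    # estimated scale between fidelities
m_delta = kernel_model(x_hi, f_hi(x_hi) - rho * m_lo(x_hi))  # discrepancy model
predict = lambda x: rho * m_lo(x) + m_delta(x)    # multi-fidelity prediction
```

Because the discrepancy is smoother than the expensive code itself, the few expensive samples go further here than they would in a single-fidelity metamodel built from the same budget.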
3

Adaptive Multi-Fidelity Modeling for Efficient Design Exploration Under Uncertainty.

Beachy, Atticus J. 28 August 2020 (has links)
No description available.
4

Multi-Fidelity Structural Modeling For Set Based Design of Advanced Marine Vehicles

Raj, Oliver Neal 22 May 2018 (has links)
This thesis demonstrates that a parametrically modifiable Advanced Marine Vehicle Structural (AMVS) module, which can be integrated into a larger framework of marine vehicle analysis modules, enables stakeholders, as a group, to complete structurally feasible ship designs using the Set-Based Design (SBD) method. The SBD method allows stakeholders to identify and explore multiple solutions to stakeholder requirements, eliminating infeasible or inferior solutions only after all solutions have been fully explored. SBD offers an advantage over traditional design methods such as Waterfall and Spiral because traditional methods do not adequately explore the design space and may therefore eliminate solutions that are more optimal in terms of cost, risk, and performance. The fundamental focus of this thesis was the development of a parametrically modifiable AMVS module using a low-fidelity structural analysis method implemented as a numerical 2D Finite Element Analysis (FEA) applied to the HY2-SWATH. To verify the AMVS module's accuracy, a high-fidelity structural analysis was implemented in MAESTRO to analyze the reference marine vehicle model and provide a comparison baseline. To explore the design space, the AMVS module is written to be parametrically modified through input variables, effectively generating a new vessel structure when an input is changed. The AMVS module is used to analyze an advanced marine vehicle in its two operating modes: displacement and foil-borne. AMVS demonstrates the capability to explore the design space and evaluate the structural feasibility of advanced marine vehicle designs through consideration of the material, stiffener/girder dimensions, stiffener/girder arrangement, and machinery/equipment weights onboard. / Master of Science
5

Metamodel-based collaborative optimization framework

Zadeh, Parviz M., Toropov, V.V., Wood, Alastair S. January 2009 (has links)
This paper focuses on metamodel-based collaborative optimization (CO). The objective is to improve the computational efficiency of CO in order to handle multidisciplinary design optimization problems utilising high-fidelity models. To this end, two levels of metamodel-building techniques are proposed: metamodels in the disciplinary optimization are based on multi-fidelity modelling (the interaction of low- and high-fidelity models), and for the system-level optimization a combination of a global metamodel based on the moving least squares method and a trust region strategy is introduced. The proposed method is demonstrated on a continuous fiber-reinforced composite beam test problem. Results show that the methods introduced in this paper provide an effective way of improving the computational efficiency of CO based on high-fidelity simulation models.
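The moving least squares metamodel used at the system level can be sketched in one dimension. This is an illustrative sketch under assumptions of my own (a Gaussian weight function and an arbitrary length scale), not the paper's exact formulation.

```python
import numpy as np

def mls_predict(x_query, x_train, y_train, length=0.4, degree=1):
    """Moving least squares: at each query point, fit a local polynomial
    by weighted least squares, with Gaussian weights centred on the query."""
    x_query = np.atleast_1d(np.asarray(x_query, dtype=float))
    out = np.empty_like(x_query)
    for i, xq in enumerate(x_query):
        # Points near the query dominate the local fit
        w = np.exp(-0.5 * ((x_train - xq) / length) ** 2)
        B = np.vander(x_train, degree + 1)       # polynomial basis columns
        BW = B * w[:, None]
        # Solve the weighted normal equations (B^T W B) c = B^T W y
        coef = np.linalg.solve(BW.T @ B, BW.T @ y_train)
        out[i] = np.polyval(coef, xq)
    return out
```

With a degree-1 basis the metamodel reproduces linear trends exactly while smoothing noisy or oscillatory data, which is why it pairs naturally with a trust region strategy at the system level.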
6

Multi-fidelity Gaussian process regression for computer experiments

Le Gratiet, Loic 04 October 2013 (has links) (PDF)
This work concerns the Gaussian-process based approximation of a code that can be run at different levels of accuracy. The goal is to improve the predictions of a surrogate model of a complex computer code using fast approximations of it. A new formulation of a co-kriging based method has been proposed. In particular, this formulation allows for fast implementation and for closed-form expressions of the predictive mean and variance for universal co-kriging in the multi-fidelity framework, which is a breakthrough as it really allows for the practical application of such a method in real cases. Furthermore, fast cross-validation, sequential experimental design, and sensitivity analysis methods have been extended to the multi-fidelity co-kriging framework. This thesis also deals with a conjecture about the dependence of the learning curve (i.e. the decay rate of the mean square error) on the smoothness of the underlying function. A proof in a fairly general situation (which includes the classical models of Gaussian-process based metamodels with stationary covariance functions) has been obtained, whereas previous proofs held only for degenerate kernels (i.e. when the process is in fact finite-dimensional). This result makes it possible to rigorously address practical questions such as the optimal allocation of the budget between different levels of codes in the multi-fidelity framework.
7

A MULTI-FIDELITY MODELING AND EXPERIMENTAL TESTBED FOR TESTING & EVALUATION OF LEARNING-BASED SYSTEMS

Atharva Mahesh Sonanis (17123428) 10 October 2023 (has links)
<p dir="ltr">Learning-based systems (LBS) have become essential in various domains, necessitating the development of testing and evaluation (T&E) procedures specifically tailored to address the unique characteristics and challenges of LBS. However, existing frameworks designed for traditional systems do not adequately capture the intricacies of LBS, including their evolving nature, complexity, and susceptibility to adversarial actions. This study advocates for a paradigm shift in T&E, proposing its integration throughout the entire life cycle of LBS, starting from the early stages of development and extending to operations and sustainment. The research objectives focus on exploring innovative approaches for designing LBS-specific T&E strategies, creating an experimental testbed with multi-fidelity modeling capabilities, investigating the optimal degree of test and evaluation required for LBS, and examining the impact of system knowledge access and the delicate balance between T&E activities and data/model rights. These objectives aim to overcome the challenges associated with LBS and contribute to the development of effective testing approaches that assess their capabilities and limitations throughout the life cycle. The proposed experimental testbed will provide a versatile environment for comprehensive testing and evaluation, enabling researchers and practitioners to assess LBS performance across varying levels of complexity. The findings from this study will contribute to the development of efficient testing strategies and practical approaches that strike a balance between thorough evaluation and data/model rights. Ultimately, the integration of continuous T&E insights throughout the life cycle of LBS aims to enhance the effectiveness and efficiency of capability delivery by enabling adjustments and improvements at each stage.</p>
8

Machine Learning for Improvement of Ocean Data Resolution for Weather Forecasting and Climatological Research

Huda, Md Nurul 18 October 2023 (has links)
Severe weather events like hurricanes and tornadoes pose major risks globally, underscoring the critical need for accurate forecasts to mitigate impacts. While advanced computational capabilities and climate models have improved predictions, the lack of high-resolution initial conditions still limits forecast accuracy. The Atlantic's "Hurricane Alley" region is where most storms arise, so robust in-situ ocean data plus atmospheric profiles are needed to enable precise hurricane tracking and intensity forecasts. Examining satellite datasets reveals that radio occultation (RO) provides the most accurate atmospheric measurements at 5-25 km altitude. However, below 5 km, accuracy remains insufficient over oceans compared with land areas. Recent benchmark studies, e.g. Patil and Iiyama (2022) and Wei and Guan (2022), proposed the use of deep learning models for sea surface temperature (SST) prediction, with very low errors ranging from 0.35°C to 0.75°C in the Tohoku region and root-mean-square errors from 0.27°C to 0.53°C over the China seas, respectively. The approach we have developed remains unparalleled in its domain as of this date. This research is divided into two parts and aims to develop a data-driven, satellite-informed machine learning system to combine high-quality but sparse in-situ ocean data with more readily available low-quality satellite data. In the first part of the work, a novel data-driven, satellite-informed machine learning algorithm was implemented that combines high-quality/low-coverage in-situ point ocean data (e.g. ARGO floats) and low-quality/high-coverage satellite ocean data (e.g. HYCOM, MODIS-Aqua, G-COM), generating high-resolution data with an RMSE of 0.58°C over the Atlantic Ocean. In the second part of the work, a novel GNN algorithm was implemented for the Gulf of Mexico and shown to successfully capture the complex interactions of the ocean and mimic the paths of ARGO floats with an RMSE of 1.40°C.
/ Doctor of Philosophy / Severe storms like hurricanes and tornadoes are a major threat around the world. Accurate weather forecasts can help reduce their impacts. While climate models have improved predictions, lacking detailed initial conditions still limits forecast accuracy. The Atlantic's "Hurricane Alley" sees many storms form, needing good ocean and atmospheric data for precise hurricane tracking and strength forecasts. Studying satellite data shows radio occultation provides the most accurate 5-25 km high altitude measurements over oceans. But below 5 km accuracy remains insufficient versus over land. Recent research proposed using deep learning models for sea surface temperature prediction with low errors. Our approach remains unmatched in this area currently. This research has two parts. First, we developed a satellite-informed machine learning system combining limited high-quality ocean data with more available low-quality satellite data. This generated high resolution Atlantic Ocean data with an error of 0.58°C. Second, we implemented a new algorithm on the Gulf of Mexico, successfully modeling complex ocean interactions and hurricane paths with an error of 1.40°C. Overall, this research advances hurricane forecasting by combining different data sources through innovative machine learning techniques. More accurate predictions can help better prepare communities in hurricane-prone regions.
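The core fusion idea in this work, correcting a dense but biased satellite field with sparse, accurate in-situ points, can be sketched on synthetic one-dimensional data. Everything below is an illustrative assumption: the synthetic "SST" fields, the cold bias, and the kernel-smoothed residual correction are invented stand-ins, not the thesis's ARGO/HYCOM machine learning pipeline.

```python
import numpy as np

def fuse(satellite_grid, sat_values, insitu_x, insitu_y, length=0.2):
    """Correct a dense low-quality field with sparse high-quality points:
    interpolate the in-situ residuals over the grid and add them back
    (a simple kernel-smoothed bias correction)."""
    # Satellite value at each in-situ location (nearest grid node)
    idx = np.abs(satellite_grid[None, :] - insitu_x[:, None]).argmin(axis=1)
    residual = insitu_y - sat_values[idx]
    # Kernel-weighted spreading of the residuals over the whole grid
    w = np.exp(-0.5 * ((satellite_grid[:, None] - insitu_x[None, :]) / length) ** 2)
    correction = (w @ residual) / w.sum(axis=1)
    return sat_values + correction

grid = np.linspace(0, 1, 200)
truth = 20 + 3 * np.sin(2 * np.pi * grid)         # synthetic "true" SST field
satellite = truth - 1.5                            # satellite field with a cold bias
floats_x = np.linspace(0.05, 0.95, 8)              # sparse in-situ float positions
floats_y = 20 + 3 * np.sin(2 * np.pi * floats_x)   # accurate float readings
fused = fuse(grid, satellite, floats_x, floats_y)
```

The thesis replaces this hand-rolled correction with learned models, but the structure is the same: dense low-quality coverage supplies the field, sparse high-quality observations supply the calibration.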
9

CBAS: A Multi-Fidelity Surrogate Modeling Tool For Rapid Aerothermodynamic Analysis

Tyler Scott Adams (18423228) 23 April 2024 (has links)
<p dir="ltr"> The need to develop reliable hypersonic capabilities is of critical importance today. Among the most prominent tools used in recent efforts to overcome the challenges of developing hypersonic vehicles are NASA's Configuration Based Aerodynamics (CBAERO) and surrogate modeling techniques. This work presents the development of a tool, CBAERO Surrogate (CBAS), which leverages the advantages of both CBAERO and surrogate models to create a simple and streamlined method for building an aerodynamic database for any given vehicle geometry. CBAS is capable of interfacing with CBAERO directly and builds Kriging or Co-Kriging surrogate models for key aerodynamic parameters without significant user or computational effort. Two applicable geometries representing hypersonic vehicles have been used within CBAS and the resulting Kriging and Co-Kriging surrogate models evaluated against experimental data. These results show that the Kriging model predictions are accurate to CBAERO's level of fidelity, while the Co-Kriging model predictions fall within 0.5%-5% of the experimental data. The Co-Kriging models produced by CBAS are 10%-50% more accurate than CBAERO and the Kriging models, offering a higher-fidelity solution while maintaining low computational expense. Based on these initial results, future work promises further advancements by applying CBAS to additional applications.</p>
10

Méthodes avancées d'optimisation par méta-modèles – Application à la performance des voiliers de compétition / Advanced surrogate-based optimization methods - Application to racing yachts performance

Sacher, Matthieu 10 September 2018 (has links)
Sailing yacht performance optimization is a difficult problem due to the high complexity of the mechanical system (aero-elastic and hydrodynamic coupling) and the large number of parameters to optimize (sails, rigs, etc.). Although sailboat optimization is empirical in most cases today, numerical optimization approaches are now becoming feasible thanks to the latest advances in physical models and computing power. However, these numerical optimizations remain very expensive, as each simulation usually requires solving a non-linear fluid-structure interaction problem. Thus, the central objective of this thesis is to propose and develop original methods aimed at minimizing the numerical cost of sailing yacht performance optimization. Efficient Global Optimization (EGO) is applied to solve various optimization problems. The original EGO method is extended to optimization under constraints, including possibly non-computable points, using a classification-based approach. The use of multi-fidelity surrogates is also adapted to the EGO method. The applications treated in this thesis concern original optimization problems in which the performance is modeled experimentally and/or numerically. These applications allow the developments in optimization methods to be validated on real and complex problems, including fluid-structure interaction phenomena.
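The EGO method referenced in this abstract selects new evaluations by maximizing an acquisition function, classically the expected improvement. Below is a minimal sketch of the standard EI formula for minimization (the textbook acquisition only, not the thesis's constrained or multi-fidelity extensions).

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Expected improvement acquisition used in EGO: the expected amount
    by which a candidate point improves (reduces) the best observed
    value f_best, given the surrogate's mean mu and std sigma."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    ei = np.zeros_like(mu)
    pos = sigma > 0                      # EI is zero where the model is certain
    z = (f_best - mu[pos]) / sigma[pos]
    ei[pos] = (f_best - mu[pos]) * norm.cdf(z) + sigma[pos] * norm.pdf(z)
    return ei
```

EI vanishes at already-sampled points (zero predictive variance) and grows with surrogate uncertainty, which is what drives EGO's balance between exploiting promising regions and exploring untested ones.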
