1

Parametric Designs and Weight Optimization using Direct and Indirect Aero-structure Load Transfer Methods

Gandhi, Viraj D. 08 1900
Indiana University-Purdue University Indianapolis (IUPUI) / Within the aerospace design, analysis, and optimization community, there is an increasing demand to finalize the preliminary design phase of the wing as quickly as possible without losing much accuracy. This includes rapid generation of designs, early adoption of higher-fidelity models, and automation of the structural analysis of the wing's internal structure. To perform the structural analysis, the aerodynamic load can be transferred to the wing using many different methods. Generally, the indirect load transfer method is used for preliminary analysis and the direct load transfer method for detailed analysis. In the indirect method, the load is discretized using shear-moment-torque (SMT) curves and applied to the ribs of the wing. In the direct method, the load is distributed using one-way Fluid-Structure Interaction (FSI) and applied to the skin of the wing. In this research, structural analysis is performed using both methods and the nodal displacements are compared. Further, to optimize the internal structure, iterative changes are made to the number of structural members. To accommodate these geometry changes as quickly as possible, a parametric design method is used through the Engineering SketchPad (ESP). ESP can also attach attributes to geometric features and generate multi-fidelity models consistently. ESP can generate the Nastran mesh file (.bdf) with the nodes and elements grouped according to their geometric attributes. In this research, utilizing these attributes and the consistency across multi-fidelity models, an API is created between ESP and Nastran to automate the multi-fidelity structural optimization. The API generates the design with the appropriate parameters and the mesh file using ESP. Through the attributes in the mesh file, the API works as a pre-processor to apply material properties, boundary conditions, and optimization parameters. The API sends the mesh file to Nastran and reads the results file to iterate on the number of structural members in the design. The results file is also used to transfer nodal deformations from the lower-fidelity structural model onto the higher-fidelity one to enable multi-fidelity optimization. Here, static structural optimization of the whole wing serves as the lower-fidelity model and buckling optimization of each stiffened panel serves as the higher-fidelity model. To further extend this idea, a parametric model of the whole aircraft is also created. / 2021-08-17
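
As an illustration of the indirect (SMT) load transfer described in this abstract, the sketch below integrates an assumed elliptical spanwise lift distribution into shear, bending moment, and torque curves and samples them at rib stations. The lift magnitude, semispan, rib locations, and chordwise offset are placeholder assumptions, not values from the thesis.

```python
import numpy as np

# Illustrative sketch of the indirect (SMT) load transfer: integrate an assumed
# spanwise lift distribution into shear, bending moment, and torque, then sample
# the curves at rib stations.  All numerical values below are placeholders.

def smt_curves(y, lift_per_span, cp_offset):
    """Shear, bending moment, and torque at stations y (root -> tip),
    integrated from the tip inboard."""
    shear = np.zeros_like(y)
    moment = np.zeros_like(y)
    torque = np.zeros_like(y)
    # integrate from the tip (last station) toward the root (first station)
    for i in range(len(y) - 2, -1, -1):
        dy = y[i + 1] - y[i]
        dL = 0.5 * (lift_per_span[i] + lift_per_span[i + 1]) * dy   # panel lift
        arm = 0.5 * dy                                              # panel centroid arm
        shear[i] = shear[i + 1] + dL
        moment[i] = moment[i + 1] + shear[i + 1] * dy + dL * arm
        torque[i] = torque[i + 1] + dL * cp_offset                  # torque about the spar axis
    return shear, moment, torque

semispan = 15.0                                   # m, assumed
y = np.linspace(0.0, semispan, 201)
L0 = 4000.0                                       # N/m lift at the root, assumed
lift = L0 * np.sqrt(1.0 - (y / semispan) ** 2)    # elliptical lift distribution
shear, moment, torque = smt_curves(y, lift, cp_offset=0.4)

# Discretize at rib stations (the indirect method applies these as loads on the ribs)
rib_y = np.linspace(0.0, semispan, 11)
for yr in rib_y:
    k = np.argmin(np.abs(y - yr))
    print(f"rib at y={yr:5.1f} m: V={shear[k]:10.1f} N, "
          f"M={moment[k]:12.1f} N*m, T={torque[k]:10.1f} N*m")
```

In the direct method, by contrast, the pressure field from the FSI solution would be mapped onto the skin mesh nodes rather than lumped at the ribs.
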
2

Combining Data-driven and Theory-guided Models in Ensemble Data Assimilation

Popov, Andrey Anatoliyevich 23 August 2022
There once was a dream that data-driven models would replace their theory-guided counterparts. We have awoken from this dream. We now know that data cannot replace theory. Data-driven models still have their advantages, mainly in computational efficiency, but also in providing us with some special sauce that is unreachable by our current theories. This dissertation aims to provide a way in which both the accuracy of theory-guided models and the computational efficiency of data-driven models can be combined. Combining theory-guided and data-driven models allows us to draw on ideas from a much broader set of disciplines, and can help pave the way for robust and fast methods. / Doctor of Philosophy / As an illustrative example, take the problem of predicting the weather. Typically a supercomputer will run a model several times to generate predictions a few days into the future. Sensors, such as those on satellites, then pick up observations at a few points on the globe that are not representative of the whole atmosphere. These observations are combined, or "assimilated," with the computer model predictions to create a better representation of our current understanding of the state of the earth. This predict-assimilate cycle is repeated every day and is called (sequential) data assimilation. The prediction step was traditionally performed by a computer model based on rigorous mathematics. With the advent of big data, many wondered whether models based purely on data would take over. This has not happened. This thesis is concerned with running traditional mathematical models alongside data-driven models in the prediction step, and then building a theory in which both can be used in data assimilation at the same time, so that computational cost decreases without a drop in accuracy.
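
The predict-assimilate cycle summarized above can be made concrete with a minimal stochastic ensemble Kalman filter analysis step. The sketch below is a generic textbook formulation, not the multifidelity machinery developed in the dissertation; the toy state, observation operator, and error covariance are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(E, y, H, R):
    """Stochastic EnKF analysis step.
    E: (n, m) ensemble of m forecast states of dimension n
    y: (p,)   observation vector
    H: (p, n) linear observation operator
    R: (p, p) observation-error covariance
    """
    n, m = E.shape
    A = E - E.mean(axis=1, keepdims=True)            # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    Pyy = (HA @ HA.T) / (m - 1) + R                  # innovation covariance
    Pxy = (A @ HA.T) / (m - 1)                       # state-observation covariance
    K = Pxy @ np.linalg.solve(Pyy, np.eye(len(y)))   # Kalman gain
    # perturbed observations, one per ensemble member
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return E + K @ (Y - HE)

# Toy example: a 3-variable state observed at its first component (assumed setup)
m = 50
E = rng.normal(size=(3, m)) + np.array([[1.0], [2.0], [3.0]])
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
y = np.array([1.5])
Ea = enkf_analysis(E, y, H, R)
print("forecast mean:", E.mean(axis=1), " analysis mean:", Ea.mean(axis=1))
```

Running the forecast model (theory-guided, data-driven, or both side by side) for each ensemble member and applying this analysis step at every observation time reproduces the sequential cycle described in the abstract.
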
3

Multi-fidelity, Multidisciplinary Design Analysis and Optimization of the Efficient Supersonic Air Vehicle

Lickenbrock, Madeline Clare January 2020
No description available.
4

Couplage optimisation à convergence partielle et stratégie multiparamétrique en calcul de structures / Coupling partially converged data and a multiparametric strategy for the optimization of assemblies

Courrier, Nicolas 08 December 2015
In structural assembly analysis, design offices are still limited in their ability to carry out optimization studies. The numerical resolution of assemblies requires methods that handle several types of nonlinearity (friction, contact, and gaps between parts), and the associated computational cost is generally too high for a global optimization that needs a large number of evaluations of the mechanical problem. To make such optimization tractable, this work relies on a two-level model optimization strategy. The first level builds a metamodel on which a global optimization is performed; the second level carries out a local optimization on the real mechanical model, starting from the results of the first level. Two main tools are used throughout this thesis. First, the numerical simulations are performed with the multiparametric LaTIn method, which significantly reduces the computational time needed to solve many similar mechanical assembly problems. The second tool, the one developed most extensively in this work, is the construction of multi-fidelity surrogate models. Because the LaTIn method is iterative, an error indicator is available that serves as a convergence level for each simulation; the multi-fidelity metamodels can therefore incorporate several sources of information, termed "fully converged" when a computation is run to convergence and "partially converged" when it is stopped beforehand. Several multi-fidelity methods are compared on a number of mechanical examples to identify the most effective ones, and two industrial cases are also treated. In particular, a cokriging metamodel is built from responses and gradients computed by the mechanical solver at a few sets of design parameters. It provides very inexpensive approximate responses of the objective function, enabling a global optimization and the identification of the global optimum. The cokriging metamodel is reviewed in detail on analytical test functions and mechanical benchmarks, and the quality of the approximation and the construction cost are compared with a classical kriging approach. A complete study of the multiparametric strategy on many mechanical benchmarks, covering various kinds and numbers of design parameters, illustrates the computational performance of the whole optimization process.
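
A minimal sketch of the two-level idea in this abstract, assuming an additive-bridge multi-fidelity surrogate in place of the thesis's cokriging and LaTIn machinery: a Gaussian process is fit to many cheap ("partially converged") evaluations and corrected by a second process fit to the discrepancy at a few expensive ("fully converged") points, and the resulting surrogate drives a global search that is then refined locally on the expensive function. The Forrester-style test functions and the use of scikit-learn are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Placeholder "solver": hi(x) stands for a fully converged simulation,
# lo(x) for a cheap, partially converged one (biased).
def hi(x): return (6 * x - 2) ** 2 * np.sin(12 * x - 4)
def lo(x): return 0.5 * hi(x) + 10 * (x - 0.5) - 5

# Many cheap evaluations, few expensive ones.
x_lo = np.linspace(0, 1, 11)[:, None]
x_hi = np.array([[0.0], [0.4], [0.6], [1.0]])

gp_lo = GaussianProcessRegressor(ConstantKernel() * RBF()).fit(x_lo, lo(x_lo).ravel())
# Bridge GP models the discrepancy hi - gp_lo at the high-fidelity points.
delta = hi(x_hi).ravel() - gp_lo.predict(x_hi)
gp_delta = GaussianProcessRegressor(ConstantKernel() * RBF()).fit(x_hi, delta)

def surrogate(x):
    x = np.atleast_2d(x)
    return gp_lo.predict(x) + gp_delta.predict(x)

# Level 1: global search on the inexpensive surrogate.
grid = np.linspace(0, 1, 2001)[:, None]
x0 = grid[np.argmin(surrogate(grid))]

# Level 2: local refinement on the expensive model, started from the surrogate optimum.
res = minimize(lambda x: float(hi(x[0])), x0=x0, bounds=[(0, 1)])
print("surrogate optimum:", float(x0), " refined optimum:", res.x[0], " value:", res.fun)
```
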
5

A Unified, Multifidelity Quasi-Newton Optimization Method with Application to Aero-Structural Design

Bryson, Dean Edward 20 December 2017
No description available.
6

PARAMETRIC DESIGNS AND WEIGHT OPTIMIZATION USING DIRECT AND INDIRECT AERO-STRUCTURE LOAD TRANSFER METHODS

Viraj Dipakbhai Gandhi (7033289) 13 August 2019
Within the aerospace design, analysis, and optimization community, there is an increasing demand to finalize the preliminary design phase of the wing as quickly as possible without losing much accuracy. This includes rapid generation of designs, early adoption of higher-fidelity models, and automation of the structural analysis of the wing's internal structure. To perform the structural analysis, the aerodynamic load can be transferred to the wing using many different methods. Generally, the indirect load transfer method is used for preliminary analysis and the direct load transfer method for detailed analysis. In the indirect method, the load is discretized using shear-moment-torque (SMT) curves and applied to the ribs of the wing. In the direct method, the load is distributed using one-way Fluid-Structure Interaction (FSI) and applied to the skin of the wing. In this research, structural analysis is performed using both methods and the nodal displacements are compared. Further, to optimize the internal structure, iterative changes are made to the number of structural members. To accommodate these geometry changes as quickly as possible, a parametric design method is used through the Engineering SketchPad (ESP). ESP can also attach attributes to geometric features and generate multi-fidelity models consistently. ESP can generate the Nastran mesh file (.bdf) with the nodes and elements grouped according to their geometric attributes. In this research, utilizing these attributes and the consistency across multi-fidelity models, an API is created between ESP and Nastran to automate the multi-fidelity structural optimization. The API generates the design with the appropriate parameters and the mesh file using ESP. Through the attributes in the mesh file, the API works as a pre-processor to apply material properties, boundary conditions, and optimization parameters. The API sends the mesh file to Nastran and reads the results file to iterate on the number of structural members in the design. The results file is also used to transfer nodal deformations from the lower-fidelity structural model onto the higher-fidelity one to enable multi-fidelity optimization. Here, static structural optimization of the whole wing serves as the lower-fidelity model and buckling optimization of each stiffened panel serves as the higher-fidelity model. To further extend this idea, a parametric model of the whole aircraft is also created.
7

Personnalisation robuste de modèles 3D électromécaniques du cœur. Application à des bases de données cliniques hétérogènes et longitudinales / Robust personalisation of 3D electromechanical cardiac models. Application to heterogeneous and longitudinal clinical databases

Molléro, Roch 19 December 2017
Personalised cardiac modeling consists of creating virtual 3D simulations of real clinical cases to help clinicians predict the behaviour of the heart, or better understand some pathologies from the estimated values of biophysical parameters. In this work we first motivate the need for a consistent parameter estimation framework, from a case study where uncertainty in myocardial fibre orientation leads to an uncertainty in the estimated parameters that is extremely large compared to their physiological variability. To build a consistent approach to parameter estimation, we then tackle the computational complexity of 3D models. We introduce an original multiscale 0D/3D approach for cardiac models, based on a multiscale coupling that approximates the outputs of a 3D model with a reduced "0D" version of the same model. From this coupling we derive an efficient multifidelity optimisation algorithm for the 3D model. In a second step, we build more than 140 personalised 3D simulations in the context of two studies involving longitudinal analysis of cardiac function: on the one hand, the analysis of the long-term evolution of cardiomyopathies under therapy; on the other, the modeling of short-term cardiovascular changes during digestion. Finally we present an algorithm to automatically detect and select observable directions in the parameter space from a set of measurements, and to compute consistent population-based prior probabilities in these directions, which can be used to constrain parameter estimation in cases where measurements are missing. This enables consistent parameter estimation in a large database of 811 cases with the 0D model and 137 cases with the 3D model.
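
The multifidelity personalisation idea, calibrating on a cheap reduced model and refining with the expensive one, can be sketched generically as below. The toy forward models, parameters, and target trace are assumptions for illustration only and bear no relation to the cardiac models of the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-ins: model_3d is the "expensive" forward model, model_0d a cheap
# reduced approximation of it (here, the same form with a biased coefficient).
def model_3d(theta):
    k, b = theta
    t = np.linspace(0, 1, 50)
    return np.exp(-k * t) + b * t * (1 - t)        # placeholder output trace

def model_0d(theta):
    k, b = theta
    t = np.linspace(0, 1, 50)
    return np.exp(-k * t) + 0.9 * b * t * (1 - t)  # slightly biased reduced model

# Synthetic "measurements" generated from assumed true parameters plus noise.
target = model_3d([2.0, 0.5]) + 0.01 * np.random.default_rng(0).normal(size=50)

def misfit(theta, model):
    return np.sum((model(theta) - target) ** 2)

# Step 1: calibrate on the cheap 0D model (many evaluations are affordable).
res0 = minimize(misfit, x0=[1.0, 0.0], args=(model_0d,), method="Nelder-Mead")

# Step 2: a short refinement with the expensive 3D model, warm-started from step 1.
res3 = minimize(misfit, x0=res0.x, args=(model_3d,), method="Nelder-Mead",
                options={"maxiter": 30})
print("0D estimate:", res0.x, " refined 3D estimate:", res3.x)
```
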
8

Métamodélisation et optimisation de dispositifs photoniques / Metamodeling and optimization of photonics devices

Durantin, Cédric 28 May 2018
Numerical simulation is widely employed in engineering to study the behavior of a device and optimize its design. Nevertheless, each computation is often time consuming and, during an optimization sequence, the simulation code is evaluated a large number of times. An interesting way to reduce the computational burden is to build a metamodel (or surrogate model) of the simulation code. Adaptive strategies are then set up to optimize the component using the metamodel's predictions. In the context of this thesis, three representative devices are identified among the applications encountered within the CEA LETI optics and photonics department. The study of these cases leads to two problems. The first concerns multifidelity metamodeling, which consists of constructing a metamodel from two simulations of the same component that can be hierarchically ranked in accuracy; the simulations are obtained from different approximations of the physical phenomenon, yielding a high-fidelity model (accurate but costly) and a low-fidelity model (coarse but fast to evaluate). Work on this method for the case of the photoacoustic cell led to the development of a new multifidelity surrogate model based on radial basis functions. The second problem relates to accounting for manufacturing uncertainties in the design of photonic devices. Optimizing component performance while taking into account the differences observed between the desired geometry and the geometry obtained in manufacturing required the development of a specific method for the case of the adiabatic coupler. The entire work of this thesis is consolidated in a software toolbox.
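
A minimal sketch of a multifidelity surrogate built from radial basis functions, assuming a simple additive-bridge construction rather than the specific surrogate developed in the thesis: the cheap low-fidelity data are interpolated, then the low/high-fidelity discrepancy is interpolated at the few expensive points. The test functions and sample locations are placeholders.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Assumed low- and high-fidelity responses of the same device (placeholders).
def f_hi(x): return np.sin(8 * x) + x
def f_lo(x): return 0.8 * np.sin(8 * x) + 1.2 * x - 0.1

x_lo = np.linspace(0, 1, 21)[:, None]     # many cheap samples
x_hi = np.linspace(0, 1, 5)[:, None]      # few expensive samples

rbf_lo = RBFInterpolator(x_lo, f_lo(x_lo).ravel())
# Bridge term: discrepancy between fidelities at the high-fidelity points.
rbf_bridge = RBFInterpolator(x_hi, f_hi(x_hi).ravel() - rbf_lo(x_hi))

def surrogate(x):
    """Multifidelity prediction: low-fidelity interpolant plus bridge correction."""
    x = np.atleast_2d(x).reshape(-1, 1)
    return rbf_lo(x) + rbf_bridge(x)

x_test = np.linspace(0, 1, 11)[:, None]
err = np.max(np.abs(surrogate(x_test) - f_hi(x_test).ravel()))
print("max surrogate error on test points:", err)
```
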
9

Incorporation of Physics-Based Controllability Analysis in Aircraft Multi-Fidelity MADO Framework

Meckstroth, Christopher January 2019
No description available.
10

Modèles de substitution spatio-temporels et multifidélité : Application à l'ingénierie thermique / Spatio-temporal and multifidelity surrogate models : Application in thermal engineering

De lozzo, Matthias 03 December 2013
This PhD thesis deals with the construction of surrogate models in transient and steady states in the context of thermal simulation, with few observations and several outputs. First, we design a robust construction of a recurrent multilayer perceptron to approximate a spatio-temporal dynamic. We use an average of the neural networks resulting from a cross-validation procedure, whose data splitting makes it possible to adjust the parameters of each model on a test set without any loss of information. Moreover, the construction of this perceptron can be distributed according to its outputs. This construction is applied to modelling the temporal evolution of the temperature at different points of an aeronautical equipment cabinet. Then, we propose a mixture of Gaussian process models in a multifidelity framework where a high-fidelity observation model is complemented by several observation models of lower, non-comparable fidelities. Particular attention is paid to the specification of the trends and adjustment coefficients present in these models. The different kriging and co-kriging models are combined according to a partition or a weighted aggregation based on a robustness measure associated with the most reliable design points. This approach is used to model the temperature at different points of the cabinet in steady state. Finally, we propose a penalized criterion for the problem of heteroscedastic regression. This tool is developed for projection estimators and applied to the particular case of Haar wavelets. These theoretical results are accompanied by numerical results for a problem with different noise specifications and possible dependencies among the observations.
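
The first contribution, averaging the neural networks produced by a cross-validation procedure into a single surrogate, can be sketched generically as below. The data, the network size, and the use of scikit-learn's feed-forward MLPRegressor (in place of a recurrent perceptron) are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder data: a scalar output depending on two inputs (assumed problem).
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.05 * rng.normal(size=200)

# Train one network per cross-validation fold; each fold's held-out block acts
# as a test set, so every observation is used somewhere without being "lost".
models = []
for train_idx, _ in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
    models.append(net.fit(X[train_idx], y[train_idx]))

def averaged_surrogate(X_new):
    """Prediction of the averaged model: the mean of the per-fold networks."""
    return np.mean([m.predict(X_new) for m in models], axis=0)

X_new = rng.uniform(-1, 1, size=(5, 2))
print(averaged_surrogate(X_new))
```

Because each output of a multi-output problem can be trained independently, this per-fold construction is also straightforward to distribute across outputs, as the abstract notes.
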
