41

Supporting Automatic Interoperability in Model-Driven Development Processes

Giachetti Herrera, Giovanni Andrés, 04 July 2011
By analyzing the evolution of software development in recent years, it is possible to observe that the technologies involved are increasingly focused on the definition of models for the specification of the intended software products. This model-centric development schema is the main ingredient of the Model-Driven Development (MDD) paradigm. In general terms, MDD approaches propose the automatic generation of software products by means of the transformation of the defined models into the final program code. This transformation process is also known as model compilation. Thus, MDD aims to reduce (or even eliminate) manual programming, which is an error-prone and time-consuming task. Hence, models become the main actors of MDD processes: the models are the new programming code. In this context, interoperability can be considered a natural trend for the future of model-driven technologies, where different modeling approaches, tools, and standards can be integrated and coordinated to reduce the implementation and learning time of MDD solutions as well as to improve the quality of the final software products. However, there is a lack of approaches that provide a suitable solution to support interoperability in MDD processes. Moreover, the proposals that define an interoperability framework for MDD processes are still theoretical and are not aligned with current standards, interoperability approaches, and technologies. Thus, the main objective of this doctoral thesis is to develop an approach to achieve interoperability in MDD processes. This interoperability approach is based on current metamodeling standards, modeling language customization mechanisms, and model-to-model transformation technologies. To achieve this objective, novel approaches have been defined to improve the integration of modeling languages, to obtain a suitable interchange of modeling information, and to perform automatic interoperability verification.
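The model-compilation idea the abstract describes can be sketched in a few lines: an application "model" (here a plain Python dict standing in for a real metamodel instance — the entity and attribute names are purely illustrative) is transformed into program code automatically instead of being written by hand.

```python
# Minimal model-to-code sketch of the MDD idea: the model is data,
# and a "model compiler" turns it into program text.
model = {
    "entity": "Customer",
    "attributes": [("name", "str"), ("age", "int")],
}

def compile_model(m):
    # emit a class definition from the model description
    lines = [f"class {m['entity']}:"]
    args = ", ".join(f"{n}: {t}" for n, t in m["attributes"])
    lines.append(f"    def __init__(self, {args}):")
    for n, _ in m["attributes"]:
        lines.append(f"        self.{n} = {n}")
    return "\n".join(lines)

code = compile_model(model)
print(code)
exec(code)                      # "deploy" the generated code
c = Customer("Ada", 36)
print(c.name, c.age)            # prints: Ada 36
```

Changing the model dict regenerates different code with no hand edits, which is the productivity argument the abstract makes.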
Giachetti Herrera, G. A. (2011). Supporting Automatic Interoperability in Model-Driven Development Processes [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/11108
42

Lubricant gap shape optimization of the hydrodynamic thrust bearing using metamodeling

Přibyl, Jan, January 2019
The aim of this diploma thesis was to optimize the gap shape of a turbocharger thrust (axial) bearing using metamodeling. The first part of the thesis introduces optimization and metamodeling, describes different metamodeling techniques, and describes the turbocharger with a focus on lubrication of the thrust bearing. The second part contains a calculation model of flow through the lubrication gap, the use of the techniques for building a metamodel, and an evaluation of the individual techniques. Specifically, the methods used are response surfaces and kriging.
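The response-surface technique named in the abstract can be illustrated with a minimal sketch: a quadratic polynomial is fitted by least squares to samples of an expensive solver. The solver below is a cheap analytic stand-in (the real inputs would be gap-shape parameters and the output a bearing performance quantity; none of the numbers come from the thesis).

```python
import numpy as np

# Hypothetical stand-in for the expensive lubrication-gap flow solver.
def expensive_solver(x):
    return 1.0 + 2.0 * x[:, 0] - 0.5 * x[:, 1] + 0.3 * x[:, 0] * x[:, 1]

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))      # sampled design parameters
y = expensive_solver(X)                   # solver evaluations

# Quadratic response surface: y ≈ c0 + linear + cross + square terms
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# The fitted surface now replaces the solver for cheap predictions.
X_new = np.array([[0.2, -0.3]])
pred = features(X_new) @ coef
print(float(pred[0]))   # ≈ 1.532, matching the stand-in solver exactly
```

A kriging metamodel would replace the fixed polynomial basis with a spatial-correlation model, at the cost of fitting hyperparameters.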
43

Product-process integrated meta-modeling using a graph-based approach: Application to additive manufacturing

Mokhtarian, Hossein, 27 March 2019
Additive manufacturing (AM) has created a paradigm shift in the product design and manufacturing sector due to its unique capabilities. However, the integration of AM technologies into mainstream production faces the challenge of ensuring reliable production and repeatable part quality. Toward this end, modeling and simulation play a significant role in enhancing the understanding of the complex multi-physics nature of AM processes. In addition, a central issue in modeling AM technologies is the integration of different models and the concurrent consideration of the AM process and the part to be manufactured. Hence, the ultimate goal of this research is to present and apply a modeling approach to develop integrated modeling in additive manufacturing. Accordingly, the thesis oversees the product development process and presents the Dimensional Analysis Conceptual Modeling (DACM) Framework to model the product and manufacturing processes at the design stages of the product development process. The Framework aims at providing simulation capabilities and a systematic search for weaknesses and contradictions in the models for the early evaluation of solution variants. The developed methodology is applied in multiple case studies to present models integrating AM processes and the parts to be manufactured. The results show that the proposed framework is not only able to model the product and the manufacturing process but also provides the capability to model them concurrently, and to integrate existing theoretical and experimental models. The DACM Framework contributes to design for additive manufacturing and helps the designer anticipate limitations of the AM process and part design earlier in the design stage. In particular, it enables the designer to make informed decisions on potential design alterations and AM machine redesign, and on optimized part design or process parameter settings. The DACM Framework shows potential to be used as a metamodeling approach for additive manufacturing.
44

Uncertainty quantification in the simulation of road traffic and associated atmospheric emissions in a metropolitan area

Chen, Ruiwei, 25 May 2018
This work focuses on uncertainty quantification in the modeling of road traffic emissions in a metropolitan area. The first step is to estimate the time-dependent traffic flow at street resolution for a full agglomeration area, using a dynamic traffic assignment (DTA) model; the chain is applied to the agglomeration of Clermont-Ferrand (France). A metamodel of the DTA model is then built in order to reduce the computational cost of the DTA simulation. The road traffic emissions of atmospheric pollutants are then estimated at street resolution, based on a modeling chain that couples the DTA metamodel with an emission factor model. This modeling chain is used to conduct a global sensitivity analysis to identify the inputs most influential on the computed traffic flows, speeds, and emissions. Finally, uncertainty quantification is carried out through ensemble simulations built with a Monte Carlo approach, one ensemble for the DTA and one for the emissions. The ensemble is evaluated against observed traffic flows in order to check and improve its reliability and to better sample the uncertainties.
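The Monte Carlo ensemble step can be sketched as follows. The emission model and every distribution below are illustrative stand-ins for the DTA/emission-factor chain (the real chain is street-resolved and time-dependent); the idea is only to show uncertain inputs being sampled and propagated to an output spread.

```python
import numpy as np

# Toy emission model standing in for the DTA + emission-factor chain:
# emissions grow with traffic flow and fall with average speed.
def emissions(flow, speed, ef0):
    return flow * ef0 / (1.0 + 0.05 * speed)

rng = np.random.default_rng(42)
n = 10_000

# Uncertain inputs (assumed distributions, not from the thesis):
flow  = rng.normal(1000.0, 100.0, n)          # traffic flow, veh/h
speed = rng.uniform(20.0, 50.0, n)            # mean speed, km/h
ef0   = rng.lognormal(np.log(0.2), 0.1, n)    # emission factor, g/veh

ens = emissions(flow, speed, ef0)             # Monte Carlo ensemble
lo, med, hi = np.percentile(ens, [5, 50, 95])
print(round(med, 1), round(hi - lo, 1))       # median and 90% spread
```

Comparing such an ensemble spread with observations is what allows the ensemble's reliability to be checked and its input distributions recalibrated.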
45

Model-driven development of domain-specific execution engines

Sousa, Gustavo Cipriano Mota, 09 October 2012
The combination of domain-specific modeling languages and model-driven engineering techniques holds the promise of a breakthrough in the way applications are developed. By raising the level of abstraction and specializing in building blocks that are familiar in a particular domain, it has the potential to turn domain experts into application developers. Applications are developed as models, which in turn are interpreted at runtime by a specialized execution engine in order to produce the intended behavior. In this approach, models are processed by domain-specific execution engines that embed knowledge about how to execute them. The approach has been successfully applied in different domains, such as communication and smart grid management, to execute applications described by models that can be created and changed at runtime. However, each time the approach has to be realized in a different domain, substantial re-implementation has to take place in order to put together an execution engine for the respective DSML. In this work, we present a generalization of the approach in the form of a metamodel that captures the domain-independent aspects of runtime model interpretation and allows the definition of a particular class of domain-specific execution engines, which provide a high-level service upon an underlying heterogeneous set of resources. / Funded by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) and Fundação de Amparo à Pesquisa do Estado de Goiás (FAPEG)
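A runtime model interpreter of the kind the abstract calls an execution engine can be sketched very compactly: the model is data, the engine walks it to produce behavior, and changing the model changes the behavior without recompiling anything. The operations and resource names below are illustrative, not from the dissertation.

```python
# A toy execution engine: it interprets a list-of-steps "model" at runtime.
model = [
    {"op": "read",  "resource": "sensor_a"},
    {"op": "scale", "factor": 2.0},
    {"op": "emit",  "target": "sink"},
]

resources = {"sensor_a": 21.0}   # the heterogeneous underlying resources
emitted = []

def execute(model):
    value = None
    for step in model:           # the engine interprets each model element
        if step["op"] == "read":
            value = resources[step["resource"]]
        elif step["op"] == "scale":
            value *= step["factor"]
        elif step["op"] == "emit":
            emitted.append((step["target"], value))
    return value

result = execute(model)
print(result)                    # prints: 42.0
```

The thesis's metamodel plays the role of the fixed vocabulary (`read`, `scale`, `emit` here): the domain-independent interpreter core stays the same while each domain supplies its own model elements and resources.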
46

Gamma-ray quantification of radionuclides by equivalent numerical modelling

Guillot, Nicolas, 09 March 2015
This thesis belongs to the field of ionizing-radiation metrology, more particularly gamma-ray spectrometry of the actinides contained in waste packages and drums. The work consists in modeling the full-efficiency calibration coefficient of the measured scene, which is essential for quantifying the activity (or the mass of the sought radionuclides) of the measured object. The thesis has two parts. The first part deals with building a numerical detector response, in both space and energy, equivalent to the real response of the HPGe detector, an essential step for recovering the object's activity. The second part deals with determining the calibration coefficient of the measured scene without operator hypotheses. The first contribution is a largely automated methodology for obtaining a numerical response equivalent to the real detector response to a fixed convergence criterion; the response is obtained without an expert, in field conditions, with a convergence criterion below 5%. The second contribution is a feasibility study on quantifying the activity of complex waste packages without operator hypotheses through the use of metamodels. The metamodels quickly generate a set of configurations of the calibration coefficient with respect to the input data; the configurations are then sorted to select the calibration coefficient corresponding to the measured scene.
47

Design Optimization in Gas Turbines using Machine Learning: A study performed for Siemens Energy AB

Berggren, Mathias and Sonesson, Daniel, January 2021
In this thesis, the authors investigate how machine learning can be utilized to speed up the design optimization process of gas turbines. The Finite Element Analysis (FEA) steps of the design process are examined to determine whether they can be replaced with machine learning algorithms. The study is done using a component with given constraints provided by Siemens Energy AB. With this component, two approaches to using machine learning are tested: one utilizes design parameters, i.e. raw floating-point numbers such as the height and width, while the other uses a high-dimensional mesh as input. It is concluded that using design parameters with surrogate models is a viable way of performing design optimization, while mesh input currently is not. Results from using different amounts of data samples are presented and evaluated.
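The parameter-based surrogate idea can be sketched as follows: an FEA evaluation (replaced here by a hypothetical analytic stand-in depending on height and width) is approximated by a model fitted to precomputed samples, then checked on held-out designs. None of the functions or numbers come from the thesis.

```python
import numpy as np

# Hypothetical stand-in for an FEA response (e.g., a peak stress) as a
# function of two design parameters, height h and width w.
def fea_stand_in(h, w):
    return 100.0 / (h * w) + 5.0 * h

rng = np.random.default_rng(1)
h = rng.uniform(1.0, 3.0, 200)
w = rng.uniform(1.0, 3.0, 200)
y = fea_stand_in(h, w)          # "precomputed" FEA samples

# Cubic polynomial surrogate in the two design parameters (10 monomials).
A = np.column_stack([h**i * w**j for i in range(4) for j in range(4 - i)])
train, test = slice(0, 150), slice(150, 200)
coef, *_ = np.linalg.lstsq(A[train], y[train], rcond=None)

# Held-out accuracy: the R^2 score decides whether the surrogate can
# stand in for FEA during optimization.
resid = A[test] @ coef - y[test]
r2 = 1.0 - resid.var() / y[test].var()
print(round(r2, 3))
```

A high held-out R² is what makes the design-parameter route "viable" in the thesis's sense: each surrogate call costs microseconds instead of an FEA run.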
48

From Horns to Helmets: Multi-Objective Design Optimization Considerations to Protect the Brain

Johnson, Kyle Leslie, 12 August 2016
This dissertation presents an investigation and design optimization of energy-absorbent protective systems that protect the brain. Specifically, the energy absorption characteristics of the bighorn sheep skull-horn system were quantified and used to inform a topology optimization performed on a football helmet facemask, leading to reduced values of brain injury indicators. The horn keratin of a bighorn sheep was experimentally characterized in different stress states, strain rates, and moisture contents. Horn keratin demonstrated a clear strain-rate dependence in both tension and compression: as the strain rate increased, the flow stress increased. Increased moisture content decreased the strength and increased the ductility, and the hydrated horn keratin energy absorption increased at high strain rates compared to quasi-static data. The keratin experimental data were then used to inform constitutive models employed in the simulation of bighorn sheep head impacts at 5.5 m/s. Acceleration values as high as 607 G were observed in finite element simulations of rams butting their heads, an order of magnitude higher than predicted brain injury threshold values. In the most extreme case, the maximum tensile pressure and maximum shear strain in the ram brain were 245 kPa and 0.28, respectively. These values could serve as true injury metrics for human head impacts. Finally, a helmeted human head Finite Element (FE) model was created, validated, and used to recreate impacts from a linear impactor. The results from these simulations were used to train a surrogate model, which was in turn utilized in multi-objective design optimization. Brain injury indicators were significantly reduced by performing multi-objective design optimization on a football helmet facemask: the tensile pressure and maximum shear strain in the brain decreased 7.5% and 39.5%, respectively, when comparing the optimal designs to the baseline design. While the maximum tensile pressure and maximum shear strain values in the brain for helmeted head impacts (30.2 kPa and 0.011) were far less than those for the ram impacts (245 kPa and 0.28), helmet impacts up to 12.3 m/s have been recorded and could easily surpass these thresholds.
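The multi-objective step rests on finding non-dominated designs: among surrogate predictions of the two injury indicators (tensile pressure, shear strain), keep every design that no other design beats on both objectives at once. The sketch below uses synthetic objective values, not the dissertation's data.

```python
import numpy as np

rng = np.random.default_rng(5)
# Surrogate predictions for 200 candidate facemask designs:
# column 0 ~ tensile pressure, column 1 ~ shear strain (both minimized).
objs = rng.uniform(0.0, 1.0, size=(200, 2))

def pareto_front(F):
    """Indices of non-dominated rows of F (all objectives minimized)."""
    keep = np.ones(len(F), dtype=bool)
    for i, f in enumerate(F):
        if keep[i]:
            # a row is dominated by f if it is >= f everywhere
            # and strictly worse somewhere
            dominated = np.all(F >= f, axis=1) & np.any(F > f, axis=1)
            keep[dominated] = False
    return np.where(keep)[0]

idx = pareto_front(objs)
print(len(idx), "Pareto-optimal designs out of", len(objs))
```

The decision maker then picks a point from this front, trading one indicator against the other, rather than a single "best" design.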
49

Response Surface Analysis of Trapped-Vortex Augmented Airfoils

Zope, Anup Devidas, 11 December 2015
In this study, the effect of a passive trapped-vortex cell on the lift-to-drag (L/D) ratio of an FFA-W3-301 airfoil is studied. The upper surface of the airfoil was modified to incorporate a cavity defined by seven parameters. The L/D ratio of the airfoil is modeled using a radial basis function metamodel, which is used to find the design parameter values that give the highest L/D. The numerical results indicate that the L/D ratio is most sensitive to the position on the airfoil's upper surface at which the cavity starts, the position of the cavity end point, and the vertical distance of the cavity end point relative to the airfoil surface. The L/D ratio can be improved by locating the cavity start point at the point of separation for a particular angle of attack. The optimal cavity shape (o19_aXX) is also tested for a NACA0024 airfoil.
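A minimal Gaussian radial basis function metamodel can be written directly with linear algebra: kernel weights are solved from the sampled responses, and the fitted surface then predicts at unsampled designs. The sample function is a hypothetical stand-in for CFD-computed L/D over two of the seven cavity parameters, and the shape parameter is an assumed value (it would be tuned in practice).

```python
import numpy as np

# Hypothetical stand-in for CFD-computed L/D over two cavity parameters.
def ld_stand_in(X):
    return 20.0 + 5.0 * np.sin(3.0 * X[:, 0]) * np.cos(2.0 * X[:, 1])

rng = np.random.default_rng(7)
X = rng.uniform(0.0, 1.0, size=(40, 2))   # sampled designs
y = ld_stand_in(X)                         # "CFD" responses

eps = 5.0  # Gaussian shape parameter (assumed, normally tuned)

def phi(A, B):
    # Gaussian RBF kernel matrix between point sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-eps * d2)

# Solve for kernel weights (tiny ridge term for numerical stability).
w = np.linalg.solve(phi(X, X) + 1e-8 * np.eye(len(X)), y)

# Predict L/D at an unsampled design.
Xq = np.array([[0.5, 0.5]])
val = float(phi(Xq, X) @ w)
print(val)
```

An optimizer would then search this cheap surface for the parameter combination maximizing predicted L/D, reserving CFD runs for verifying candidates.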
50

Uncertainty Quantification in Dynamic Problems With Large Uncertainties

Mulani, Sameer B., 13 September 2006
This dissertation investigates uncertainty quantification in dynamic problems. The Advanced Mean Value (AMV) method is used to calculate the probabilistic sound power, and its sensitivity, of elastically supported panels with small uncertainty (coefficient of variation). Sound power calculations are done using the Finite Element Method (FEM) and the Boundary Element Method (BEM), and the sensitivities of the sound power are calculated through direct differentiation of the FEM/BEM/AMV equations. The results are compared with Monte Carlo simulation (MCS). An improved method is developed using AMV, a metamodel, and MCS; this new technique is applied to calculate the sound power of a composite panel using FEM and the Rayleigh integral. The proposed methodology shows considerable improvement in both accuracy and computational efficiency. In systems with large uncertainties, the above approach does not work. Two Spectral Stochastic Finite Element Method (SSFEM) algorithms are therefore developed to solve stochastic eigenvalue problems using polynomial chaos. Presently, the approaches are restricted to problems with real and distinct eigenvalues. In both approaches, the system uncertainties are modeled by Wiener-Askey orthogonal polynomial functions, and Galerkin projection is applied in the probability space to minimize the weighted residual of the error of the governing equation. The first algorithm is based on the inverse iteration method; a modification is suggested to calculate higher eigenvalues and eigenvectors, and the algorithm is applied to both discrete and continuous systems. In continuous systems, the uncertainties are modeled as Gaussian processes using the Karhunen-Loève (KL) expansion. The second algorithm is based on the implicit polynomial iteration method and is found to be more efficient when applied to discrete systems; however, its application to continuous systems results in ill-conditioned system matrices, which seriously limits it. Lastly, an algorithm is developed to find the basis random variables of the KL expansion for non-Gaussian processes. The basis random variables are obtained via nonlinear transformation of the marginal cumulative distribution function using the standard deviation. Results are obtained for three known skewed distributions: log-normal, beta, and exponential. In all cases, the proposed algorithm matches the known solutions very well and can be applied to non-Gaussian processes using SSFEM.
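The discrete Karhunen-Loève expansion used above can be sketched with an eigendecomposition of a covariance matrix: the leading modes, weighted by a few basis random variables, represent the random process. The exponential covariance and correlation length below are assumed for illustration (the thesis extends this machinery to non-Gaussian marginals).

```python
import numpy as np

n = 100
t = np.linspace(0.0, 1.0, n)
ell = 0.3                                    # correlation length (assumed)

# Exponential covariance matrix of a unit-variance process on [0, 1].
C = np.exp(-np.abs(t[:, None] - t[None, :]) / ell)

lam, vecs = np.linalg.eigh(C)
lam, vecs = lam[::-1], vecs[:, ::-1]         # sort modes by decreasing energy

# Truncate: keep the modes capturing 95% of the total variance.
k = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95)) + 1

# One realization of the process from k basis random variables.
rng = np.random.default_rng(3)
xi = rng.standard_normal(k)                  # KL basis random variables
sample = vecs[:, :k] @ (np.sqrt(lam[:k]) * xi)
print(k, sample.shape)
```

The payoff is dimension reduction: a field discretized at 100 points is driven by only `k` random variables, which is what keeps the polynomial chaos expansions above tractable.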
