1

An analysis of the feasibility of predictive process control of welding applications using infrared pyrometers and thermal metamodels

Ely, George Ray 27 October 2010 (has links)
Predictive process control (PPC) is the use of predictive, physical models as the basis for process control [1]. In contrast, conventional control algorithms utilize statistical models that are derived from repetitive process trials. PPC employs in-process monitoring and control of manufacturing processes. PPC algorithms are very promising approaches for welding of small lots or customized products with rapid changes in materials, geometry, or processing conditions. They may also be valuable for welding high-value products for which repeated trials and waste are not acceptable. In this research, small-lot braze-welding of UNS C22000 commercial bronze with gas metal arc welding (GMAW) technology is selected as a representative application of PPC. Thermal models of the welding process are constructed to predict the effects of changes in process parameters on the response of temperature measurements. Because accurate thermal models are too computationally expensive for direct use in a control algorithm, metamodels are constructed to drastically reduce computational expense while retaining a high degree of accuracy. Then, the feasibility of PPC of welding applications is analyzed with regard to uncertainties and time delays in an existing welding station and thermal metamodels of the welding process. Lastly, a qualitative residual stress model is developed to nondestructively assess weld quality in end-user parts.
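The control idea in this abstract, replacing the expensive thermal model with a cheap surrogate inside the control loop, can be sketched as follows. The linear surrogate, the candidate heat inputs, and all coefficients are invented for illustration; they are not taken from the dissertation.

```python
def thermal_metamodel(heat_input_w):
    """Surrogate: predicted pyrometer temperature (C) for a heat input (W)."""
    return 25.0 + 0.9 * heat_input_w  # assumed linear fit, for illustration only

def select_heat_input(target_temp_c, candidates):
    """Pick the candidate input whose predicted temperature is closest to target."""
    return min(candidates, key=lambda q: abs(thermal_metamodel(q) - target_temp_c))

best = select_heat_input(925.0, [900.0, 950.0, 1000.0, 1050.0])  # -> 1000.0
```

Because the surrogate evaluates in microseconds rather than the hours a full thermal simulation can take, this selection step can run inside a real-time control loop.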
2

Taylor Kriging metamodeling for simulation interpolation, sensitivity analysis and optimization

Liu, Heping. Maghsoodloo, Saeed, January 2009 (has links)
Dissertation (Ph.D.)--Auburn University. Abstract. Vita. Includes bibliographic references (p. 159-171).
3

Modeling and simulation of flows over and through fibrous porous media

Luminari, Nicola 19 March 2018 (has links)
Any natural surface is in essence non-smooth, consisting of more or less regular roughness and/or mobile structures of different scales. From a fluid mechanics point of view, these natural surfaces offer better aerodynamic performance when they cover moving bodies, in terms of drag reduction, lift enhancement or control of boundary layer separation; this has been shown for boundary layer or wake flows around bluff bodies. The numerical simulation of microscopic flows around "natural" surfaces is still out of reach today. Therefore, the goal of this thesis is to study the modeling of the apparent flow slip occurring on this kind of surface, modeled as a porous medium, applying Whitaker's volume averaging theory. This mathematical model makes it possible to capture details of the microstructure while preserving a satisfactory description of the physical phenomena that occur. The first chapter of this manuscript provides an overview of previous efforts to model these surfaces, detailing the most important results from the literature. The second chapter presents the mathematical derivation of the volume-averaged Navier-Stokes (VANS) equations in a porous medium. In the third chapter the flow stability at the interface between a free fluid and a porous medium, formed by a series of rigid cylinders, is studied. The presence of this porous layer is treated by including a drag term in the fluid equations. It is shown that the presence of this term reduces the amplification rates of the Kelvin-Helmholtz instability over the whole range of wavenumbers, thus leading to an increase of the wavelength of the most amplified mode. In this same context, the difference between the isotropic model and a tensorial approach for the drag term has been evaluated, to determine the most consistent approach for studying these flow instabilities. This has led to the conclusion that the model using the apparent permeability tensor is the most relevant one.
In the following chapter, building on this last result, the apparent permeability tensor of a three-dimensional porous medium consisting of rigid cylinders has been identified from over one hundred direct numerical simulations carried out on microscopic unit cells. In these configurations the tensor varies according to four parameters: the Reynolds number, the porosity, and the direction of the average pressure gradient, defined by two Euler angles. This parameterization makes it possible to capture local three-dimensional effects. This database has been used to create, with a kriging-type approach, a behavioral metamodel for estimating all the components of the apparent permeability tensor. In the fifth chapter, simulations of the VANS equations are carried out on a macroscopic scale after the implementation of the metamodel, to achieve reasonable computing times. The validation of the macroscopic approach is performed on a closed cavity flow covered with a porous layer; a comparison with the results of a very accurate DNS, homogenized a posteriori, has shown very good agreement and has demonstrated the relevance of the approach. The next step has been the study of the passive control of flow separation past a hump placed on a porous wall, by the same macroscopic VANS approach. Finally, general conclusions and possible directions of research in the field are presented in the last chapter.
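The tensorial drag term discussed in this abstract, a momentum sink of the form -mu K^{-1} u added to the volume-averaged fluid equations, can be sketched numerically. The viscosity, permeability values, and velocity below are illustrative placeholders, not results from the thesis.

```python
import numpy as np

mu = 1.8e-5                         # dynamic viscosity (Pa*s), air-like, assumed
K = np.diag([1e-8, 2e-8, 2e-8])     # apparent permeability tensor (m^2), illustrative
u = np.array([0.1, 0.0, 0.05])      # volume-averaged velocity (m/s)

drag = -mu * np.linalg.solve(K, u)  # tensorial Darcy-type drag per unit volume
```

With an anisotropic K, the drag is not aligned with the velocity, which is precisely the effect the isotropic model cannot capture.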
4

Representation of Multi-Level Domains on The Web

SILVA, F. B. 28 September 2016 (has links)
Conceptual modeling and knowledge representation strategies often deal with entities at two levels: a level of classes and a level of individuals that instantiate those classes. In several domains, however, the classes themselves may be subject to categorization, resulting in classes of classes (or metaclasses). When representing such domains, one must capture not only the entities at different classification levels but also their (possibly complex) relations. In the domain of biological taxonomies, for example, a given organism (e.g., the lion Cecil, killed in 2015 in Hwange National Park in Zimbabwe) is classified into several taxa (e.g., Animal, Mammal, Carnivore, Lion), and each of these taxa is classified by a taxonomic rank (e.g., Kingdom, Class, Order, Species). Thus, to represent knowledge in this domain, it is necessary to represent entities at different classification levels: Cecil is an instance of Lion, which is an instance of Species; Species, in turn, is an instance of Taxonomic Rank. Further, one would like to state, for example, that instances of the genus Panthera must also be instances of exactly one instance of Species (e.g., Lion). The need to support the representation of domains involving multiple classification levels gave rise to a research area called multi-level modeling. 
Representing models with multiple levels is a challenge in current Semantic Web languages, as there is little support to guide the modeler in correctly producing multi-level ontologies, especially because of the nuances of the constraints that apply to entities at different classification levels and to their relations. To address these representation challenges, we define a vocabulary that can be used as a basis for defining multi-level ontologies in OWL, together with integrity constraints and derivation rules. A tool is provided that takes a domain model as input, checks its conformance with the proposed integrity constraints, and produces as output a model enriched with derived information. This process employs an axiomatic theory called MLT (a Multi-Level Modeling Theory). Content from the Wikidata platform was used to demonstrate that the vocabulary can prevent inconsistencies in multi-level representation in a real-world scenario.
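The instantiation chain in this abstract (Cecil is an instance of Lion, Lion of Species, Species of Taxonomic Rank) and the "exactly one species-level classifier" rule can be sketched with plain dictionaries standing in for the OWL vocabulary; the helper name is hypothetical, not part of the MLT tooling.

```python
# "x is an instance of y" relation, following the abstract's example chain.
instance_of = {
    "Cecil": {"Lion"},
    "Lion": {"Species"},
    "Species": {"TaxonomicRank"},
}

def classifiers_at_rank(entity, rank):
    """Types of `entity` that are themselves instances of `rank` (hypothetical helper)."""
    return {t for t in instance_of.get(entity, set())
            if rank in instance_of.get(t, set())}

# MLT-style constraint: an organism instantiates exactly one Species-level taxon.
species_of_cecil = classifiers_at_rank("Cecil", "Species")
assert len(species_of_cecil) == 1
```

The point of the sketch is that the constraint crosses classification levels: checking it requires traversing instantiation twice, which is exactly what flat class/individual languages make awkward.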
5

Metamodeling-Based Fast Optimization of Nanoscale AMS-SoCs

Garitselov, Oleg 05 1900 (has links)
Modern consumer electronic systems are mostly based on analog and digital circuits and are designed as analog/mixed-signal systems on chip (AMS-SoCs). The integration of analog and digital circuits on the same die makes the system cost effective. In AMS-SoCs, analog and mixed-signal portions have not traditionally received much attention due to their complexity. As fabrication technology advances, AMS-SoC circuits become more complex and their simulation takes significant amounts of time. The limited time allocated for circuit design and optimization creates a need to reduce simulation time. The time constraints placed on designers are imposed by the ever-shortening time to market and the non-recurring cost of the chip. This dissertation proposes the use of a novel method, called metamodeling, and intelligent optimization algorithms to reduce the design time. Metamodel-based ultra-fast design flows are proposed and investigated. Metamodel creation is a one-time process and relies on fast sampling through accurate parasitic-aware simulations. One of the targets of this dissertation is to minimize the sample size while retaining the accuracy of the model. In order to achieve this goal, different statistical sampling techniques are explored and applied to various AMS-SoC circuits. Also, different metamodel functions are explored for their accuracy and applicability to AMS-SoCs. Several optimization algorithms are compared for global optimization accuracy and convergence. Three AMS circuits present in many AMS-SoCs, a ring oscillator, an inductor-capacitor voltage-controlled oscillator (LC-VCO), and a phase-locked loop (PLL), are used in this study for design flow application. Metamodels created in this dissertation provide accuracy with an error of less than 2% from the physical-layout simulations. After the optimal sampling investigation, metamodel functions and optimization algorithms are ranked in terms of speed and accuracy. 
Experimental results show that the proposed design flow provides roughly 5,000x speedup over conventional design flows. Thus, this dissertation greatly advances the state-of-the-art in mixed-signal design and will assist towards making consumer electronics cheaper and more affordable.
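The metamodel-based flow the abstract describes (sample the expensive simulator sparsely, fit a cheap metamodel once, then optimize the metamodel instead of the simulator) can be sketched as follows. The one-variable toy "simulator" and the quadratic polynomial stand in for the parasitic-aware simulations and metamodel functions of the dissertation.

```python
import numpy as np

def simulator(x):
    """Stand-in for an expensive parasitic-aware circuit simulation."""
    return (x - 0.3) ** 2 + 1.0

samples = np.linspace(0.0, 1.0, 7)             # small statistical sampling plan
responses = np.array([simulator(s) for s in samples])
coeffs = np.polyfit(samples, responses, 2)     # quadratic metamodel, fit once

grid = np.linspace(0.0, 1.0, 1001)             # optimizing the metamodel is cheap
best_x = grid[np.argmin(np.polyval(coeffs, grid))]
```

The simulator is called only 7 times; the 1001 optimization evaluations all hit the metamodel, which is where the claimed speedup comes from.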
6

A Design Space Exploration Process for Large Scale, Multi-Objective Computer Simulations

Zentner, John Marc 07 July 2006 (has links)
The primary contributions of this thesis are associated with the development of a new method for exploring the relationships between inputs and outputs for large scale computer simulations. Primarily, the proposed design space exploration procedure uses a hierarchical partitioning method to help mitigate the curse of dimensionality often associated with the analysis of large scale systems. Closely coupled with the use of a partitioning approach is the problem of how to partition the system. This thesis also introduces and discusses a quantitative method developed to aid the user in finding a set of good partitions for creating partitioned metamodels of large scale systems. The new hierarchically partitioned metamodeling scheme, the lumped parameter model (LPM), was developed to address two primary limitations of current partitioning methods for large scale metamodeling. First, the LPM was formulated to negate the need to rely on variable redundancies between partitions to account for potentially important interactions. By using a hierarchical structure, the LPM addresses the impact of neglected, direct interactions by indirectly accounting for these interactions via the interactions that occur between the lumped parameters in intermediate- to top-level mappings. Second, the LPM was developed to allow for hierarchical modeling of black-box analyses that do not have available intermediaries around which to partition the system. The second contribution of this thesis is a graph-based partitioning method for large scale, black-box systems. The graph-based partitioning method combines the graph and sparse matrix decomposition methods used by the electrical engineering community with the results of a screening test to create a quantitative method for partitioning large scale, black-box systems. An ANOVA analysis of the results of a screening test can be used to determine the sparse nature of the large scale system. 
With this information known, the sparse matrix and graph theoretic partitioning schemes can then be used to create potential sets of partitions to use with the lumped parameter model.
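The graph-based partitioning idea described above, connect variable pairs whose screening-test interaction is significant and read candidate partitions off the graph, might be sketched as below. The interaction table and threshold are invented for illustration; a real application would use ANOVA effect estimates from the screening test.

```python
# ANOVA-style interaction strengths from a screening test (invented numbers).
interaction = {("x1", "x2"): 0.9, ("x2", "x3"): 0.7, ("x3", "x4"): 0.01,
               ("x4", "x5"): 0.8, ("x1", "x5"): 0.02}
threshold = 0.1  # effects below this are treated as sparse (no edge)

adj = {v: set() for v in "x1 x2 x3 x4 x5".split()}
for (a, b), w in interaction.items():
    if w > threshold:
        adj[a].add(b)
        adj[b].add(a)

def components(adj):
    """Connected components of the interaction graph = candidate partitions."""
    seen, parts = set(), []
    for v in sorted(adj):
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u not in comp:
                comp.add(u)
                stack.extend(adj[u] - comp)
        seen |= comp
        parts.append(comp)
    return parts

partitions = components(adj)  # -> [{"x1", "x2", "x3"}, {"x4", "x5"}]
```

Weak interactions (x3-x4, x1-x5) are dropped, so the five-variable system splits into two partitions that can be metamodeled separately.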
7

Driving efficiency in design for rare events using metamodeling and optimization

Morrison, Paul 08 April 2016 (has links)
Rare events have very low probability of occurrence but can have significant impact. Earthquakes, volcanoes, and stock market crashes can have devastating impact on those affected. In industry, engineers evaluate rare events to design better high-reliability systems. The objective of this work is to increase efficiency in design optimization for rare events using metamodeling and variance reduction techniques. Opportunity exists to increase deterministic optimization efficiency by leveraging Design of Experiments to build an accurate metamodel of the system which is less resource intensive to evaluate than the real system. For computationally expensive models, running many trials will impede fast design iteration. Accurate metamodels can be used in place of these expensive models to probabilistically optimize the system for efficient quantification of rare event risk. Monte Carlo is traditionally used for this risk quantification but variance reduction techniques such as importance sampling allow accurate quantification with fewer model evaluations. Metamodel techniques are the thread that tie together deterministic optimization using Design of Experiments and probabilistic optimization using Monte Carlo and variance reduction. This work will explore metamodeling theory and implementation, and outline a framework for efficient deterministic and probabilistic system optimization. The overall conclusion is that deterministic and probabilistic simulation can be combined through metamodeling and used to drive efficiency in design optimization. Applications are demonstrated on a gas turbine combustion autoignition application where user controllable independent variables are optimized in mean and variance to maximize system performance while observing a constraint on allowable probability of a rare autoignition event.
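The variance-reduction step the abstract mentions can be illustrated with a minimal importance-sampling estimator for a rare tail probability, P(X > 4) for standard normal X, using a mean-shifted proposal. This is a generic textbook sketch, not the combustion application's actual model: plain Monte Carlo would need on the order of 1/p samples to see even one event, while the shifted sampler sees them constantly and reweights.

```python
import math
import random

random.seed(0)

def likelihood_ratio(x, shift):
    """Density ratio N(0,1)/N(shift,1) at x, used to reweight shifted draws."""
    return math.exp(-shift * x + shift * shift / 2.0)

shift, level, n = 4.0, 4.0, 100_000        # proposal centered on the rare region
draws = [random.gauss(shift, 1.0) for _ in range(n)]
p_hat = sum(likelihood_ratio(x, shift) for x in draws if x > level) / n
# p_hat lands close to the true tail probability, about 3.2e-5
```

Centering the proposal at the threshold is near-optimal here; a poorly chosen shift can make the estimator's variance worse than plain Monte Carlo.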
8

Metamodeling for ultra-fast parameter estimation: Theory and evaluation of use in real-time diagnosis of diffuse liver disease

Gollvik, Martin January 2014 (has links)
Diffuse liver disease is a growing problem and a major cause of death worldwide. In the final stages the treatment often involves liver resection or transplant, and in deciding what course of action is to be taken it is crucial to have a correct assessment of the function of the liver. The current “gold standard” for this assessment is to take a liver biopsy, which has a number of disadvantages. As an alternative, a method involving magnetic resonance imaging and mechanistic modeling of the liver has been developed at Linköping University. One of the obstacles this method must overcome in order to reach clinical implementation is the speed of the parameter estimation. In this project the methodology of metamodeling is tested as a possible solution to this speed problem. Metamodeling involves making models of models using extensive model simulations and mathematical tools. With the use of regression methods, clustering algorithms, and optimization, different methods for parameter estimation have been evaluated. The results show that several, but not all, of the parameters could be accurately estimated using metamodeling and that metamodeling could be a highly useful tool when modeling biological systems. With further development, metamodeling could bring this non-invasive method for estimation of liver function a major step closer to application in the clinic.
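A minimal sketch of metamodeling for fast parameter estimation: simulate a toy forward model offline over a parameter sweep, then fit a regression from an observable feature back to the parameter, so the online estimate is a single cheap evaluation instead of an iterative optimization. The exponential "uptake" model and the feature choice are invented for illustration, not the Linköping liver model.

```python
import numpy as np

def forward_model(k, t=np.linspace(0.0, 10.0, 20)):
    """Toy tracer-uptake curve; stands in for the mechanistic liver model."""
    return 1.0 - np.exp(-k * t)

ks = np.linspace(0.1, 2.0, 200)                  # offline simulation sweep
features = np.array([forward_model(k).mean() for k in ks])
inv_coeffs = np.polyfit(features, ks, 5)         # regression: feature -> parameter

observed = forward_model(0.7)                    # pretend this curve was measured
k_hat = np.polyval(inv_coeffs, observed.mean())  # near-instant estimate
```

All the expensive simulation happens once, offline; the clinical-time cost is one polynomial evaluation, which is what makes real-time diagnosis plausible.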
9

Sequential design strategies for mean response surface metamodeling via stochastic kriging with adaptive exploration and exploitation

Chen, Xi, Zhou, Qiang 10 1900 (has links)
Stochastic kriging (SK) methodology has been known as an effective metamodeling tool for approximating a mean response surface implied by a stochastic simulation. In this paper we provide some theoretical results on the predictive performance of SK, in light of which novel integrated mean squared error-based sequential design strategies are proposed to apply SK for mean response surface metamodeling with a fixed simulation budget. Through numerical examples of different features, we show that SK with the proposed strategies applied holds great promise for achieving high predictive accuracy by striking a good balance between exploration and exploitation. Published by Elsevier B.V.
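The step that distinguishes stochastic kriging from deterministic kriging, adding the simulation's intrinsic noise variance to the diagonal of the spatial covariance before solving for the predictor weights, can be sketched as follows; the kernel, length scale, noise level, and test function are all illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential spatial covariance (illustrative kernel choice)."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * ls ** 2))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)                    # design points
noise_var = 0.01                                # intrinsic (simulation) variance
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, noise_var ** 0.5, x.size)

# SK adds the intrinsic variance to the diagonal; deterministic kriging omits it.
K = rbf(x, x) + noise_var * np.eye(x.size)
alpha = np.linalg.solve(K, y)

pred = rbf(np.array([0.25]), x) @ alpha         # SK mean prediction at x = 0.25
```

The nugget term keeps the predictor from interpolating the noisy observations exactly, which is what lets it approximate the underlying mean surface rather than one noisy realization.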
10

Non-Deterministic Metamodeling for Multidisciplinary Design Optimization of Aircraft Systems Under Uncertainty

Clark, Daniel L., Jr. 18 December 2019 (has links)
No description available.
