1

Accelerated optimisation methods for low-carbon building design

Tresidder, Esmond January 2014 (has links)
This thesis presents an analysis of the performance of optimisation using Kriging surrogate models on low-carbon building design problems. Their performance is compared with established genetic algorithms operating without a surrogate on a range of different types of building-design problems. The advantages and disadvantages of a Kriging approach, and their particular relevance to low-carbon building design optimisation, are tested and discussed. Scenarios in which Kriging methods are most likely to be of use, and scenarios where, conversely, they may be disadvantageous compared to other methods for reducing the computational cost of optimisation, such as parallel computing, are highlighted. Kriging is shown to be able, in some cases, to find designs of comparable performance in fewer main-model evaluations than a stand-alone genetic algorithm method. However, this improvement is not robust, and in several cases Kriging required many more main-model evaluations to find comparable designs, especially in the case of design problems with discrete variables, which are common in low-carbon building design. Furthermore, limitations regarding the extent to which Kriging optimisations can be accelerated using parallel computing resources mean that, even in the scenarios in which Kriging showed the greatest advantage, a stand-alone genetic algorithm implemented in parallel would be likely to find comparable designs more quickly. In light of this it is recommended that, for most low-carbon building design problems, a stand-alone genetic algorithm is the most suitable optimisation method. Two novel methods are developed to improve the performance of optimisation algorithms on low-carbon building design problems. The first takes advantage of variables whose impact can be quickly calculated without re-running an expensive dynamic simulation, in order to dramatically increase the number of designs that can be explored within a given computing budget. 
The second takes advantage of objectives that can be calculated without a dynamic simulation in order to filter out designs that do not meet constraints in those objectives, and to focus the use of computationally expensive dynamic simulations on feasible designs. Both of these methods show significant improvement over standard methods in terms of the quality of designs found within a given dynamic-simulation budget.
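The trade-off this abstract describes, a cheap surrogate replacing most expensive model evaluations, can be sketched in a few lines. The snippet below is a simplified stand-in, not the thesis's method: a Gaussian radial-basis interpolant (with a small nugget for numerical stability) takes the place of a full Kriging model, with none of the predicted-variance or expected-improvement machinery, and a toy analytic objective stands in for a dynamic building simulation.

```python
import math

def expensive_model(x):
    # toy stand-in for a dynamic building simulation (milliseconds here, hours in reality)
    return (x - 0.3) ** 2 + 0.05 * math.sin(12 * x)

def solve(A, b):
    # naive Gaussian elimination with partial pivoting for the small RBF system
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def fit_surrogate(xs, ys, eps=4.0, nugget=1e-8):
    # Gaussian RBF interpolant; the nugget keeps the system well conditioned
    n = len(xs)
    A = [[math.exp(-((eps * (xs[i] - xs[j])) ** 2)) + (nugget if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    w = solve(A, ys)
    return lambda x: sum(wi * math.exp(-((eps * (x - xi)) ** 2)) for wi, xi in zip(w, xs))

xs = [0.0, 0.25, 0.5, 0.75, 1.0]           # initial design of experiments
ys = [expensive_model(x) for x in xs]
for _ in range(3):                          # three infill iterations
    s = fit_surrogate(xs, ys)
    grid = [i / 200 for i in range(201)]
    cand = min((x for x in grid if x not in xs), key=s)  # cheap search on the surrogate
    xs.append(cand)
    ys.append(expensive_model(cand))        # one expensive call per iteration

print(len(ys), round(min(ys), 4))           # 8 expensive evaluations in total
```

Each infill iteration spends exactly one expensive evaluation where the surrogate currently predicts the minimum; the count of expensive calls, not the negligible time spent searching the surrogate, is the budget that matters in the comparison with a stand-alone genetic algorithm.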
2

Use of Response Surface Metamodels in Damage Identification of Dynamic Structures

Cundy, Amanda L. 08 January 2003 (has links)
The need for low order models capable of performing damage identification has become apparent in many structural dynamics applications where structural health monitoring and damage prognosis programs are implemented. These programs require that damage identification routines have low computational requirements and be reliable with some quantifiable degree of accuracy. Response surface metamodels (RSMs) are proposed to fill this need. Popular in the fields of chemical and industrial engineering, RSMs have only recently been applied in the field of structural dynamics and to date there have been no studies which fully demonstrate the potential of these methods. In this thesis, several RSMs are developed in order to demonstrate the potential of the methodology. They are shown to be robust to noise (experimental variability) and have success in solving the damage identification problem, both locating and quantifying damage with some degree of accuracy, for both linear and nonlinear systems. A very important characteristic of the RSMs developed in this thesis is that they require very little information about the system in order to generate relationships between damage indicators and measurable system responses for both linear and nonlinear structures. As such, the potential of these methods for damage identification has been demonstrated and it is recommended that these methods be developed further. / Master of Science
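The damage-identification loop the abstract describes can be illustrated with a one-parameter toy problem: fit a response surface from a few runs of a "high-fidelity" model, then invert a measured response to locate the damage level. Everything here (the spring-mass frequency model, the damage fraction, the quadratic surface) is an illustrative assumption, not the structures studied in the thesis.

```python
import math

def modal_frequency(d, k0=1000.0, m=1.0):
    # toy high-fidelity model: first natural frequency of a spring-mass
    # system whose stiffness is reduced by a damage fraction d
    return math.sqrt(k0 * (1.0 - d) / m) / (2.0 * math.pi)

def fit_quadratic(xs, ys):
    # least-squares fit y ~ a + b*x + c*x^2 via 3x3 normal equations (Cramer's rule)
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[s(0), s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sy(0), sy(1), sy(2)]
    det = lambda M: (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    out = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = b[i]
        out.append(det(M) / D)
    return out

# a handful of training runs of the expensive model
train_d = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
train_f = [modal_frequency(d) for d in train_d]
a, b_, c = fit_quadratic(train_d, train_f)
rs = lambda d: a + b_ * d + c * d * d      # the cheap response surface

# inverse problem: given a measured frequency, scan the surface for d
measured = modal_frequency(0.23)           # pretend this came from a test
d_est = min((i / 1000 for i in range(501)), key=lambda d: abs(rs(d) - measured))
print(round(d_est, 3))
```

Once the surface is fitted, the inversion costs only cheap polynomial evaluations, which is exactly the low-computational-requirement property the abstract argues makes RSMs attractive for health-monitoring routines.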
3

Disease risk mapping with metamodels for coarse resolution predictors: global potato late blight risk now and under future climate conditions

Sparks, Adam Henry January 1900 (has links)
Doctor of Philosophy / Department of Plant Pathology / Karen A. Garrett / Late blight of potato, caused by Phytophthora infestans, is a pernicious disease of potatoes worldwide. This disease causes yield losses as a result of foliar and tuber damage. Many models exist to predict late blight risk for in-season control purposes, but they rely upon fine-scale weather data collected in hourly, or finer, increments. This is a major constraint when working with disease prediction models for areas of the world where hourly weather data are not available or are unreliable. Weather or climate summary datasets are often available as monthly summaries. These provide a partial solution to this problem, offering global data at large time-steps (e.g., monthly). Difficulties arise, however, when attempting to use these forms of data in models built for small temporal scales. My first objective was to develop new approaches for applying disease forecast models to coarser-resolution weather data sets. I created metamodels based on daily and monthly weather values which adapt an existing potato late blight model for use with these coarser forms of data using generalized additive models. The daily and monthly weather metamodels have R-squared values of 0.62 and 0.78, respectively. These new models were used to map global late blight risk under current and climate change scenarios for resistant and susceptible varieties. Changes in global disease risk for locations where wild potato species are indigenous, and disease risk for countries where chronic malnutrition is a problem, were evaluated. Under the climate change scenario selected for use, A1B, future global late blight severity decreases. The risk patterns do not show major changes: areas of high risk remain high relative to areas of low risk, with rather slight increases or decreases relative to previous years. 
Areas of higher wild potato species richness experience slightly increased blight risk, while areas of lower species richness experience a slight decline in risk.
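The metamodel step above (adapting a fine-time-scale disease model to monthly summary data) can be sketched in miniature. The snippet below is a hedged stand-in: a toy daily favourability function replaces the real late blight model, and an ordinary polynomial least-squares fit with interaction terms replaces the generalized additive model; all functional forms and numbers are illustrative.

```python
import math, random
random.seed(42)

def daily_risk(t, rh):
    # toy daily late-blight favourability: mild temperature, high humidity
    return math.exp(-((t - 18.0) / 8.0) ** 2) * max(0.0, (rh - 50.0) / 50.0)

# "truth": run the daily model within each month and average the risk
data = []
for _ in range(80):
    mt = random.uniform(5.0, 30.0)       # monthly mean temperature
    mrh = random.uniform(55.0, 95.0)     # monthly mean relative humidity
    days = [daily_risk(mt + random.gauss(0.0, 3.0),
                       min(100.0, mrh + random.gauss(0.0, 5.0)))
            for _ in range(30)]
    data.append((mt, mrh, sum(days) / 30.0))

# metamodel: least-squares fit using the monthly means only
feats = lambda t, rh: [1.0, t, t * t, rh, t * rh, t * t * rh]
X = [feats(mt, mrh) for mt, mrh, _ in data]
y = [r for _, _, r in data]

def lstsq(X, y):
    # solve the normal equations X^T X beta = X^T y by Gaussian elimination
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

beta = lstsq(X, y)
pred = [sum(b * f for b, f in zip(beta, row)) for row in X]
ybar = sum(y) / len(y)
r2 = 1.0 - (sum((p - t) ** 2 for p, t in zip(pred, y))
            / sum((t - ybar) ** 2 for t in y))
print(round(r2, 2))
```

The point of the exercise matches the thesis objective: the fitted coefficients let monthly means alone approximate what the daily model would have predicted, trading some R-squared for the ability to run on coarse global datasets.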
4

Construindo ontologias a partir de recursos existentes: uma prova de conceito no domínio da educação. / Building ontologies from existent resources: a proof of concept in education domain.

Cantele, Regina Claudia 07 April 2009 (has links)
In ancient Greece, Aristotle (384-322 BCE) endeavored to collect all the knowledge of his time to create the Encyclopedia. In the last decade the Semantic Web emerged, representing knowledge organized in ontologies. Within Ontology Engineering, Ontology Learning comprises the automatic or semi-automatic processes that acquire knowledge from existing resources. Software Engineering, on the other hand, relies on several standards to allow interoperability between different tools, such as those created by the Object Management Group (OMG): Model Driven Architecture (MDA), Meta Object Facility (MOF), Ontology Definition Metamodel (ODM) and XML Metadata Interchange (XMI). The World Wide Web Consortium (W3C), in turn, released a layered architecture for the Semantic Web, with emphasis on the Web Ontology Language (OWL). This work proposes a framework that brings these concepts together, based on ODM, on the OWL model, on the correspondence between metamodels, on the participation requirements for the tools, and on the sequence of activities to be applied until an initial representation of the ontology is obtained. A proof of concept in the Education domain was developed to test this proposal.
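As a minimal illustration of the kind of initial ontology representation such a framework targets, the sketch below emits a tiny OWL class hierarchy in RDF/XML using only the Python standard library. The education-domain class names and the `example.org` namespace are hypothetical, and no ODM/XMI tooling is involved; this only shows the serialized artifact, not the metamodel mapping itself.

```python
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
RDFS = "http://www.w3.org/2000/01/rdf-schema#"
OWL = "http://www.w3.org/2002/07/owl#"
BASE = "http://example.org/education#"     # hypothetical namespace

ET.register_namespace("rdf", RDF)
ET.register_namespace("rdfs", RDFS)
ET.register_namespace("owl", OWL)

root = ET.Element(f"{{{RDF}}}RDF")
ET.SubElement(root, f"{{{OWL}}}Ontology", {f"{{{RDF}}}about": BASE})

def add_class(name, parent=None):
    # declare an owl:Class, optionally as rdfs:subClassOf a parent class
    c = ET.SubElement(root, f"{{{OWL}}}Class", {f"{{{RDF}}}about": BASE + name})
    if parent:
        ET.SubElement(c, f"{{{RDFS}}}subClassOf",
                      {f"{{{RDF}}}resource": BASE + parent})
    return c

# a tiny education-domain hierarchy (illustrative only)
add_class("Person")
add_class("Student", parent="Person")
add_class("Teacher", parent="Person")
add_class("Course")

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out[:80])
```

A real pipeline following the thesis would generate such OWL output from ODM-conformant models exchanged via XMI rather than from hand-written calls.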
5

Réduction du coût de calcul pour la simulation du comportement mécanique de câbles / Reduction of the computational cost for the numerical simulation of the mechanical behaviour of wire ropes

Otaño Aramendi, Nerea 14 November 2016 (has links)
The work presented in this dissertation focuses on the simulation of the mechanical behaviour of lift wire ropes. The aim is to develop a method for simulating the mechanical behaviour of such ropes with a low computational cost and sufficient accuracy. First, several methods for modelling or simulating wire ropes were compared, and their strengths and weaknesses were highlighted. Analytical models and finite element simulations were compared with experimental data. The analytical models considered in this work have a much lower computational cost than finite element models, but their results are not accurate enough to simulate lift wire ropes. Finite element methods were therefore retained as the most appropriate approach for simulating these ropes. Their computational cost is, however, very high, so methods to reduce it must be applied. In order to reduce the computational time, three types of methods were considered: homogenization, metamodelling and model order reduction. Model order reduction was chosen as the most suitable and was implemented in the wire-rope finite element simulation code Multifil. Accurate results were obtained with this method, but the cost of the initial full-model simulations needed to collect the snapshots used to build a reduced basis was too high for long wire ropes. To solve this problem, a section-wise reduction method was formulated and implemented. This formulation takes advantage of the periodic structure of wire ropes: the reduced basis is identified on a single elementary periodic section and then used for all the repeated sections of a multi-section rope. The cost of the matrix multiplications required to transform the linear system of the initial problem into the reduced linear system nevertheless remained too high for a significant overall gain, particularly in the context of solving a non-linear problem. To overcome this difficulty, an additional technique, the Discrete Empirical Interpolation Method (DEIM), was successfully implemented, ultimately yielding a four-fold reduction in computational cost.
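The snapshot-based reduction described above can be sketched on a toy problem: solve a small parameterised linear system at a few parameter values, orthonormalise the snapshots into a reduced basis, and solve the Galerkin-projected system at a new parameter. The 40-unknown tridiagonal system below stands in for the wire-rope finite element model, and plain Gram-Schmidt stands in for a proper POD/SVD; every number is illustrative.

```python
import math

N = 40                                   # full-model dimension

def solve_full(p):
    # full model: tridiagonal system (2 + p) u_i - u_{i-1} - u_{i+1} = 1,
    # solved by the Thomas algorithm (off-diagonals are all -1)
    c = [0.0] * N
    d = [0.0] * N
    c[0] = -1.0 / (2.0 + p)
    d[0] = 1.0 / (2.0 + p)
    for i in range(1, N):
        m = (2.0 + p) + c[i - 1]
        c[i] = -1.0 / m
        d[i] = (1.0 + d[i - 1]) / m
    u = [0.0] * N
    u[-1] = d[-1]
    for i in range(N - 2, -1, -1):
        u[i] = d[i] - c[i] * u[i + 1]
    return u

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def matvec(p, v):
    # apply the full operator K(p) without assembling it
    out = []
    for i in range(N):
        s = (2.0 + p) * v[i]
        if i > 0:
            s -= v[i - 1]
        if i < N - 1:
            s -= v[i + 1]
        out.append(s)
    return out

# snapshots at a few parameter values, orthonormalised by Gram-Schmidt
basis = []
for snap in (solve_full(p) for p in (0.1, 0.5, 1.0)):
    v = snap[:]
    for q in basis:
        proj = dot(v, q)
        v = [vi - proj * qi for vi, qi in zip(v, q)]
    nrm = math.sqrt(dot(v, v))
    if nrm > 1e-10:
        basis.append([vi / nrm for vi in v])

# Galerkin-reduced solve at a new parameter: (V^T K V) a = V^T f
p_new = 0.7
Kq = [matvec(p_new, q) for q in basis]
r = len(basis)
A = [[dot(basis[i], Kq[j]) for j in range(r)] for i in range(r)]
b = [sum(q) for q in basis]              # V^T f with f = ones
for col in range(r):                     # tiny Gaussian elimination
    for k in range(col + 1, r):
        f = A[k][col] / A[col][col]
        for c2 in range(col, r):
            A[k][c2] -= f * A[col][c2]
        b[k] -= f * b[col]
coef = [0.0] * r
for k in range(r - 1, -1, -1):
    coef[k] = (b[k] - sum(A[k][c2] * coef[c2] for c2 in range(k + 1, r))) / A[k][k]
u_red = [sum(coef[j] * basis[j][i] for j in range(r)) for i in range(N)]

u_ref = solve_full(p_new)
diff = [a1 - b1 for a1, b1 in zip(u_red, u_ref)]
rel = math.sqrt(dot(diff, diff)) / math.sqrt(dot(u_ref, u_ref))
print(f"relative error: {rel:.2e}")
```

The reduced solve involves only a 3x3 system, which is the source of the speed-up; the thesis's section-wise variant additionally reuses one periodic section's basis along the whole rope so that even the snapshot stage stays affordable.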
6

Stratégie multiparamétrique et métamodèles pour l'optimisation multiniveaux de structures / Multilevel optimisation of structures using a multiparametric strategy and metamodels

Laurent, Luc 02 December 2013 (has links)
Although increasingly employed in design offices for mechanical engineering, optimisation is still relatively seldom applied to structural assemblies. Solving the mechanical problem for such assemblies requires numerical methods able to handle contact and friction non-linearities. Because of their computational cost, these methods are generally incompatible with a global-optimum search, which requires a large number of solutions. To overcome this problem, this work proposes a two-level model optimisation strategy based on two main tools: (1) the multiparametric strategy, based on the LaTIn method, which significantly reduces the computational time associated with the many similar solutions of the mechanical problem, and (2) a cokriging metamodel built from a limited number of responses and gradients computed by the mechanical solver. This metamodel provides very inexpensive approximate responses of the objective function, on which a global optimisation is then carried out to obtain the global optimum. The cokriging metamodel is studied in detail on analytical and mechanical examples involving various numbers of parameters, and its approximation quality and building cost are compared with a classical kriging approach. Moreover, a complete study of the multiparametric strategy is proposed, and many assembly examples are considered, illustrating the significant performance of the proposed optimisation procedure in terms of computational time.
8

Metamodel-Based Probabilistic Design for Dynamic Systems with Degrading Components

Seecharan, Turuna Saraswati January 2012 (has links)
The probabilistic design of dynamic systems with degrading components is difficult. Design of dynamic systems typically involves the optimization of a time-invariant performance measure, such as energy, that is estimated using a dynamic response, such as angular speed. The mechanistic models developed to approximate this performance measure are too complicated to be used with simple design calculations and lead to lengthy simulations. When degradation of the components is assumed, estimation of the failure probability over the product lifetime is required in order to determine suitable service times. Again, complex mechanistic models lead to lengthy lifetime simulations when the Monte Carlo method is used to evaluate probability. To address these problems, an efficient methodology is presented for the probabilistic design of dynamic systems and for estimating the cumulative distribution function of the time to failure of a performance measure when degradation of the components is assumed. The four main steps are: (1) transforming the dynamic response into a set of static responses at discrete cycle-time steps and using Singular Value Decomposition to efficiently estimate a time-invariant performance measure that is based upon a dynamic response; (2) replacing the mechanistic model with an approximating function, known as a "metamodel"; (3) searching for the best design parameters using fast integration methods such as the First Order Reliability Method; and (4) building the cumulative distribution function as the sum of the incremental failure probabilities, estimated using the set-theory method, over the planned lifetime. The first step of the methodology uses design of experiments or sampling techniques to select a sample of training sets of the design variables. These training sets are then input to the computer-based simulation of the mechanistic model to produce a matrix of corresponding responses at discrete cycle-times. 
Although metamodels can be built at each time-specific column of this matrix, this method is slow, especially if the number of time steps is large. An efficient alternative uses Singular Value Decomposition to split the response matrix into two matrices containing only design-variable-specific and time-specific information. The second step of the methodology fits metamodels only for the significant columns of the matrix containing the design-variable-specific information. Using the time-specific matrix, a metamodel is quickly developed at any cycle-time step or for any time-invariant performance measure such as energy consumed over the cycle-lifetime. In the third step, design variables are treated as random variables and the First Order Reliability Method is used to search for the best design parameters. Finally, the components most likely to degrade are modelled using either a degradation path or a marginal distribution model and, using the First Order Reliability Method or a Monte Carlo Simulation to estimate probability, the cumulative failure probability is plotted. The speed and accuracy of the methodology using three metamodels, the Regression Model, Kriging and the Radial Basis Function, is investigated. This thesis shows that the metamodel offers a significantly faster, yet still accurate, alternative to using mechanistic models for both probabilistic design optimization and for estimating the cumulative distribution function. For design using the First-Order Reliability Method to estimate probability, the Regression Model is the fastest and the Radial Basis Function is the slowest. Kriging is shown to be accurate and faster than the Radial Basis Function but its computation time is still slower than the Regression Model. When estimating the cumulative distribution function, metamodels are more than 100 times faster than the mechanistic model and the error is less than ten percent when compared with the mechanistic model. 
Kriging and the Radial Basis Function are more accurate than the Regression Model and computation time is faster using the Monte Carlo Simulation to estimate probability than using the First-Order Reliability Method.
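The Singular Value Decomposition step described above (splitting the response matrix into design-specific and time-specific information) can be sketched with a rank-1 stand-in: power iteration recovers the dominant time shape, and a metamodel is then fitted only to the single score per design. The sinusoidal "mechanistic model" and all numbers below are illustrative assumptions.

```python
import math, random
random.seed(0)

T_STEPS = 50                                  # discrete cycle-time steps
designs = [0.1 * i for i in range(10)]        # one design variable, 10 training sets

def simulate(x):
    # toy "mechanistic model": amplitude depends on the design variable,
    # while the shape over cycle time is shared by every design
    return [(1.0 + 0.5 * x * x) * math.sin(math.pi * t / T_STEPS)
            + random.gauss(0.0, 0.01) for t in range(T_STEPS)]

R = [simulate(x) for x in designs]            # response matrix (designs x time)

# dominant right singular vector of R by power iteration on R^T R
# (a rank-1 stand-in for the full Singular Value Decomposition)
v = [1.0] * T_STEPS
for _ in range(50):
    u = [sum(R[i][t] * v[t] for t in range(T_STEPS)) for i in range(len(R))]
    w = [sum(R[i][t] * u[i] for i in range(len(R))) for t in range(T_STEPS)]
    nrm = math.sqrt(sum(x * x for x in w))
    v = [x / nrm for x in w]

# design-specific scores: one number per training design
scores = [sum(R[i][t] * v[t] for t in range(T_STEPS)) for i in range(len(R))]

def fit_quadratic(xs, ys):
    # 3x3 normal equations solved by Cramer's rule
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[s(0), s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sy(0), sy(1), sy(2)]
    det = lambda M: (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    out = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = b[i]
        out.append(det(M) / D)
    return out

a0, a1, a2 = fit_quadratic(designs, scores)   # metamodel on the scores only

# predict the full time response of an unseen design from its score
x_new = 0.45
score_hat = a0 + a1 * x_new + a2 * x_new * x_new
pred = [score_hat * v[t] for t in range(T_STEPS)]
truth = [(1.0 + 0.5 * x_new * x_new) * math.sin(math.pi * t / T_STEPS)
         for t in range(T_STEPS)]
num = math.sqrt(sum((p - q) ** 2 for p, q in zip(pred, truth)))
den = math.sqrt(sum(q * q for q in truth))
rel_err = num / den
print(f"relative prediction error: {rel_err:.3f}")
```

One metamodel over the scores replaces fifty per-time-step metamodels, which is exactly the economy the SVD split buys; the real methodology retains several significant modes rather than one.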
9

Building Seismic Fragilities Using Response Surface Metamodels

Towashiraporn, Peeranan 20 August 2004 (has links)
Building fragility describes the likelihood of damage to a building due to random ground motions. Conventional methods for computing building fragilities are either based on statistical extrapolation of detailed analyses on one or two specific buildings or make use of Monte Carlo simulation with these models. However, the Monte Carlo technique usually requires a relatively large number of simulations in order to obtain a sufficiently reliable estimate of the fragilities, and it quickly becomes impractical to simulate the required thousands of dynamic time-history structural analyses for physics-based analytical models. An alternative approach for carrying out the structural simulation is explored in this work. The use of Response Surface Methodology in connection with the Monte Carlo simulations simplifies the process of fragility computation. More specifically, a response surface is sought to predict the structural response calculated from complex dynamic analyses. Computational cost required in a Monte Carlo simulation will be significantly reduced since the simulation is performed on a polynomial response surface function, rather than a complex dynamic model. The methodology is applied to the fragility computation of an unreinforced masonry (URM) building located in the New Madrid Seismic Zone. Different rehabilitation schemes for this structure are proposed and evaluated through fragility curves. Response surface equations for predicting peak drift are generated and used in the Monte Carlo simulation. Resulting fragility curves show that the URM building is less likely to be damaged from future earthquakes when rehabilitation is properly incorporated. The thesis concludes with a discussion of an extension of the methodology to the problem of computing fragilities for a collection of buildings of interest. 
Previous approaches have considered uncertainties in material properties, but this research incorporates building parameters such as geometry, stiffness, and strength variabilities as well as nonstructural parameters (age, design code) over an aggregation of buildings in the response surface models. Simulation on the response surface yields the likelihood of damage to a group of buildings under various earthquake intensity levels. This aspect is of interest to governmental agencies or building owners who are responsible for planning proper mitigation measures for collections of buildings.
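The response-surface fragility computation can be sketched end to end: fit a polynomial surface to a handful of pretend dynamic-analysis results, then run the Monte Carlo simulation on the surface rather than on the dynamic model. The drift values, the lognormal model-error dispersion and the drift capacity limit below are all hypothetical.

```python
import math, random
random.seed(7)

# pretend these came from a handful of expensive dynamic time-history analyses
pga_train = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]                       # intensity (g)
drift_train = [0.0011, 0.0028, 0.0052, 0.0080, 0.0118, 0.0165]   # peak drift

def fit_quadratic(xs, ys):
    # quadratic response surface via 3x3 normal equations (Cramer's rule)
    s = lambda p: sum(x ** p for x in xs)
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    A = [[s(0), s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sy(0), sy(1), sy(2)]
    det = lambda M: (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                     - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                     + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    out = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = b[i]
        out.append(det(M) / D)
    return out

a, b, c = fit_quadratic(pga_train, drift_train)
rs = lambda g: a + b * g + c * g * g          # drift predicted by the surface

# Monte Carlo on the surface: lognormal model error, fixed drift capacity
LIMIT = 0.008                                  # hypothetical damage threshold
BETA = 0.3                                     # lognormal dispersion
frag = []
for g in pga_train:
    n_fail = sum(1 for _ in range(2000)
                 if rs(g) * math.exp(random.gauss(0.0, BETA)) > LIMIT)
    frag.append(n_fail / 2000)
print([round(f, 3) for f in frag])
```

Each fragility point costs 2000 polynomial evaluations instead of 2000 dynamic time-history analyses, which is the saving that makes per-rehabilitation-scheme fragility curves practical.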
10

Feedback control of gas metal arc braze-welding using thermal signals

Shah, Sanjiv Edlagan 26 October 2011 (has links)
In serial manufacturing processes, localized energy sources (e.g. plasma cutters, arc welders or water jets) induce material geometry transformations that yield a desired product. Simple parameter control of these energy sources does not necessarily ensure an optimal or successful part because of disturbances in the manufacturing process (material and temperature variations, etc.). Currently, control in manufacturing is based on statistical process control, where large databases for the manufacturing of a fixed process are available and have been compiled over several manufacturing runs. In the absence of a statistical database, and with the increased need for improved monitoring and throughput, there is a need for active process control in manufacturing. In this work, Gas Metal Arc Braze-Welding (GMABW) serves as a test-bed for the implementation of model predictive control (MPC) for a serial manufacturing process. This dissertation investigates the integration of real-time modeling of the temperature field with control algorithms to control the evolving temperature field in the braze-welded base metal. The fundamental MPC problems addressed are modeling techniques to calculate temperature fields with reduced computational requirements, and control algorithms that utilize the thermal models directly to inform the controller. The dissertation first outlines and compares analytical and computational thermal models, and comparisons with experimental data are presented. A thermal model based on a metamodeling approach is used as the plant model for a classical control system and control parameters are found. Various techniques for dealing with signal noise encountered during experimentation are investigated. A proportional controller is implemented in the experimental setup that applies feedback control of the braze-welding process using thermal signals. 
A novel approach to MPC is explored by using a metamodel as the plant model for the braze-welding process and having the temperature trajectory dictated by the metamodel in the steady-state region of the weld. Lastly, future work and extensions of this research are outlined.
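The proportional temperature control loop can be sketched with a first-order thermal plant standing in for the braze-welding metamodel; the gains and plant coefficients below are illustrative assumptions. The run also shows the steady-state offset inherent to pure proportional control, one motivation for moving to model predictive control.

```python
# simple first-order thermal plant: dT/dt = -a*T + b*u (coefficients assumed)
a, b = 0.1, 0.5
dt = 0.1                     # integration time step (s)
Kp = 2.0                     # proportional gain (assumed)
T_ref = 100.0                # temperature setpoint

T = 20.0                     # initial temperature
for _ in range(600):         # 60 s of simulated time
    u = Kp * (T_ref - T)     # proportional feedback on the thermal signal
    T += dt * (-a * T + b * u)

# pure P-control settles at T_ss = b*Kp*T_ref / (a + b*Kp), below the setpoint
T_ss = b * Kp * T_ref / (a + b * Kp)
print(round(T, 1), round(T_ss, 1))   # -> 90.9 90.9
```

The loop converges, but roughly 9 degrees short of the setpoint; eliminating that offset is what integral action, or a predictive controller driven by the metamodel trajectory, would provide.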
