11

Meta Modelle - Neue Planungswerkzeuge für Materialflußsysteme / Meta-models - new planning tools for material flow systems

Schulze, Frank January 1999 (has links)
Meta-Modelle sind Rechenmodelle, die das Verhalten technischer Systeme näherungsweise beschreiben oder nachbilden. Sie werden aus Beobachtungen von Simulationsmodellen der technischen Systeme abgeleitet. Es handelt sich also um Modelle von Modellen, um Meta-Modelle. Meta-Modelle unterscheiden sich grundsätzlich von analytischen Ansätzen zur Systembeschreibung. Während analytische Ansätze in ihrer mathematischen Struktur die tatsächlichen Gegebenheiten des betrachteten Systems wiedergeben, sind Meta-Modelle stets Näherungen. Der Vorteil von Meta-Modellen liegt in ihrer einfachen Form. Sie sind leicht zu bilden und anzuwenden. Ihr Nachteil ist die nur annähernde und u.U. unvollständige Beschreibung des Systemverhaltens. Im folgenden wird die Bildung von Meta-Modellen anhand eines Bediensystems dargestellt. Zuerst werden die Möglichkeiten einer analytischen Beschreibung bewertet. Danach werden zwei unterschiedliche Meta-Modelle, Polynome und neuronale Netze, vorgestellt. Möglichkeiten und Grenzen dieser Formen der Darstellung des Systemverhaltens werden diskutiert. Abschließend werden praktische Einsatzfelder von Meta-Modellen in der Materialflußplanung und -simulation aufgezeigt. / Meta-models are computational models that approximately describe or reproduce the behaviour of technical systems. They are derived from observations of simulation models of those technical systems; they are therefore models of models, i.e. meta-models. Meta-models differ fundamentally from analytical approaches to system description: whereas analytical approaches reflect the actual characteristics of the system under consideration in their mathematical structure, meta-models are always approximations. Their advantage lies in their simple form; they are easy to build and to apply. Their drawback is that they describe the system behaviour only approximately and, in some cases, incompletely. In the following, the construction of meta-models is illustrated using a service (queueing) system. First, the possibilities of an analytical description are assessed. Then two different kinds of meta-model, polynomials and neural networks, are presented, and the possibilities and limits of these forms of representing system behaviour are discussed. Finally, practical fields of application of meta-models in material flow planning and simulation are outlined.
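To make the idea concrete, the sketch below fits a polynomial meta-model to observations of a service system, in the spirit of the approach described above. It is an illustration rather than material from the thesis: the "simulation" is stood in for by a noisy M/M/1 sojourn-time formula, and all parameter values are made up.

```python
import numpy as np

# Stand-in for a discrete-event simulation of a service station:
# mean time in system of an M/M/1 queue plus sampling noise.
rng = np.random.default_rng(0)
def simulate_mean_sojourn(rho, mu=1.0, noise=0.02):
    return 1.0 / (mu * (1.0 - rho)) + rng.normal(0.0, noise)

# "Observations" of the simulation model at a few utilisation levels.
rho_train = np.linspace(0.1, 0.85, 8)
y_train = np.array([simulate_mean_sojourn(r) for r in rho_train])

# Meta-model: a low-order polynomial fitted to the observations.
coeffs = np.polyfit(rho_train, y_train, deg=3)
meta_model = np.poly1d(coeffs)

# The meta-model now replaces the simulator inside planning loops.
for rho in (0.3, 0.6, 0.8):
    print(f"rho={rho:.1f}  meta={meta_model(rho):.3f}  exact={1.0/(1.0-rho):.3f}")
```

The polynomial is cheap to evaluate but, as the abstract notes, only approximates the system; outside the sampled utilisation range its predictions should not be trusted.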
12

Machine Learning from Computer Simulations with Applications in Rail Vehicle Dynamics and System Identification

Taheri, Mehdi 01 July 2016 (has links)
The application of stochastic modeling for learning the behavior of multibody dynamics models is investigated. The stochastic modeling technique is also known as Kriging or the random function approach. Post-processed data from a simulation run are used to train the stochastic model that estimates the relationship between model inputs, such as the suspension relative displacement and velocity, and the output, for example the sum of suspension forces. The computational efficiency of Multibody Dynamics (MBD) models can be improved by replacing their computationally intensive subsystems with stochastic predictions. The stochastic modeling technique is able to learn the behavior of a physical system and integrate that behavior into MBD models, resulting in improved real-time simulations and reduced computational effort in models with repeated substructures (for example, modeling a train with a large number of rail vehicles). Since the sampling plan greatly influences the overall accuracy and efficiency of the stochastic predictions, various sampling plans are investigated, and a space-filling Latin Hypercube sampling plan based on the traveling salesman problem (TSP) is suggested for efficiently representing the entire parameter space. The simulation results confirm the expected increase in modeling efficiency, although further research is needed to improve the accuracy of the predictions. The prediction accuracy is expected to improve through a sampling strategy that considers the discrete nature of the training data and uses infill criteria that consider the shape of the output function and detect sample regions with high prediction errors. It is recommended that future efforts quantify the computational efficiency of the proposed approach by overcoming the inefficiencies associated with transferring data between multiple software packages, which proved to be a limiting factor in this study. These limitations can be overcome by using the user subroutine functionality of SIMPACK and adding the stochastic modeling technique to its force library. / Ph. D.
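The following sketch illustrates the general idea of a Kriging meta-model trained on a space-filling Latin Hypercube design, as described in the abstract. It is not the author's implementation: the "expensive MBD subsystem" is replaced by a hypothetical linear spring-damper force, the parameter values are invented, and SciPy (its `qmc` module) and scikit-learn are assumed available.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in for the expensive MBD subsystem: suspension
# force as a function of relative displacement x and velocity v.
def suspension_force(x, v, k=5.0e4, c=3.0e3):
    return k * x + c * v          # linear spring-damper, illustration only

# Space-filling Latin Hypercube design over the input domain.
sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=40)
X = qmc.scale(unit, l_bounds=[-0.05, -1.0], u_bounds=[0.05, 1.0])
y = np.array([suspension_force(x, v) for x, v in X])

# Kriging (Gaussian-process) meta-model trained on the design points.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[0.05, 1.0]),
                              normalize_y=True).fit(X, y)

# The surrogate replaces the subsystem call inside the vehicle model.
x_new = np.array([[0.01, 0.2]])
print(gp.predict(x_new), suspension_force(0.01, 0.2))
```

A space-filling design is used so that the surrogate sees the whole displacement-velocity domain with few samples, which is exactly the motivation for the sampling-plan study in the thesis.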
13

Metamodeling for Business Model Design : Facilitating development and communication of Business Model Canvas (BMC) models with an OMG standards-based metamodel.

Hauksson, Hilmar January 2013 (has links)
Interest in business models and business modeling has increased rapidly since the mid-1990s, and there are numerous approaches used to create business models. The business model concept has many definitions, which can lead to confusion and slower progress in the research and development of business models. A business model ontology (BMO) was created in 2004, in which the business model concept was conceptualized based on an analysis of existing literature. A few years later the Business Model Canvas (BMC) was published, a popular business modeling approach providing a high-level, semi-formal way to create and communicate business models. While this approach is easy to use, its informality and high level of abstraction can cause ambiguity, and it has limited computer-aided support available. In order to propose a solution to this problem and facilitate the development and communication of Business Model Canvas models, two artifacts are created, demonstrated and evaluated: a structured metamodel for the Business Model Canvas, and its implementation in an OMG standards-based modeling tool to provide tool support for BMC modeling. This research is carried out following the design science approach, in which the artifacts are created to better understand and improve the identified problem. The problem and its background are explicated, and the planned artifacts and requirements are outlined. The design and development of the artifacts are detailed, and the resulting BMC metamodel is presented as a class diagram in the Unified Modeling Language (UML) and implemented to provide tool support for BMC modeling. A demonstration with a business model and an evaluation are performed with expert interviews and informed arguments. The creation of a BMC metamodel exposed some ambiguity in the definition and use of the Business Model Canvas, as well as the importance of graphical presentation and flexibility in the tools used. The evaluation of the resulting artifacts suggests that they do facilitate the development and communication of Business Model Canvas models by improving the encapsulation and communication of information in a standardized way; the goals of the research are thereby met.
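As a rough illustration of what a metamodel buys over free-text canvases, the sketch below encodes a few of the nine BMC building blocks as typed, linkable elements. The class names are borrowed from the canvas vocabulary, but the structure is hypothetical and is not the metamodel defined in the thesis, which is expressed as a UML class diagram in an OMG standards-based tool.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative only: canvas entries become typed, linkable model
# elements instead of free-text sticky notes, so a tool can check
# and exchange them in a standardized way.
@dataclass
class CustomerSegment:
    name: str

@dataclass
class ValueProposition:
    name: str
    targets: List[CustomerSegment] = field(default_factory=list)

@dataclass
class RevenueStream:
    name: str
    from_segment: CustomerSegment

@dataclass
class BusinessModel:
    name: str
    segments: List[CustomerSegment] = field(default_factory=list)
    propositions: List[ValueProposition] = field(default_factory=list)
    revenues: List[RevenueStream] = field(default_factory=list)

# Usage: a small, machine-checkable business model instance.
students = CustomerSegment("Students")
bm = BusinessModel(
    name="Campus food delivery",
    segments=[students],
    propositions=[ValueProposition("30-minute delivery", targets=[students])],
    revenues=[RevenueStream("Delivery fee", from_segment=students)],
)
print(bm.name, len(bm.segments), "segment(s)")
```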
14

Meta-modelagem em confiabilidade estrutural / Meta-modeling techniques in structural reliability

Kroetz, Henrique Machado 23 March 2015 (has links)
A aplicação de simulações numéricas em problemas de confiabilidade estrutural costuma estar associada a grandes custos computacionais, dada a pequena probabilidade de falha inerente às estruturas. Ainda que diversos casos possam ser endereçados através de técnicas de redução da variância das amostras, a solução de problemas envolvendo grande número de graus de liberdade, respostas dinâmicas, não lineares e problemas de otimização na presença de incertezas são comumente ainda inviáveis de se resolver por esta abordagem. Tais problemas, porém, podem ser resolvidos através de representações analíticas que aproximam a resposta que seria obtida com a utilização de modelos computacionais mais complexos, chamados meta-modelos. O presente trabalho trata da compilação, assimilação, programação em computador e comparação de técnicas modernas de meta-modelagem no contexto da confiabilidade estrutural, utilizando representações construídas a partir de redes neurais artificiais, expansões em polinômios de caos e através de krigagem. Estas técnicas foram implementadas no programa computacional StRAnD - Structural Reliability Analysis and Design, desenvolvido junto ao Departamento de Engenharia de Estruturas, USP, resultando assim em um benefício permanente para a análise de confiabilidade estrutural junto à Universidade de São Paulo. / The application of numerical simulations to structural reliability problems is often associated with high computational costs, given the small probability of failure inherent to structures. Although many cases can be addressed using variance reduction techniques, problems involving a large number of degrees of freedom, nonlinear and dynamic responses, and optimization in the presence of uncertainties are sometimes still infeasible to solve by this approach. Such problems, however, can be solved by analytical representations that approximate the response that would be obtained with more complex computational models, called meta-models. This work deals with the collection, assimilation, computer programming and comparison of modern meta-modeling techniques in the context of structural reliability, using representations constructed from artificial neural networks, polynomial chaos expansions and Kriging. These techniques are implemented in the computer program StRAnD - Structural Reliability Analysis and Design, developed at the Department of Structural Engineering, USP, thus resulting in a permanent benefit to structural reliability analysis at the University of São Paulo.
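A minimal sketch of the surrogate-based reliability workflow described above: a Kriging meta-model is trained on a small design of experiments and then used for crude Monte Carlo estimation of the failure probability. The limit-state function g(R, S) = R - S and its distributions are a textbook stand-in for an expensive structural model, the example is independent of StRAnD, and scikit-learn is assumed available.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)

# Illustrative limit state g(R, S) = R - S (failure when g < 0); in a
# real study g would call an expensive finite-element model.
def limit_state(x):
    r, s = x[..., 0], x[..., 1]
    return r - s

# Small design of experiments on which the "expensive" model is run.
X_doe = rng.normal([5.0, 3.0], [0.5, 0.6], size=(60, 2))
y_doe = limit_state(X_doe)

# Kriging meta-model of the limit state.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.5, 0.5]),
                              normalize_y=True).fit(X_doe, y_doe)

# Crude Monte Carlo on the cheap surrogate instead of the true model.
X_mc = rng.normal([5.0, 3.0], [0.5, 0.6], size=(100_000, 2))
pf_meta = np.mean(gp.predict(X_mc) < 0.0)
pf_ref = np.mean(limit_state(X_mc) < 0.0)   # only possible here because g is cheap
print(f"Pf (meta-model) ~ {pf_meta:.4e},  Pf (direct MC) ~ {pf_ref:.4e}")
```

The same workflow applies with a polynomial chaos expansion or a neural network in place of the Gaussian process; only the surrogate changes, not the Monte Carlo step.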
15

Especificação e implementação de uma linguagem para transformação de modelos MOF em repositórios dMOF / Specification and implementation of a language for transforming MOF models in dMOF repositories

Lins, Hertz Wilton de Castro 06 October 2006 (has links)
This work presents the specification and the implementation of a language for transformations of models defined according to the MOF (Meta Object Facility) specification of the OMG (Object Management Group). The specification uses an approach based on ECA (Event-Condition-Action) rules and was produced on the basis of a set of previously defined usage scenarios. The parser, responsible for guaranteeing that the syntactic structure of the language is correct, was built with the JavaCC (Java Compiler Compiler) tool, and the syntax of the language was described with EBNF (Extended Backus-Naur Form). The implementation is divided into three parts: the creation of the interpreter itself in Java, the creation of an executor for the actions specified in the language, and its integration with the type of repository considered (generated by the DSTC dMOF tool). A final prototype was developed and tested on the previously defined scenarios. / Este trabalho apresenta a especificação e a implementação de uma linguagem de Transformações em Modelos definidos segundo a especificação MOF (Meta Object Facility) da OMG (Object Management Group). A especificação utiliza uma abordagem baseada em regras ECA (Event-Condition-Action) e foi feita com base em um conjunto de cenários de uso previamente definidos. O parser responsável por garantir que a estrutura sintática da linguagem está correta foi construído com a ferramenta JavaCC (Java Compiler Compiler) e a descrição da sintaxe da linguagem foi feita com EBNF (Extended Backus-Naur Form). A implementação está dividida em três partes: a criação do programa interpretador propriamente dito em Java, a criação de um executor das ações especificadas na linguagem e sua integração com o tipo de repositório considerado (gerados pela ferramenta DSTC dMOF). Um protótipo final foi desenvolvido e testado nos cenários previamente definidos.
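A toy illustration of the Event-Condition-Action idea underlying the transformation language: rules react to events on model elements, check a condition, and execute an action against a repository. The structure and names below are hypothetical and do not reproduce the thesis' JavaCC grammar or its dMOF integration.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Minimal ECA illustration: when an event on a model element occurs,
# every rule registered for that event whose condition holds fires its
# action against the (here, simulated) repository.
@dataclass
class ECARule:
    event: str                                  # e.g. "create:Class"
    condition: Callable[[Dict], bool]
    action: Callable[[Dict, List[str]], None]

def dispatch(event: str, element: Dict, rules: List[ECARule], log: List[str]):
    for rule in rules:
        if rule.event == event and rule.condition(element):
            rule.action(element, log)

repo_log: List[str] = []
rules = [
    ECARule(
        event="create:Class",
        condition=lambda e: not e.get("name"),
        action=lambda e, log: log.append("rejected unnamed Class"),
    ),
    ECARule(
        event="create:Class",
        condition=lambda e: bool(e.get("name")),
        action=lambda e, log: log.append(f"stored Class {e['name']} in repository"),
    ),
]

dispatch("create:Class", {"name": "Order"}, rules, repo_log)
dispatch("create:Class", {"name": ""}, rules, repo_log)
print(repo_log)
```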
17

Utilisation de méta-modèles multi-fidélité pour l'optimisation de la production des réservoirs / Use of multi-fidelity meta-models for optimizing reservoir production

Thenon, Arthur 20 March 2017 (has links)
Les simulations d'écoulement sur des modèles représentatifs d'un gisement pétrolier sont généralement coûteuses en temps de calcul. Une pratique courante en ingénierie de réservoir consiste à remplacer ces simulations par une approximation mathématique, un méta-modèle. La méta-modélisation peut fortement réduire le nombre de simulations nécessaires à l'analyse de sensibilité, le calibrage du modèle, l'estimation de la production, puis son optimisation. Cette thèse porte sur l'étude de méta-modèles utilisant des simulations réalisées à différents niveaux de précision, par exemple pour des modèles de réservoir avec des maillages de résolutions différentes. L'objectif est d'accélérer la construction d'un méta-modèle prédictif en combinant des simulations coûteuses avec des simulations rapides mais moins précises. Ces méta-modèles multi-fidélité, basés sur le co-krigeage, sont comparés au krigeage pour l'approximation de sorties de la simulation d'écoulement. Une analyse en composantes principales peut être considérée afin de réduire le nombre de modèles de krigeage pour la méta-modélisation de réponses dynamiques et de cartes de propriétés. Cette méthode peut aussi être utilisée pour améliorer la méta-modélisation de la fonction objectif dans le cadre du calage d'historique. Des algorithmes de planification séquentielle d'expériences sont finalement proposés pour accélérer la méta-modélisation et tirer profit d'une approche multi-fidélité. Les différentes méthodes introduites sont testées sur deux cas synthétiques inspirés des benchmarks PUNQ-S3 et Brugge. / Performing flow simulations on numerical models representative of oil deposits is usually a time-consuming task in reservoir engineering. Substituting a meta-model, a mathematical approximation, for the flow simulator is thus a common practice to reduce the number of calls to the simulator. This makes it possible to consider applications such as sensitivity analysis, history matching, production estimation and optimization. This thesis studies meta-models able to integrate simulations performed at different levels of accuracy, for instance on reservoir models with various grid resolutions. The goal is to speed up the building of a predictive meta-model by balancing a few expensive but accurate simulations with numerous cheap but less accurate ones. Multi-fidelity meta-models, based on co-kriging, are thus compared to kriging meta-models for approximating different flow simulation outputs. To deal with vectorial outputs without building a meta-model for each component of the vector, the outputs can be decomposed on a reduced basis using principal component analysis; only a few meta-models are then needed to approximate the main coefficients in the new basis. An extension of this approach to the multi-fidelity context is proposed. In addition, it can provide an efficient meta-modeling of the objective function when used to approximate each production response involved in the objective function definition. Finally, sequential design algorithms are introduced to speed up the meta-modeling process and exploit the multi-fidelity approach. The proposed methods are tested on two synthetic cases derived from the PUNQ-S3 and Brugge benchmark cases.
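The sketch below conveys the multi-fidelity idea with a simple additive-correction surrogate: one Gaussian process is fitted to many cheap coarse-grid runs and a second one to the few fine-minus-coarse discrepancies. This is a simpler cousin of the co-kriging formulation studied in the thesis (co-kriging also models the cross-correlation between fidelity levels), shown only for intuition; the two "simulators" are made-up analytic functions and scikit-learn is assumed available.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-ins for flow simulations of one production response: a cheap
# coarse-grid run and an expensive fine-grid run (illustrative formulas).
def coarse(x):  return np.sin(3.0 * x) * x
def fine(x):    return np.sin(3.0 * x) * x + 0.3 * x**2 - 0.1

# Many cheap runs, few expensive ones (nested design).
X_lo = np.linspace(0.0, 2.0, 25)[:, None]
X_hi = X_lo[::5]
gp_lo = GaussianProcessRegressor(RBF(0.3), normalize_y=True).fit(
    X_lo, coarse(X_lo).ravel())

# Second surrogate models the discrepancy fine - coarse at the few fine points.
delta = fine(X_hi).ravel() - gp_lo.predict(X_hi)
gp_delta = GaussianProcessRegressor(RBF(0.5), normalize_y=True).fit(X_hi, delta)

# Multi-fidelity prediction: coarse surrogate plus learned correction.
X_test = np.linspace(0.0, 2.0, 5)[:, None]
pred = gp_lo.predict(X_test) + gp_delta.predict(X_test)
print(np.c_[pred, fine(X_test).ravel()])
```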
18

Distribution system meta-models in an electronic commerce environment

Ko, Hung-Tse January 2001 (has links)
No description available.
19

Optimisation de placement des puits / Well placement optimization

Bouzarkouna, Zyed 03 April 2012 (has links)
La quantité d’hydrocarbures récupérés peut être considérablement augmentée si un placement optimal des puits non conventionnels à forer peut être trouvé. Pour cela, l’utilisation d’algorithmes d’optimisation, où la fonction objectif est évaluée en utilisant un simulateur de réservoir, est nécessaire. Par ailleurs, pour des réservoirs avec une géologie complexe avec des hétérogénéités élevées, le problème d’optimisation nécessite des algorithmes capables de faire face à la non-régularité de la fonction objectif. L’objectif de cette thèse est de développer une méthodologie efficace pour déterminer l’emplacement optimal des puits et leurs trajectoires, qui offre la valeur liquidative maximale en utilisant un nombre techniquement abordable de simulations de réservoir. Dans cette thèse, nous montrons une application réussie de l’algorithme “Covariance Matrix Adaptation - Evolution Strategy” (CMA-ES) qui est reconnu comme l’un des plus puissants optimiseurs sans dérivées pour l’optimisation continue. Par ailleurs, afin de réduire le nombre de simulations de réservoir (évaluations de la fonction objectif), nous concevons deux nouveaux algorithmes. Premièrement, nous proposons une nouvelle variante de la méthode CMA-ES avec des méta-modèles, appelée le nouveau-local-méta-modèle CMA-ES (nlmm-CMA), améliorant la variante déjà existante de la méthode local-méta-modèle CMA-ES (lmm-CMA) sur la plupart des fonctions de benchmark, en particulier pour des tailles de population plus grandes que celle par défaut. Ensuite, nous proposons d’exploiter la séparabilité partielle de la fonction objectif durant le processus d’optimisation afin de définir un nouvel algorithme appelé la partiellement séparable local-méta-modèle CMA-ES (p-sep lmm-CMA), conduisant à une réduction importante du nombre d’évaluations par rapport à la méthode CMA-ES standard. Dans cette thèse, nous appliquons également les algorithmes développés (nlmm-CMA et p-sep lmm-CMA) au problème de placement des puits pour montrer, à travers plusieurs exemples, une réduction significative du nombre de simulations de réservoir nécessaires pour trouver la configuration optimale des puits. Les approches proposées se révèlent prometteuses en considérant un budget restreint de simulations de réservoir, qui est le contexte imposé dans la pratique. Enfin, nous proposons une nouvelle approche pour gérer l’incertitude géologique dans le problème d’optimisation de placement des puits. L’approche proposée utilise seulement une réalisation, ainsi que le voisinage de chaque configuration, afin d’estimer sa fonction objectif au lieu d’utiliser de multiples réalisations. L’approche est illustrée sur un cas de réservoir de benchmark, et se révèle être en mesure de capturer l’incertitude géologique en utilisant un nombre réduit de simulations de réservoir. / The amount of hydrocarbon recovered can be considerably increased by finding optimal placement of non-conventional wells. For that purpose, the use of optimization algorithms, where the objective function is evaluated using a reservoir simulator, is needed. Furthermore, for complex reservoir geologies with high heterogeneities, the optimization problem requires algorithms able to cope with the non-regularity of the objective function.
The goal of this thesis was to develop an efficient methodology for determining optimal well locations and trajectories that offers the maximum asset value using a technically feasible number of reservoir simulations. In this thesis, we show a successful application of the Covariance Matrix Adaptation - Evolution Strategy (CMA-ES), which is recognized as one of the most powerful derivative-free optimizers for continuous optimization. Furthermore, in order to reduce the number of reservoir simulations (objective function evaluations), we design two new algorithms. First, we propose a new variant of CMA-ES with meta-models, called the new-local-meta-model CMA-ES (nlmm-CMA), improving over the already existing variant, the local-meta-model CMA-ES (lmm-CMA), on most benchmark functions, in particular for population sizes larger than the default one. Then, we propose to exploit the partial separability of the objective function in the optimization process to define a new algorithm called the partially separable local-meta-model CMA-ES (p-sep lmm-CMA), leading to an important speedup compared to the standard CMA-ES. We also apply the developed algorithms (nlmm-CMA and p-sep lmm-CMA) to the well placement problem to show, through several examples, a significant reduction of the number of reservoir simulations needed to find optimal well configurations. The proposed approaches are shown to be promising when considering a restricted budget of reservoir simulations, which is the context imposed in practice. Finally, we propose a new approach to handle geological uncertainty in the well placement optimization problem. The proposed approach uses only one realization, together with the neighborhood of each well configuration, in order to estimate its objective function instead of using multiple realizations. The approach is illustrated on a synthetic benchmark reservoir case and is shown to be able to capture the geological uncertainty using a reduced number of reservoir simulations.
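The sketch below shows the general idea of saving expensive evaluations by pre-screening CMA-ES candidates with a meta-model: only the most promising candidates of each generation are sent to the "simulator", while the rest receive surrogate values. It is not the nlmm-CMA or p-sep lmm-CMA algorithms of the thesis, the objective is a made-up stand-in for a reservoir simulation, and the pycma (`pip install cma`) and scikit-learn packages are assumed available.

```python
import numpy as np
import cma                                    # pycma package, assumed installed
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Stand-in for the expensive reservoir simulation: a made-up objective
# over a 2-coordinate "well position" (lower is better).
def expensive_objective(x):
    return float(np.sum((x - 1.5) ** 2) + 0.3 * np.sum(np.sin(5.0 * x)))

archive_X, archive_y = [], []                 # all truly simulated points so far

es = cma.CMAEvolutionStrategy([0.0, 0.0], 0.5, {"verbose": -9, "maxiter": 30})
while not es.stop():
    candidates = es.ask()
    if len(archive_X) >= 10:
        # Meta-model pre-screening: rank candidates with a Kriging model
        # trained on past simulations; fully simulate only the best third.
        gp = GaussianProcessRegressor(RBF(1.0), normalize_y=True).fit(
            np.array(archive_X), np.array(archive_y))
        guesses = gp.predict(np.array(candidates))
        values = list(guesses)
        for i in np.argsort(guesses)[: max(2, len(candidates) // 3)]:
            values[i] = expensive_objective(candidates[i])
            archive_X.append(candidates[i]); archive_y.append(values[i])
    else:
        # Bootstrap phase: simulate everything until the archive is usable.
        values = [expensive_objective(c) for c in candidates]
        archive_X.extend(candidates); archive_y.extend(values)
    es.tell(candidates, values)

print("best well position found:", es.result.xbest)
print("expensive simulations used:", len(archive_X))
```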
20

Une approche déclarative pour la génération de modèles / A Declarative Approach for Model Generation

Ferdjoukh, Adel 20 October 2016 (has links)
Disposer de données dans le but de valider ou tester une approche ou un concept est d'une importance primordiale dans beaucoup de domaines différents. Malheureusement, ces données ne sont pas toujours disponibles, sont coûteuses à obtenir, ou bien ne répondent pas à certaines exigences de qualité, ce qui les rend inutiles dans certains cas de figure. Un générateur automatique de données est un bon moyen pour obtenir facilement et rapidement des données valides, de différentes tailles, pertinentes et diversifiées. Dans cette thèse, nous proposons une nouvelle approche complète, dirigée par les modèles et basée sur la programmation par contraintes, pour la génération de données. / Having data available with which to validate or test an approach or a concept is of primary importance in many different fields. Unfortunately, such data are rarely available, are costly to obtain, or are not adapted to most cases because of a lack of quality. An automated data generator is a good way to obtain, quickly and easily, data that are valid, of different sizes, realistic and diverse. In this thesis, we propose a novel and complete model-driven approach, based on constraint programming, for automated data generation.
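As a toy illustration of constraint-based model generation, the sketch below enumerates instances of a tiny metamodel and keeps only those satisfying its constraints. Brute-force enumeration stands in here for a real constraint solver, and the metamodel and constraints are made up; the thesis' approach relies on genuine constraint programming over full metamodels.

```python
from itertools import product

# Toy metamodel: N classes, each owning some number of attributes.
# Model generation = find assignments that satisfy the constraints.
N_CLASSES = 3
ATTR_DOMAIN = range(0, 4)            # candidate attribute counts per class

def satisfies_constraints(attrs):
    # Constraint 1: every class owns at least one attribute.
    if any(a == 0 for a in attrs):
        return False
    # Constraint 2: the whole model stays small (size constraint).
    return sum(attrs) <= 6

valid_models = [
    {f"Class{i}": {"n_attributes": a} for i, a in enumerate(assignment)}
    for assignment in product(ATTR_DOMAIN, repeat=N_CLASSES)
    if satisfies_constraints(assignment)
]

print(len(valid_models), "valid model instances, e.g.:", valid_models[0])
```

Varying the domains and constraints yields generated models of different sizes and shapes, which is the role the constraint solver plays in the proposed approach.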
