91 |
Checagem de conformidade arquitetural na modernização orientada a arquitetura / Architectural conformance checking in architecture-driven modernization. Chagas, Fernando Bezerra, 03 March 2016
No funding received. / Architecture-Driven Modernization (ADM) is a model-based initiative for standardizing reengineering processes. Its most important meta-model is KDM (Knowledge Discovery Metamodel), a platform- and language-independent ISO standard. An important step in Architecture-Driven Modernization is Architectural Conformance Checking (ACC), whose goal is to identify violations between the Planned Architecture (PA) and the Current Architecture (CA) of a system. Although there are ACC approaches that act on source code or proprietary models, there is none for systems represented as KDM. This absence hinders the dissemination of ADM and increases the interest in research that investigates the suitability of KDM in this context. Therefore, in this paper we present ArchKDM, a KDM-based ACC approach that relies exclusively on the KDM meta-model for representing i) the legacy system under analysis; ii) the PA; iii) the CA; and iv) the violations between them. ArchKDM is composed of three tool-supported steps: 1) Specifying the Planned Architecture; 2) Extracting the Current Architecture; and 3) Performing the Checking. Our goal is to investigate the suitability of KDM as the main representation in all ACC steps, as well as to deliver an ACC approach in the ADM context. We evaluated steps 2 and 3 of the approach using two real-world systems, and the results showed no false positives or false negatives. / Architecture-Driven Modernization (ADM) is an initiative for the standardization of reengineering processes. Among the metamodels created by ADM, the most important is KDM (Knowledge Discovery Metamodel), which is platform- and language-independent and is an ISO standard. An important step in an architecture-driven modernization is Architectural Conformance Checking (ACC), whose goal is to identify violations between the representations of the planned and current architectures of a system. Although there are ACC approaches that operate on source code and proprietary models, no evidence of such an approach was found for systems represented in KDM. This lack of research in the area hinders the dissemination of ADM and increases the interest in investigating the suitability of KDM in this context. Therefore, this work presents ArchKDM, a KDM-based ACC approach that relies exclusively on the KDM metamodel to represent i) the legacy system under analysis; ii) the planned architecture; iii) the current architecture; and iv) the violations found between them. ArchKDM is composed of three steps: 1) Specifying the Planned Architecture; 2) Extracting the Current Architecture; and 3) Architectural Conformance Checking. The goal of this work is to investigate the suitability of KDM as the main representation in all ACC steps, as well as to provide an ACC approach in the ADM context. The approach was evaluated using two real systems, and the results showed no false positives or false negatives.
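To make the checking step concrete, the sketch below shows, in plain Python, the general idea behind architectural conformance checking: planned dependencies are compared with the dependencies extracted from the implementation, and extra or missing dependencies are reported as violations. The module names and rules are invented for illustration; ArchKDM itself performs this comparison on KDM models, not on Python dictionaries.

```python
# Illustrative sketch only: a toy reflexion-style conformance check between a
# planned architecture (allowed module dependencies) and a current architecture
# (dependencies extracted from the implementation). Module names and rules are
# hypothetical; they do not come from the dissertation's case studies.

planned = {            # module -> modules it is allowed to depend on
    "view":       {"controller"},
    "controller": {"model"},
    "model":      set(),
}

current = {            # module -> dependencies found in the implementation
    "view":       {"controller", "model"},   # "view -> model" is not planned
    "controller": {"model"},
    "model":      set(),
}

def check_conformance(planned, current):
    divergences, absences = [], []
    for module, allowed in planned.items():
        found = current.get(module, set())
        divergences += [(module, dep) for dep in found - allowed]
        absences    += [(module, dep) for dep in allowed - found]
    return divergences, absences

if __name__ == "__main__":
    div, ab = check_conformance(planned, current)
    print("divergences (present but not planned):", div)
    print("absences    (planned but not present):", ab)
```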
|
92 |
UMA ABORDAGEM BASEADA NA ENGENHARIA DIRIGIDA POR MODELOS PARA SUPORTAR MERGING DE BASE DE DADOS HETEROGÊNEAS / AN APPROACH BASED IN MODEL DRIVEN ENGINEERING TO SUPPORT MERGING OF HETEROGENEOUS DATABASE. CARVALHO, Marcus Vinícius Ribeiro de, 24 February 2014
Model-Driven Engineering (MDE) aims to address the development, maintenance, and evolution of complex software systems by focusing on models and model transformations. This approach can be applied in other domains, such as database schema integration. In this research work, we propose a framework to integrate database schemas in the MDE context. Metamodels for defining the database model, database model matching, database model merging, and the integrated database model are proposed in order to support our framework. An algorithm for database model matching and an algorithm for database model merging are presented. We also present a prototype that extends the MT4MDE and SAMT4MDE tools in order to demonstrate the implementation of our proposed framework, methodology, and algorithms. An illustrative example helps to understand our proposed framework. / Model-Driven Engineering (MDE) supports the management of the complexity of software development, maintenance, and evolution through the creation and transformation of models. This approach can be used in other equally complex domains, such as the integration of database schemas. In this research work, we propose a methodology for integrating database schemas in the MDE context. Metamodels for defining the database model, database model matching, database model merging, and the integrated database model are proposed to support the methodology. An algorithm for database model matching and an algorithm for database model merging are presented. We also present a prototype that adapts and extends the MT4MDE and SAMT4MDE tools in order to demonstrate the implementation of the proposed framework, methodology, and algorithms. An illustrative example helps to better understand the presented methodology, serving to explain the metamodels and algorithms proposed in this work. A brief evaluation of the framework and future directions for this work are also presented.
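The two central operations named above, matching and merging of database models, can be pictured with a deliberately naive sketch: table names are paired by string similarity, and paired tables receive the union of their columns. The schemas, threshold, and similarity measure below are hypothetical and far simpler than the metamodels and algorithms defined in the dissertation.

```python
# Illustrative sketch only: naive name-based schema matching followed by a
# simple merge into an integrated schema. Not the dissertation's algorithms.
from difflib import SequenceMatcher

schema_a = {"customer": ["id", "name", "birth_date"]}
schema_b = {"customers": ["id", "full_name", "phone"]}

def sim(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_tables(sa, sb, threshold=0.7):
    """Pair tables of the two schemas whose names are similar enough."""
    pairs = []
    for ta in sa:
        best = max(sb, key=lambda tb: sim(ta, tb))
        if sim(ta, best) >= threshold:
            pairs.append((ta, best))
    return pairs

def merge(sa, sb, pairs):
    """Build an integrated schema: matched tables get the union of columns."""
    integrated = {}
    matched_b = {tb for _, tb in pairs}
    for ta, tb in pairs:
        integrated[ta] = list(dict.fromkeys(sa[ta] + sb[tb]))  # union, keep order
    # tables with no correspondence are copied unchanged
    for t, cols in list(sa.items()) + list(sb.items()):
        if t not in integrated and t not in matched_b:
            integrated[t] = cols
    return integrated

pairs = match_tables(schema_a, schema_b)
print(pairs)                        # [('customer', 'customers')]
print(merge(schema_a, schema_b, pairs))
```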
|
93 |
Metamodel based multi-objective optimization. Amouzgar, Kaveh, January 2015
As a result of the increased accessibility of computational resources and the increased power of computers during the last two decades, designers are able to create computer models to simulate the behavior of complex products. To address global competitiveness, companies are forced to optimize their designs and products. Optimizing a design requires many runs of computationally expensive simulation models. Therefore, using metamodels as efficient and sufficiently accurate approximations of the simulation model is necessary. Radial basis functions (RBF) are one of several metamodeling methods that can be found in the literature. The established approach is to add a bias to the RBF in order to obtain robust performance. The a posteriori bias is considered to be unknown at the beginning and is defined by imposing extra orthogonality constraints. In this thesis, a new approach to constructing RBF metamodels, in which the bias is set a priori by using the normal equation, is proposed. The performance of the suggested approach is compared to the classic RBF with a posteriori bias. Another comprehensive comparison study, including several modeling criteria such as problem dimension, sampling technique, and sample size, is also conducted. The studies demonstrate that the suggested approach with a priori bias performs, in general, as well as RBF with a posteriori bias. Using the a priori RBF, it is clear that the global response is modeled with the bias and that the details are captured with the radial basis functions. Multi-objective optimization and the approaches used to solve such problems are briefly described in this thesis. One of the methods that has proved efficient in solving multi-objective optimization problems (MOOPs) is the strength Pareto evolutionary algorithm (SPEA2). Multi-objective optimization of a disc brake system of a heavy truck using SPEA2 and RBF with a priori bias is performed. As a result, the possibility of reducing the weight of the system without extensive compromise in the other objectives is found. Multi-objective optimization of the material model parameters of an adhesive layer, with the aim of improving the results of a previous study, is also implemented. The result of the original study is improved, and a clear insight into the nature of the problem is revealed.
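One common way to realize the idea of an a priori bias, sketched below under simplifying assumptions, is to fit a linear bias first via the normal equation and let the radial basis functions interpolate the remaining residuals. The Gaussian kernel, shape parameter, and test function are arbitrary illustrative choices and not necessarily the exact formulation used in the thesis.

```python
# Minimal sketch (not the thesis's exact formulation): an RBF metamodel with an
# a priori linear bias. The bias coefficients come from the normal equation and
# the RBF weights are then fitted to the residuals. Sample data are arbitrary.
import numpy as np

def rbf_fit(X, y, eps=1.0):
    P = np.hstack([np.ones((len(X), 1)), X])          # bias basis [1, x1, ..., xd]
    beta = np.linalg.solve(P.T @ P, P.T @ y)           # a priori bias (normal equation)
    r = y - P @ beta                                    # residuals for the RBF part
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)                       # Gaussian kernel matrix
    w = np.linalg.solve(Phi, r)                         # RBF weights
    return beta, w

def rbf_predict(Xnew, X, beta, w, eps=1.0):
    P = np.hstack([np.ones((len(Xnew), 1)), Xnew])
    d = np.linalg.norm(Xnew[:, None, :] - X[None, :, :], axis=2)
    return P @ beta + np.exp(-(eps * d) ** 2) @ w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 2))                # training samples
    y = X[:, 0] ** 2 + np.sin(3 * X[:, 1])              # stand-in for an expensive model
    beta, w = rbf_fit(X, y)
    Xt = rng.uniform(-1, 1, size=(5, 2))
    print(np.c_[rbf_predict(Xt, X, beta, w), Xt[:, 0] ** 2 + np.sin(3 * Xt[:, 1])])
```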
|
94 |
Représentation de la variabilité des propriétés mécaniques d’un CMO à l’échelle microscopique : Méthodes de construction des distributions statistiques / Representation of CMO mechanical properties variability at the microscopic scale : Building methods of the statistical distributions. Chermaneanu, Raducu, 15 February 2012
Today, composite materials are very widely used, notably in aeronautical structures, thanks to their numerous functional advantages. Their specific mechanical characteristics (properties per unit density), clearly superior to those of more conventional materials such as steel or aluminium, and the possibility of producing complex shapes make these materials highly competitive candidates in many sectors beyond aeronautics. However, these materials exhibit, at different observation scales, sources of variability characteristic of each scale. The manufacturing process of the parts and the properties of the elementary constituents are their main causes. Three levels (or scales) of observation are usually considered in composite materials: the microscopic scale (fibres and matrix), the mesoscopic scale (ply), and finally the macroscopic scale (laminate of plies). The sources of variability propagate through the scales and ultimately generate scattered mechanical behaviours at the structure scale. Taking this variability into account is therefore relevant for the designer, who wishes to obtain an indicator of the reliability of the material or composite structure being designed. To do so, this variability must be transferred, at a low computational cost, from the microscopic scale up to the structure scale. Building distribution laws of the equivalent mechanical properties as a function of the variability present at each scale is then essential. The objective of this research work was to build distributions of the homogenized behaviour of the material at the scale of the fibres and the matrix, as a function of the variability existing at that scale. Reducing the computation time needed to obtain them was also targeted. From a microscopic observation of a cross-section of a CMO (organic-matrix composite), the morphological variability of the heterogeneous medium was characterized, and six different types of fibre-arrangement patterns, grouped into cells, were identified. Physically reasonable virtual cells were generated and proposed to establish distribution laws of the equivalent behaviour per cell type, as a function of the relevant variable parameters retained at this scale. Regarding the reduction of the computation time needed to build these distribution laws, an approach based on neural networks was proposed. This approach was illustrated on a type-6 cell with 1000 reference finite-element (FE) computations, in order to assess the quality of the approximation as well as the reduction in computation time. The reduction in computation time proved significant, with a time saving of about 95%. / Nowadays, composite materials are very widely used, notably in the domain of aeronautical structures, thanks to their numerous functional benefits. Their specific mechanical properties (properties/density), far superior to those of conventional materials such as steel or aluminum, and the possibility of realizing complex shapes make these materials strong candidates in many areas beyond aviation. However, these materials present, at different observation scales, sources of variability peculiar to each one. The manufacturing process and the properties of the elementary constituents are in fact the principal causes of these sources of variability. Three levels (or scales) of observation are usually considered for composite materials: the microscopic scale (fibers and matrix), the mesoscopic scale (ply), and finally the macroscopic scale (laminate). The sources of variability propagate through the scales and finally generate dispersed mechanical behaviors at the structure scale. Taking these sources into consideration is relevant for the designer, as it provides an indicator of the reliability of the composite structure being conceived. To do so, it is necessary to transfer this variability, at a low computational cost, from the microscopic level up to the structure scale. The construction of distributions of the equivalent mechanical properties, according to the variability present at each scale, is then essential. The objective of this research work was to build statistical distributions of the homogenized behavior of the material at the scale of fibers and matrix, according to the existing variability at this scale. Minimizing the computation time required to obtain these distributions was another important objective. From a microscopic observation made on a section of a CMO, the morphological variability of the heterogeneous medium was characterized, and six different types of fiber arrangement patterns, grouped into cells, were identified. Physically reasonable virtual cells were developed and proposed in order to build the equivalent behavior distribution by cell type, according to the relevant variables selected at this scale. To minimize the computing time required to create these distributions, an approach based on neural networks was proposed. This approach was applied to a type-6 cell with 1000 reference FE calculations, in order to evaluate the quality of the approximation as well as the reduction in computation time. The reduction in computation time was significant, at an approximate rate of 95%.
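The neural-network shortcut described above can be pictured with a toy surrogate: a small one-hidden-layer network trained on samples of a cheap analytic function that stands in for the finite-element homogenization runs. The architecture, sample size, and learning rate below are invented for illustration only.

```python
# Illustrative sketch only: a tiny one-hidden-layer neural network trained by
# gradient descent as a surrogate for an "expensive" model. A cheap analytic
# function stands in for the FE homogenization; nothing here reproduces the thesis.
import numpy as np

rng = np.random.default_rng(1)

def expensive_model(x):                      # stand-in for an FE computation
    return np.sin(2 * x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.uniform(-1, 1, size=(200, 2))        # (input parameters, response) samples
y = expensive_model(X).reshape(-1, 1)

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)   # hidden layer (tanh)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)    # linear output layer
lr = 0.05

for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)                 # forward pass
    pred = H @ W2 + b2
    err = pred - y                           # gradient of the squared-error loss
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)         # backpropagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

Xt = rng.uniform(-1, 1, size=(5, 2))
approx = np.tanh(Xt @ W1 + b1) @ W2 + b2
print(np.c_[approx.ravel(), expensive_model(Xt)])     # surrogate vs. reference
```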
|
95 |
UMA ABORDAGEM PARA AVALIAÇÃO DA QUALIDADE DE ARTEFATOS DE SOFTWARE / AN APPROACH FOR ASSESSING THE QUALITY OF SOFTWARE ARTIFACTS. Bertuol, Gelson, 27 August 2014
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / While applications and software systems have evolved and become more complex, mainly due to the increasing demands of customers and users, the organizations that produce or acquire them have sought alternatives to reduce costs and delivery times without affecting the quality of the final product. However, in order to make the evaluation of these products more effective, it is important to use a quality model that allows the evaluation to be structured in a way that satisfies, among other requirements, the heterogeneous expectations of the stakeholders. At the same time, it is recommended to start this evaluation as early as possible, in the first stages of the development process, in order to detect and fix problems before they propagate. In this sense, this work presents a study of quality models used in the evaluation of software products and proposes the assessment of the software artifacts generated and/or transformed by activities throughout the lifecycle of a software process. The proposal is based on a quality framework, structured around a metamodel, which relates the evaluation process to the several characteristics that involve the artifacts, such as their purposes, stakeholders, methods, and corresponding metrics. The work also comprises a supporting tool whose purpose is to guide evaluators in defining a plan for assessing the quality of those artifacts. Finally, the proposal was submitted to validation through a case study involving graduate students of the Federal University of Santa Maria. / At the same time as applications and software systems have evolved and become more complex, mainly due to the growing demands of customers and users, the organizations that produce or acquire them have sought alternatives to reduce costs and delivery times without affecting the quality of the final product. However, for the evaluation of these products to be more effective, it is important to use a quality model that allows the evaluation to be structured in a way that satisfies, among other requirements, the heterogeneous expectations of the stakeholders. In parallel, it is recommended to start this evaluation as early as possible, in the first stages of a development process, with the goal of detecting and correcting the problems found before they propagate. In this sense, this work presents a study of quality models employed in the evaluation of software products and, at the same time, proposes the evaluation of the artifacts generated and/or transformed by activities throughout the lifecycle of a development process. The proposal is based on a quality framework, structured from a metamodel, which relates the evaluation process to the various characteristics involving the artifacts, such as their purposes, stakeholders, methods, and corresponding metrics. The work also includes a supporting tool whose purpose is to guide evaluators in defining a quality evaluation plan for such artifacts. Finally, the proposal was evaluated and validated through a case study in which graduate students in computer science assessed three real applications developed by undergraduate students of the Federal University of Santa Maria.
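A minimal, hypothetical object model in the spirit of the metamodel described above might relate artifacts to purposes, stakeholders, methods, and metrics, and derive an evaluation plan from those relations, as in the sketch below. The class and attribute names are invented for illustration and do not reproduce the framework's actual metamodel.

```python
# Illustrative sketch only: a toy object model relating artifacts, stakeholders,
# methods, and metrics, plus a trivial derivation of an evaluation plan.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    method: str          # how the measurement is obtained (e.g. inspection)
    threshold: float     # acceptance level used by the evaluation

@dataclass
class Artifact:
    name: str
    purpose: str
    stakeholders: list[str]
    metrics: list[Metric] = field(default_factory=list)

def evaluation_plan(artifacts):
    """List one evaluation task per (artifact, metric) pair."""
    return [
        {"artifact": a.name, "metric": m.name, "method": m.method,
         "stakeholders": a.stakeholders, "accept_if_at_least": m.threshold}
        for a in artifacts for m in a.metrics
    ]

use_cases = Artifact(
    name="Use-case specification",
    purpose="Capture functional requirements",
    stakeholders=["analyst", "customer"],
    metrics=[Metric("completeness", "checklist inspection", 0.9),
             Metric("ambiguity", "peer review", 0.8)],
)

for task in evaluation_plan([use_cases]):
    print(task)
```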
|
96 |
Composición y modelos exógenos : aplicación en la música contemporánea española : la metamodelización en Alberto Posadas (1967) y Hèctor Parra (1976) / Composition et modèles exogènes : application à la musique contemporaine espagnole / Composition and exogenous models applied to Spanish contemporary music : Alberto Posadas’s (1967) and Hèctor Parra’s (1976) metamodelling. Besada Portas, José Luis, 15 July 2015
Nowadays, many composers draw inspiration from science as a starting point for creation, although no de facto standardization of the observable practices is possible. What minimal cognitive dispositions and formal implications take part in any transfer of a model from science to a musical model during a composer's creative practices? Given the previous question, what analytical and methodological approach should the musicologist adopt when studying the works arising from these compositional practices? The first part of the thesis aims to build the theoretical and analytical framework needed to answer these two questions. To this end, the discussion draws on mathematical Model Theory, on cognitive linguistics to address questions related to metaphor, and on the notion of "metamodel" from computer science and cognition studies. Given the research questions, the thesis turns to the work of Spanish composers as case studies in order to test its methodological postulates: on the one hand, the Castilian Alberto Posadas (Valladolid, 1967); on the other, the Catalan Hèctor Parra (Barcelona, 1976). The first composer draws notably on mathematics to write his music: the thesis analyses the way Posadas uses fractal geometry to compose. The second borrows models from physics and biology: the thesis focuses on his creative practices inspired by Cairns-Smith's abiogenetic theories. Genetic analysis of music is the methodological tool employed to this end. / It is a fact that some composers find inspiration in science, but this practice is not at all standardized. What minimal cognitive dispositions and formal implications take part in any transfer from a scientific model to a musical one during composers' creative practices? Given the previous question, what methodological and analytical stance should the musicologist take in order to efficiently address the study of musical works originating from those practices? The goal of this thesis is to develop the methodological and analytical framework needed to answer both questions. For that purpose, the discussion draws on mathematical Model Theory, on conceptual metaphor from cognitive linguistics, and on the term "metamodel" from computer and cognition studies. These issues also lead to the analysis of the work of two Spanish composers, used to test the theoretical proposition: Alberto Posadas (Valladolid, 1967) and Hèctor Parra (Barcelona, 1976). The first often finds his inspiration in mathematics: the dissertation analyses the way he uses fractal geometry to compose. The second borrows inspiration from physics and biology: the thesis focuses on his creative practices linked to Cairns-Smith's abiogenetic theories. Genetic analysis of music is the main methodological tool used for this purpose.
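As a purely didactic illustration of what "composing with fractal geometry" can mean in principle, the sketch below generates a self-similar contour by midpoint displacement and reads it as a rough sequence of MIDI pitches. It does not reproduce Posadas's actual techniques or any material analysed in the thesis.

```python
# Didactic sketch only: midpoint displacement, a standard way to generate a
# self-similar (fractal-like) profile, here read as a rough pitch contour in
# MIDI note numbers. All parameter choices are arbitrary.
import random

def midpoint_displacement(levels=5, start=60.0, end=72.0, roughness=6.0, seed=42):
    random.seed(seed)
    contour = [start, end]
    amplitude = roughness
    for _ in range(levels):
        refined = []
        for a, b in zip(contour, contour[1:]):
            mid = (a + b) / 2 + random.uniform(-amplitude, amplitude)
            refined += [a, mid]
        refined.append(contour[-1])
        contour = refined
        amplitude /= 2          # halve the displacement at each refinement level
    return contour

pitches = [round(p) for p in midpoint_displacement()]
print(pitches)                  # a self-similar contour of MIDI pitches
```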
|
97 |
Umple C++ Code Generator. Sultan, Almaghthawi, January 2013
We discuss the design and analysis of a code generator for C++, implemented in the Umple model-oriented programming technology. Umple adds UML constructs and patterns to various base programming languages such as Java and PHP. Umple code generators create code for those constructs, which can include UML associations and state machines, as well as patterns such as immutable and singleton. Base language methods are passed through unchanged along with the generated code. Creating a C++ code generator for Umple posed many challenges, all of which are discussed in this thesis: we had to focus on the appropriate C++ idioms and stylistic conventions to follow, and we followed a test-driven development process to ensure that the resulting code was correct. To evaluate the work, we compared our C++ generator with those in other tools such as ArgoUML and IBM Rational Software Architect. We conclude that our C++ generator is superior in many ways to these widely used tools because it is more complete and generates better-quality code.
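The sketch below illustrates the general idea of model-to-text generation with a toy generator that turns a tiny class model into a C++ class skeleton, holding a one-to-many association end in a std::vector. It is not Umple's actual generator, templates, or output; the class model and naming conventions are invented.

```python
# Toy sketch of model-to-text code generation: a tiny class model with a
# one-to-many association is rendered as a C++ class skeleton. Hypothetical
# example only; this is not Umple's generator or its output.
def generate_cpp_class(name, attributes, many_assoc=None):
    lines = ["#include <string>", "#include <vector>", "",
             f"class {name} {{", "public:"]
    for attr_type, attr_name in attributes:
        lines.append(f"  {attr_type} get{attr_name.capitalize()}() const "
                     f"{{ return {attr_name}_; }}")
    if many_assoc:
        lines.append(f"  void add{many_assoc}({many_assoc}* item) "
                     f"{{ {many_assoc.lower()}s_.push_back(item); }}")
    lines.append("private:")
    for attr_type, attr_name in attributes:
        lines.append(f"  {attr_type} {attr_name}_;")
    if many_assoc:
        lines.append(f"  std::vector<{many_assoc}*> {many_assoc.lower()}s_;")
    lines += ["};", ""]
    return "\n".join(lines)

# a "Student -- 0..* CourseSection" style association, rendered as C++ text
print(generate_cpp_class("Student",
                         [("std::string", "name"), ("int", "id")],
                         many_assoc="CourseSection"))
```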
|
98 |
Métamodélisation et optimisation de dispositifs photoniques / Metamodeling and optimization of photonics devices. Durantin, Cédric, 28 May 2018
Numerical simulation is commonly used to study the behaviour of a component and to optimize its design. However, each computation is often costly in terms of time, and optimization requires solving the numerical model a large number of times for different configurations of the component. A current solution for reducing the computation time is to replace the costly simulation with a metamodel. Strategies are then put in place to optimize the component from the metamodel. Within the framework of this thesis, three devices representative of the applications that can be handled at CEA LETI are identified. The study of these cases establishes two problems to be solved. The first concerns multi-fidelity metamodeling, which consists of building a metamodel from two simulations of the same component with different accuracies. The simulations are obtained from different approximations of the physical phenomenon and lead to a so-called high-fidelity model (accurate and costly) and a low-fidelity model (coarse and fast to evaluate). The work on this method for the case of the photoacoustic cell led to the development of a new multifidelity metamodel based on radial basis functions. The second problem concerns taking manufacturing uncertainties into account in the design of photonic devices. Optimizing component performance while accounting for the deviations observed between the desired geometry and the geometry obtained in manufacturing required the development of a specific method for the case of the adiabatic coupler. / Numerical simulation is widely employed in engineering to study the behavior of a device and optimize its design. Nevertheless, each computation is often time consuming and, during an optimization sequence, the simulation code is evaluated a large number of times. An interesting way to reduce the computational burden is to build a metamodel (or surrogate model) of the simulation code. Adaptive strategies are then set up for the optimization of the component using the metamodel prediction. In the context of this thesis, three representative devices are identified for applications that can be encountered within the CEA LETI optics and photonics department. The study of these cases resulted in two problems to be treated. The first concerns multifidelity metamodeling, which consists of constructing a metamodel from two simulations of the same component that can be hierarchically ranked in accuracy. The simulations are obtained from different approximations of the physical phenomenon. The work on this method for the case of the photoacoustic cell led to the development of a new multifidelity surrogate model based on radial basis functions. The second problem relates to the consideration of manufacturing uncertainties in the design of photonic devices. Taking into account the differences observed between the desired geometry and the geometry obtained in manufacturing, for the optimization of component efficiency, required the development of a particular method for the case of the adiabatic coupler. The entire work of this thesis is capitalized in a software toolbox.
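A common way to combine two fidelity levels, shown below as a simplified sketch, is an additive correction: an RBF interpolant of the discrepancy observed at a few high-fidelity samples is added to the cheap low-fidelity model. The toy models, kernel, and sample sizes are invented; this is not the multifidelity metamodel developed in the thesis.

```python
# Illustrative sketch only: an additive-correction multi-fidelity surrogate.
# A cheap low-fidelity model is corrected by an RBF interpolant of the
# discrepancy at a few high-fidelity samples. All models are toy stand-ins.
import numpy as np

def f_high(x):                       # stand-in for the accurate, costly simulation
    return np.sin(8 * x) + x

def f_low(x):                        # stand-in for the coarse, cheap simulation
    return np.sin(8 * x + 0.3) + 0.8 * x

def rbf_interp(xc, yc, eps=4.0):
    Phi = np.exp(-(eps * (xc[:, None] - xc[None, :])) ** 2)
    w = np.linalg.solve(Phi, yc)
    return lambda x: np.exp(-(eps * (x[:, None] - xc[None, :])) ** 2) @ w

x_hf = np.linspace(0.0, 1.0, 6)               # few expensive samples
delta = rbf_interp(x_hf, f_high(x_hf) - f_low(x_hf))

def f_multifidelity(x):
    return f_low(x) + delta(x)                # cheap model + learned correction

x_test = np.linspace(0.0, 1.0, 5)
print(np.c_[f_multifidelity(x_test), f_high(x_test)])
```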
|
99 |
Towards multifidelity uncertainty quantification for multiobjective structural design. Lebon, Jérémy, 12 December 2013
This thesis aims at Multi-Objective Optimization under Uncertainty in structural design. We investigate Polynomial Chaos Expansion (PCE) surrogates, which require extensive training sets. We then face two issues: the high computational cost of an individual Finite Element simulation and its limited precision. From a numerical point of view, and in order to limit the computational expense of the PCE construction, we focus in particular on sparse PCE schemes. We also develop a custom Latin Hypercube Sampling scheme that takes into account the finite precision of the simulation. From the modeling point of view, we propose a multifidelity approach involving a hierarchy of models ranging from full-scale simulations through reduced-order physics up to response surfaces. Finally, we investigate the multiobjective optimization of structures under uncertainty. We extend the PCE model of the design objectives by taking into account the design variables. We illustrate our work with examples in sheet metal forming and the optimal design of truss structures. / Doctorate in Engineering Sciences
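Latin Hypercube Sampling, mentioned above as a way to generate training sets for the surrogates, can be written down in a few lines. The sketch below is the standard textbook construction (one point per stratum in every dimension), not the custom scheme developed in the thesis; the dimensions and sample size are arbitrary.

```python
# Illustrative sketch only: basic Latin Hypercube Sampling on the unit cube.
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """One point per row; each column places one point in each of n strata."""
    if rng is None:
        rng = np.random.default_rng()
    samples = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        strata = (np.arange(n_samples) + rng.uniform(size=n_samples)) / n_samples
        samples[:, d] = rng.permutation(strata)
    return samples

X = latin_hypercube(10, 3, rng=np.random.default_rng(0))
print(X)
# each column has exactly one point in each of the 10 equal-width bins
print(np.sort((X * 10).astype(int), axis=0).T)
```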
|
100 |
Multidisciplinary Design Optimization of Automotive Structures. Domeij Bäckryd, Rebecka, January 2013
Multidisciplinary design optimization (MDO) can be used as an effective tool to improve the design of automotive structures. Large-scale MDO problems typically involve several groups who must work concurrently and autonomously for reasons of efficiency. When performing MDO, a large number of designs need to be rated. Detailed simulation models used to assess automotive design proposals are often computationally expensive to evaluate. A useful MDO process must distribute work to the groups involved and be computationally efficient. In this thesis, MDO methods are assessed in relation to the characteristics of automotive structural applications. Single-level optimization methods have a single optimizer, while multi-level optimization methods have a distributed optimization process. Collaborative optimization and analytical target cascading are possible choices of multi-level optimization methods for automotive structures. They distribute the design process, but are complex. One approach to handling the computationally demanding simulation models involves metamodel-based design optimization (MBDO), where metamodels are used as approximations of the detailed models during optimization studies. Metamodels can be created by individual groups prior to the optimization process, and therefore also offer a way of distributing work. A single-level optimization method in combination with metamodels is concluded to be the most straightforward way of implementing MDO in the development of automotive structures.
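The basic metamodel-based design optimization loop can be sketched as: sample a few designs, evaluate the expensive model, fit a cheap surrogate, and optimize on the surrogate. In the illustration below a quadratic response surface and a toy objective stand in for the detailed simulation models; none of it reflects an actual automotive application.

```python
# Illustrative sketch only: a minimal MBDO loop with a quadratic response
# surface as the metamodel and a cheap analytic stand-in for the expensive model.
import numpy as np

rng = np.random.default_rng(3)

def expensive_simulation(x):            # stand-in for a costly FE evaluation
    return (x[:, 0] - 0.3) ** 2 + 2 * (x[:, 1] - 0.7) ** 2 + 0.1 * x[:, 0] * x[:, 1]

# 1) sample a small design of experiments in [0, 1]^2
X = rng.uniform(0, 1, size=(20, 2))
y = expensive_simulation(X)

# 2) fit a quadratic response surface by least squares
def basis(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# 3) optimize cheaply on the metamodel (dense random search here)
candidates = rng.uniform(0, 1, size=(20000, 2))
best = candidates[np.argmin(basis(candidates) @ coef)]

print("metamodel optimum:", best)
print("true objective there:", expensive_simulation(best[None, :])[0])
```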
|