1 |
The Effects of Gestalt Principles on Diagram Comprehension: An Empirical Approach
Wilson, Krystle Dianne 15 December 2012 (has links)
In the software engineering process, software engineers design software documents, analyze those documents, and comprehend component relationships within software diagrams. These diagrams represent the software architecture, which models the structure, behavior, relationships, and constraints among system components while ignoring implementation detail. In the software lifecycle, the system is implemented from the software architecture, and errors caused by a lack of comprehension or by incorrect comprehension can lead engineers to design the system incorrectly. These errors can be defined as lapses, slips, or misunderstandings and fall into three categories: skill, rule, and knowledge errors. The Gestalt principles of organization, from the cognitive science domain, describe how humans perceive the world around them. This dissertation seeks to identify whether the Gestalt principles of continuity, similarity of size, proximity, and similarity of name affect comprehension of Unified Modeling Language (UML) class diagrams. Diagram comprehension is measured by response time and subject accuracy on questions, and by the mental workload perceived by subjects while answering questions about the diagrams. The research hypotheses are that diagrams utilizing the Gestalt principles of continuity, similarity of size, proximity, and similarity of name will yield faster response times, higher accuracy, and lower mental workload scores than diagrams that do not. The results indicate that the Gestalt principle of proximity eased diagram comprehension. Applying this design principle also applies the Gestalt principle of continuity, because line crossings, line bends, and line length are minimized.
Subjects were prone to make more errors on knowledge-based questions, which dealt with system understanding and UML semantics, than on skill and rule questions, which dealt with system connections and UML syntax. These results provide software designers with heuristics that can lead to better diagram design and identify software engineering tasks that are prone to more errors.
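The continuity principle mentioned above is tied to minimizing line crossings in a diagram layout. As an illustration, a small Python sketch that counts crossings among straight-line edges (a hypothetical metric, not the instrument used in the study):

```python
def _ccw(a, b, c):
    # Positive if the triple a, b, c makes a counter-clockwise turn.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    # Proper intersection test for two straight-line segments.
    d1 = _ccw(q1, q2, p1)
    d2 = _ccw(q1, q2, p2)
    d3 = _ccw(p1, p2, q1)
    d4 = _ccw(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def crossing_count(edges):
    # Count pairwise crossings among edges, each edge a ((x, y), (x, y)) pair.
    crossings = 0
    for i in range(len(edges)):
        for j in range(i + 1, len(edges)):
            if segments_cross(*edges[i], *edges[j]):
                crossings += 1
    return crossings
```

A layout tool minimizing this count would, per the dissertation's observation, be applying the continuity principle as a side effect.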
|
2 |
SimITK: Model Driven Engineering for Medical Imaging
Trezise, Melissa 06 August 2013 (has links)
The Insight Segmentation and Registration Toolkit (ITK) is a widely used open-source medical imaging library. Written in C++, ITK chiefly provides functionality to register, segment, and filter medical images. Although extremely powerful, ITK has a very steep learning curve for users with little or no programming background. It was for this reason that SimITK was developed. SimITK wraps ITK into the model-driven engineering environment Simulink, part of the Matlab development suite. The first released version of SimITK was a proof of concept and demonstrated that ITK could be wrapped successfully in Simulink. Very few segmentation and registration functions were available, and the system was based on ITK version 3 with a semi-automatic wrapping procedure.
In this thesis a new version of SimITK is presented that wraps and tests thirty-seven image filter, twelve optimizer, and nineteen transform classes from ITK version 4. These classes were chosen to represent a broad range of usability (in the case of the filters) and to allow greater flexibility when creating registration pipelines by offering more options for optimizers, transforms, and metrics. Many usability improvements were also implemented for the registration pipeline, including reporting the metric value to the user while a registration model executes and allowing the output image size to be specified for certain filters. For SimITK to become a usable research tool, several further improvements were needed: transitioning from wrapping ITK version 3 to ITK version 4, fully automating the wrapping process, and modifying the registration pipeline, including adding a metric value output.
The automated wrapping procedure for ITK version 4 and the improved usability of the registration pipeline have put SimITK on a path towards becoming a usable research tool. The author will release these changes and update the installation documentation and tutorials available at www.SimITKVTK.com / Thesis (Master, Computing) -- Queen's University, 2013-08-05 10:15:16.607
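The wrapping idea can be pictured, very loosely, as turning each processing class into a block with declared input and output ports that a model then executes in sequence. A toy Python sketch of that notion (all names hypothetical; the real wrapper targets C++ ITK classes inside Simulink):

```python
import inspect

def wrap_as_block(func):
    # Derive a block description from a plain function: one input port
    # per parameter, one output port for the result.
    params = list(inspect.signature(func).parameters)
    return {"name": func.__name__, "inputs": params, "outputs": ["out"], "run": func}

def run_pipeline(blocks, value):
    # Feed a value through a chain of single-input blocks, as a model would.
    for block in blocks:
        value = block["run"](value)
    return value

def threshold(image):
    # Stand-in for an ITK filter: binarize pixel values at 0.5.
    return [1 if p > 0.5 else 0 for p in image]

def invert(image):
    # Stand-in for a second filter stage.
    return [1 - p for p in image]
```

Fully automating this kind of block generation over the real ITK class hierarchy is what distinguishes the new version from the earlier semi-automatic procedure.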
|
3 |
Interprétations de la composition d'activités
Blay-Fornarino, Mireille 16 April 2009 (has links) (PDF)
Our research focuses on the reliable composition of software activities. It falls within the general framework of software engineering and concerns the design, implementation, and adaptation stages of distributed software applications. The central theme is the modelling of activities and their composition while guaranteeing various properties. The specificity and originality of this work is to have approached composition not through languages but through the key elements of composition, independently of specific implementations. By grounding this work in a formal approach, we propose a unified view of a composition process and characterize the differences in terms of interpretations. By supporting this formalization with several implementations, both as programming environments and as applications, we anchor the approach in functional reality. The main applications of this work concern the composition of interactions between heterogeneous components and the composition of workflows.
|
4 |
A framework for the analysis of failure behaviors in component-based model-driven development of dependable systems
Javed, Muhammad Atif, Faiz UL Muram January 2011 (has links)
Currently, the development of high-integrity embedded component-based software systems is not supported by well-integrated means for quality evaluation and design support within a development process. Quality, especially dependability, is very important for such systems. The CHESS (Composition with Guarantees for High-integrity Embedded Software Components Assembly) project aims at providing a new systems development methodology to capture extra-functional concerns and to extend Model Driven Engineering industrial practices and technologies to specifically address the architectural structure, interactions, and behavior of system components while guaranteeing their correctness and level of service at run time. The CHESS methodology is expected to be supported by a tool-set consisting of a set of plug-ins integrated within the Eclipse IDE. In the framework of the CHESS project, this thesis addresses the lack of well-integrated means for quality evaluation and proposes an integrated framework to evaluate the dependability of high-integrity embedded systems. After a survey of various failure behavior analysis techniques, a specific technique, called Failure Propagation and Transformation Calculus (FPTC), is selected and a plug-in, called CHESS-FPTC, is developed within the CHESS tool-set. The FPTC technique allows users to calculate the failure behavior of a system from the failure behavior of its building components. To fully support FPTC, the CHESS-FPTC plug-in lets users model the failure behavior of the building components, perform the analysis automatically, and get the analysis results back into their initial models. A case study on the AAL2 signaling protocol is presented to illustrate and evaluate the CHESS-FPTC framework. / CHESS Project - http://chess-project.ning.com/
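The FPTC approach mentioned in this abstract assigns each component rules mapping incoming failure modes to outgoing ones, then derives the system's failure behaviour from the composition. A much-simplified sketch for a linear component chain (the rule encoding is hypothetical, not actual FPTC notation):

```python
def component_output(rules, incoming):
    # Apply a component's failure rules: transform known failure modes,
    # fall back to a '*' default rule, otherwise propagate unchanged.
    out = set()
    for failure in incoming:
        out.add(rules.get(failure, rules.get("*", failure)))
    return out

def system_behaviour(chain, source_failures):
    # Push failure modes through a linear chain of components; the real
    # calculus computes a fixed point over an arbitrary component graph.
    failures = set(source_failures)
    for rules in chain:
        failures = component_output(rules, failures)
    return failures
```

For example, a component that turns a "late" input into an "omission" output, followed by one with no rules, yields {"omission"} at the system boundary.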
|
5 |
TEST DERIVATION AND REUSE USING HORIZONTAL TRANSFORMATION OF SYSTEM MODELS
KAVADIYA, JENIS January 2010 (has links)
No description available.
|
6 |
A domain specific language to support the definition of transformation rules for software process tailoring
Silvestre Quiroga, Luis Gregorio January 2018 (has links)
Doctor en Ciencias, Mención Computación / Software process tailoring is the activity of adapting an organization's software process to the needs of particular projects. Model-driven engineering (MDE) has been applied to this end, using models to formalize the software process and the project context, and model transformations to tailor these processes.
Although MDE-based tailoring has proven technically feasible, its use in practice requires knowledge of how to tailor processes and how to build models and transformations.
Some proposals exist for the automatic generation of transformations as a way to reduce the complexity of tailoring software processes. These proposals usually generate transformations only partially, which must then be completed manually. Such approaches are not well suited to software process tailoring because they do not fully overcome the technical barriers to adoption.
To face these challenges, this thesis proposes an automatic transformation-generation approach that addresses both the formality required by MDE and the usability needed by the process engineers in charge of tailoring. To this end, we specify tailoring rules using a domain-specific language (DSL). In addition, we define a higher-order transformation (HOT) that takes the specified tailoring rules as input and automatically generates the required process-tailoring transformation. Both the DSL and the HOT are generic and can therefore be reused in any organization. To improve usability, we developed an integrated toolset (ATAGeTT) that incorporates both contributions.
ATAGeTT was applied in an exploratory case study in two small software companies to evaluate its capability and the correctness of tailoring their processes. The results show that the users were able to specify all the required tailoring rules.
A case study was then carried out in another company to validate the usability of ATAGeTT and the expressiveness of the proposed decision language. Users were able to specify all the tailoring rules in a simple way and to execute the process tailoring automatically. The results show that ATAGeTT is easy to learn, usable, and useful for its potential users. Although the results are not yet sufficient, they are highly positive and consistent; we therefore expect this proposal to help improve this activity, particularly in small and medium-sized organizations, which are usually more constrained in tailoring their software processes. / This work was funded by grant CONICYT-PFCHA/Doctorado Nacional para Extranjeros/2013-63130130, and partially supported by projects FONDEF D09I-1171 (ADAPTE) and FONDEF IDeA IT13I20010 (GEMS), and the NIC Chile scholarship program
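The tailoring rules captured by such a DSL can be thought of as condition/action pairs evaluated against the project context. A hedged Python sketch of the idea (the rule format is hypothetical, not ATAGeTT's actual language; the real approach generates a model transformation rather than editing a task list):

```python
def tailor_process(process, context, rules):
    # Apply each rule whose context condition holds; actions drop or
    # replace process tasks, yielding the tailored process.
    tasks = list(process)
    for condition, action, task, replacement in rules:
        if condition(context):
            if action == "drop" and task in tasks:
                tasks.remove(task)
            elif action == "replace" and task in tasks:
                tasks[tasks.index(task)] = replacement
    return tasks
```

A higher-order transformation, in this analogy, would be the code that reads such rules and emits `tailor_process` itself, so process engineers only ever write the rules.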
|
8 |
Calcul intensif pour l'évaluation de la vulnérabilité en utilisant une approche d'Ingénierie Dirigée par les Modèles : application à la vulnérabilité des prairies au changement climatique sous contraintes de plans d'expériences / Intensive calculation for vulnerability assessment using a Model Driven Engineering approach: application to prairie vulnerability to climate change under experimental design constraints
Lardy, Romain 13 May 2013 (has links)
Vulnerability is the degree to which human or environmental systems are likely to experience harm due to a perturbation or a stress. In recent years, it has become a central focus of global change research (including climate change). Assessing the vulnerability of agro-ecosystems to climate change is one of the priority areas of the French National Institute of Agronomic Research (INRA). The climate change literature contains many explanations of vulnerability, stemming from the notion of sensitivity to more complex ideas, taking into account the exposure history of the system up to the residual impacts of climate change after adaptation. In the framework of the activities of the INRA Grassland Ecosystem Research Unit (UREP) of Clermont-Ferrand, interest is on the vulnerability of grassland and livestock systems against the risk of reduced milk and forage production, and against the problem of increased greenhouse gas emissions that comes with the production of grassland ecosystem services.
Vulnerability assessment has similarities with sensitivity analysis and is based on simulations of the target system, forced to respond to changes in stress factors. Due to the cascade of uncertainties in climate change impact assessment, a large number of simulations are necessary. In this context, the need to reduce user waiting time calls for the design of an appropriate experimental plan, as well as the use of high-performance computing. Moreover, vulnerability assessment may consist of many steps, such as designing the experiment (choice of agro-ecological model, variables of interest, scenarios, reference thresholds, parameter distributions, ...), designing the experimental plans, regressing response surfaces, computing metrics (e.g. vulnerability indices) and optimizing vulnerability (through designing and evaluating adaptation measures). To our knowledge, no specific tool has been built or validated to facilitate the implementation of most of these tasks. Thus, the goal of this thesis was to propose a generic method to perform a comprehensive vulnerability analysis under climate change.
The work in this thesis began with a review of the concept of vulnerability and the proposal of a generic approach, based on a critical synthesis of the state of the art. Then, with a Model Driven Engineering approach, we developed a computer tool for vulnerability analysis. This tool, implemented with the Eclipse Modeling Framework (EMF), is generic and modular and allows the distribution and interpretation of simulation results. Finally, application examples of climate change vulnerability assessment were carried out with the previously proposed solutions. This approach relied, in particular, on the use of the grassland ecosystem biogeochemical model PaSim ([Riedo et al., 1998], [Vuichard 2007a], [Graux 2011]).
|
9 |
Génération stratégique de code pour la maîtrise des performances de systèmes temps-réel embarqués / Strategic generation of code to master the performances of real-time embedded systems
Cadoret, Fabien 26 May 2014 (has links)
We focused on real-time embedded critical systems (RTECS), which raise problems of criticality, respect of time constraints, and availability of resources such as memory. To master the design complexity of such systems, Model Driven Engineering (MDE) proposes to model them for analysis purposes and to generate, partially or totally, their execution code. However, these two phases must be correctly connected to ensure that the generated code still enforces all the properties of the model initially analysed. In addition, the code generator must adapt to several criteria: in particular, to ensure the respect of performance requirements or to target different execution platforms, which have their own execution constraints and semantics. To realize such an adaptation, the development process requires the transformation rules to evolve according to these criteria. Its architecture also needs to allow the selection of generated software components respecting these criteria. We answer this problem by proposing a generation process based on MDE. When the user has specified and validated a high-level model, a model transformation automatically translates this model into a second, detailed model close to the generated code.
To ensure the conservation of the requirements, the detailed model is expressed in the same formalism as the initial model so that it remains analysable (by the same tools initially used). This approach determines the impact of the code generation strategy on the performance of the final system and allows the generator to adapt its strategy, at a given stage, to ensure the respect of the system constraints. To facilitate the development and selection of alternative strategies, we propose a methodology articulated around a formalism for the orchestration of transformations, a set of transformation patterns (which factorize and generalize the transformation rules), and an adaptation of software components according to their impact on performance. We implemented this process within the OSATE environment, for which we developed the RAMSES framework (Refinement of AADL Models for Synthesis of Embedded Systems). We experimented with it on the code generation of communications between tasks, for which several implementation strategies were defined.
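The strategy switching described above can be pictured as the generator scoring alternative implementations of one transformation step against the system's constraints. A toy sketch (strategy names and cost figures hypothetical, unrelated to RAMSES internals):

```python
def pick_strategy(strategies, constraints):
    # Keep strategies whose memory footprint fits the budget, then take
    # the fastest remaining one; the generator would emit its code.
    feasible = [s for s in strategies
                if s["memory"] <= constraints["memory_budget"]]
    if not feasible:
        raise ValueError("no strategy satisfies the constraints")
    return min(feasible, key=lambda s: s["latency"])

# Two hypothetical implementations of inter-task communication.
strategies = [
    {"name": "shared-buffer", "memory": 64, "latency": 2},
    {"name": "message-queue", "memory": 16, "latency": 5},
]
```

The point of keeping the detailed model analysable is precisely that such costs can be measured on it, stage by stage, before the final code is committed.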
|
10 |
Industrialising software development in systems integration
Minich, Matthias Ernst January 2013 (has links)
Compared to other disciplines, software engineering today still depends on the craftsmanship of highly skilled workers. With constantly increasing complexity and effort, however, existing software engineering approaches appear more and more inefficient, and a paradigm shift towards industrial production methods seems inevitable. Recent advances in academia and practice have made key industrial principles available in software development as well: specialization is represented in software product lines, standardization and systematic reuse are available with component-based development, and automation has become accessible through model-driven engineering. While each of the above is well researched in theory, only few cases of successful industrial implementation are known. This becomes even more evident in specialized areas of software engineering such as systems integration. Today's IT systems need to adapt quickly to new business requirements due to mergers, acquisitions, and cooperations between enterprises. This leads to integration efforts, i.e. joining different subsystems into a cohesive whole in order to provide new functionality. In such an environment, the application of industrial methods for software development seems even more important. Unfortunately, software development in this field is a highly complex and heterogeneous undertaking, as IT environments differ from customer to customer. In such settings, existing industrialization concepts would never break even, due to one-off projects and thus insufficient economies of scale and scope. This thesis therefore describes a novel approach for a more efficient implementation of the above key principles while considering the characteristics of software development for systems integration.
After identifying the characteristics of the field and their effects on currently known industrialization concepts, an organizational model for industrialized systems integration was developed. It takes software product lines and adapts them in a way feasible for a systems integrator active in several business domains. The result is a three-tiered model that consolidates recurring activities and reduces the effort for individual product lines. For the implementation of component-based development, the thesis assesses current component approaches and applies an integration metamodel to the most suitable one. This ensures a common understanding of systems integration across different product lines and thus facilitates component reuse, even across product-line boundaries. The approach is furthermore aligned with the organizational model to show how component-based development may be applied in industrialized systems integration. Automating software development in systems integration with model-driven engineering was found to be insufficient in its current state, owing to immature tool chains and a lack of modelling standards. As an alternative, an XML-based configuration of products within a software product line was developed. It models a product line and its products with the help of a domain-specific language and uses stylesheet transformations to generate compilable artefacts. The approach was tested for feasibility in an exemplary implementation following a real-world scenario. As not all aspects of industrialized systems integration could be simulated in a laboratory environment, the concept was furthermore validated in several expert interviews with industry representatives, where cultural and economic aspects could also be assessed.
The thesis concludes with a detailed summary of the contributions to the field and suggests further areas of research in the context of industrialized systems integration.
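The XML-based product configuration described above can be illustrated with a minimal stdlib sketch, where plain Python stands in for the stylesheet transformations and every element and attribute name is hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-product configuration within a product line.
CONFIG = """
<product name="billing-integration">
  <feature id="adapter-sap" enabled="true"/>
  <feature id="adapter-csv" enabled="false"/>
</product>
"""

def generate_artefact(xml_text):
    # Emit one component registration per enabled feature, mimicking
    # the transformation step that produces compilable artefacts.
    root = ET.fromstring(xml_text)
    lines = ["// generated for product: " + root.get("name")]
    for feature in root.findall("feature"):
        if feature.get("enabled") == "true":
            lines.append('register_component("%s");' % feature.get("id"))
    return "\n".join(lines)
```

In the thesis's setting an XSLT stylesheet plays the role of `generate_artefact`, so that each product of the line is derived from its configuration rather than hand-written.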
|