41

IMPERATIVE MODELS TO DECLARATIVE CONSTRAINTS : Generating Control-Flow Constraints from Business Process Models

Bergman Thörn, Arvid January 2023 (has links)
In complex information systems, it is often crucial to evaluate whether a sequence of activities obtained from a system log complies with behavioural rules. This process of evaluation is called conformance checking, and the most classical approach to specifying the behavioural rules is in the form of flowchart-like process diagrams, e.g., in the Business Process Model and Notation (BPMN) language. Traditionally, control-flow constraints are extracted using Petri net replay-based approaches. However, with industrial process query languages such as Signavio Analytics Language (SIGNAL) that allow for temporal row matching, the possibility of performing conformance checking using temporal constraints opens up. To this end, this thesis presents a parser for extracting control-flow objects from BPMN-based business process models and a compiler for generating both linear temporal logic-like rules and SIGNAL queries. The parser succeeds at parsing all industry models and most academic models; the exceptions in the latter case can presumably be traced back to edge cases and unidiomatic modelling. The constraints generated by the compiler are in some, but not all, cases identical to constraints extracted via Petri net replay as an intermediate step, indicating differences in the formal interpretation of BPMN control flow. In conclusion, the implementation and evaluation of the parser and compiler indicate that it is feasible to move directly from business-user-oriented process models to declarative, query-language-based constraints, cutting out the Petri net replay middleman and hence facilitating elegant and more efficient process data querying.
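As an informal illustration of the idea (not the thesis's actual compiler), a single BPMN-style sequence flow A → B can be read as a Declare-style "response" rule — whenever A occurs, B must occur later — and replayed over a log trace. The activity names, the `response` helper, and the example traces below are invented for this sketch.

```python
# Illustrative sketch: compile a BPMN-style sequence flow into a Declare-like
# "response" constraint and replay it over a log trace.

def response(antecedent, consequent):
    """Compile one control-flow edge into a checkable constraint."""
    def check(trace):
        # Every occurrence of `antecedent` must be followed by `consequent`.
        for i, activity in enumerate(trace):
            if activity == antecedent and consequent not in trace[i + 1:]:
                return False
        return True
    return check

# Hypothetical sequence flows extracted from a BPMN model by a parser.
flows = [("Receive order", "Check stock"), ("Check stock", "Ship order")]
constraints = [response(src, tgt) for src, tgt in flows]

conforming = ["Receive order", "Check stock", "Ship order"]
violating = ["Receive order", "Ship order"]  # stock never checked

print(all(c(conforming) for c in constraints))  # True
print(all(c(violating) for c in constraints))   # False
```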
42

Breakdown Voltage Performances of Aluminum and Copper Conductor Wire Under Compression Stresses

Suchanek, Richard Donald, III 01 April 2016 (has links)
In the global, competitive market of energy transformation, increased operational expenses and depletion of raw materials have led companies to pursue alternative materials to reduce consumer costs. In electrical applications, energy is transformed using materials with high electrical conductivity. The conductive material used to transmit a signal is called conductor wire and consists of any material able to move charged particles from one point to another without propagation delay. In many applications the conductor wire is encapsulated in an epoxy resin called enamel. The enamel is the insulation system that provides the dielectric clearances necessary to prevent voltage leakage. The most common form of energy transformation is the electric motor. Both copper and aluminum conductor wire are commonly used in electric motors, but copper is preferred due to its thermal and electrical properties. However, there is a significant economic incentive to convert to aluminum conductor wire. Few white papers are available comparing the performance of the two materials; the existing research is limited to the physical and electrical performance of the raw material and does not take the insulation into consideration. The conductor wire, which includes the insulation, is susceptible to damage during the manufacturing process, an inherent risk if not fully understood. During the blocking process, the conductor wire is pushed and compressed into lamination slots. This process changes the conductor wire's outer diameter to accommodate void spaces within the lamination slots. The percentage of slot area occupied by the conductor wire is known as slot fill. The higher the slot fill, the more wire occupies the available space and the more force is required to fill the slots. High-slot-fill motor designs provide a performance advantage at little associated cost: the more wire pressed into the slot, the higher the potential efficiency gains. However, high-slot-fill motors are more susceptible to damage. This study is designed to evaluate and measure the durability of aluminum and copper conductor wire under simulated compression stresses. With this information, electric motor manufacturers can push current design limits without affecting conductor wire quality and reliability.
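The slot-fill figure the abstract defines is a simple ratio of conductor cross-section to slot area. The sketch below only illustrates that arithmetic; the wire gauge, conductor count, and slot area are invented numbers, not values from the study.

```python
import math

def slot_fill(wire_diameter_mm, conductors_per_slot, slot_area_mm2):
    """Slot fill = total conductor cross-sectional area / available slot area."""
    wire_area = math.pi * (wire_diameter_mm / 2) ** 2
    return (wire_area * conductors_per_slot) / slot_area_mm2

# Hypothetical example: 40 strands of 1.0 mm enamelled wire in a 50 mm^2 slot.
fill = slot_fill(wire_diameter_mm=1.0, conductors_per_slot=40, slot_area_mm2=50.0)
print(f"slot fill = {fill:.0%}")  # ~63%: higher fill needs more insertion force
```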
43

Validation fonctionnelle de contrôleurs logiques : contribution au test de conformité et à l'analyse en boucle fermée / Functional validation of logic controllers : contribution to conformance test and closed-loop analysis

Guignard, Anaïs 04 December 2014 (has links)
The results presented in this PhD thesis deal with the functional validation of logic controllers using conformance testing and closed-loop validation techniques. The specification model is written in the industrial Grafcet language and the logic controller is assumed to be a single-task Programmable Logic Controller (PLC). In order to contribute to these validation techniques, this thesis presents: - An extension of a formalization method for Grafcet based on translation into a Mealy machine. This extension produces a formal model of the specification when the Grafcet is interpreted without search for stability, a mode of interpretation that is not recommended by the standard IEC 60848 but is widely used in industrial applications. - A contribution to conformance testing through the definition of a set of conformance relations based on the observation of several execution cycles for each test step. - A contribution to closed-loop validation through the definition of an observation termination criterion and a gray-box identification technique for constructing and analysing the closed-loop system.
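The conformance relations mentioned above compare the outputs a PLC produces over several scan cycles with those of a Mealy-machine model of the specification. The sketch below is a hypothetical, minimal illustration of that comparison; the machine, its transition table, and the observed cycles are invented and do not come from the thesis.

```python
# Illustrative sketch: a toy Mealy-machine specification and a conformance
# check that, for each test step (input), observes several execution cycles
# and accepts as soon as the expected output appears.

SPEC = {  # (state, input) -> (next_state, output)
    ("s0", "start"): ("s1", "motor_on"),
    ("s1", "stop"):  ("s0", "motor_off"),
}

def conforms(test_step, observed_cycles, state, max_cycles=3):
    """The expected output must appear within `max_cycles` scan cycles."""
    next_state, expected = SPEC[(state, test_step)]
    for output in observed_cycles[:max_cycles]:
        if output == expected:
            return True, next_state
    return False, next_state

# Hypothetical PLC observation: the output stabilises on the second cycle.
ok, state = conforms("start", ["motor_off", "motor_on"], state="s0")
print(ok)  # True under this relaxed, multi-cycle conformance relation
```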
44

A symbolic-based passive testing approach to detect vulnerabilities in networking systems / [Une approche symbolique basée sur des tests passifs pour détecter les vulnérabilités des systèmes réseaux]

Mouttappa, Pramila 16 December 2013 (has links)
Due to the increasing complexity of reactive systems, testing has become an important part of the development process of such systems. Conformance testing with formal methods refers to checking the functional correctness, by means of testing, of a black-box system under test with respect to a formal system specification, i.e., a specification given in a language with a formal semantics. In this setting, passive testing techniques are used when the implementation under test cannot be disturbed or the system interface is not provided. Passive testing techniques are based on the observation and verification of properties of the behavior of a system without interfering with its normal operation; they also help to detect abnormal behavior in the implementation under test by observing any deviation from the predefined behavior. The main objective of this thesis is to present a new approach to passive testing based on the analysis of both the control and the data parts of the system under test. During the last decades, many theories and tools have been developed to perform conformance testing. In these theories, the specifications or properties of reactive systems are often modeled by different variants of Labeled Transition Systems (LTS). However, these methodologies do not explicitly take into account the system's data, since the underlying LTS model is not able to do so. Hence, the values of the data must be enumerated before modeling the system, which often results in the state-space explosion problem. To overcome this limitation, we have studied a model called Input-Output Symbolic Transition Systems (IOSTS), which explicitly includes all the data of a reactive system. Many passive testing techniques consider only the control part of the system and neglect data, or are confronted with an overwhelming amount of data values to process. In our approach, we consider both the control and data parts by integrating the concepts of symbolic execution, and we improve trace analysis by introducing trace slicing techniques. Properties are described using IOSTSs, and we illustrate how they can be tested on real execution traces while optimizing the trace analysis. These properties can be designed to test the functional conformance of a protocol as well as security properties. Beyond the theoretical approach, we have developed a software tool that implements the algorithms presented in this thesis. Finally, as a proof of concept of our approach and tool, we have applied the techniques to two real-life case studies: the SIP and Bluetooth protocols.
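The control-plus-data idea can be pictured on a toy example: slice a message trace down to the events a property mentions, then check a guard over both the message types and their data fields. The property, field names, and trace below are hypothetical; the thesis itself works on IOSTS models and real SIP/Bluetooth traces.

```python
# Illustrative sketch (not the thesis's tool): slice a trace to the events a
# property mentions, then check a guard over control (message type) and
# data (parameter values).

trace = [  # hypothetical protocol trace: (message, data fields)
    ("INVITE",    {"cseq": 1}),
    ("KEEPALIVE", {}),                  # irrelevant to the property: sliced out
    ("200_OK",    {"cseq": 1}),
    ("INVITE",    {"cseq": 2}),
    ("200_OK",    {"cseq": 3}),         # data violation: cseq does not match
]

RELEVANT = {"INVITE", "200_OK"}

def check_response_property(trace):
    """Every INVITE must be answered by a 200_OK carrying the same cseq."""
    sliced = [(m, d) for m, d in trace if m in RELEVANT]
    pending = {}                        # cseq -> awaiting answer
    for msg, data in sliced:
        if msg == "INVITE":
            pending[data["cseq"]] = True
        elif msg == "200_OK":
            pending.pop(data["cseq"], None)
    return not pending                  # all requests answered with matching data

print(check_response_property(trace))   # False: cseq 2 is never answered
```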
45

Model-Based Test Case Generation for Real-Time Systems

Hessel, Anders January 2007 (has links)
Testing is the dominant verification technique used in the software industry today. The use of automatic test case execution increases, but the creation of test cases remains manual and thus error prone and expensive. To automate generation and selection of test cases, model-based testing techniques have been suggested. In this thesis two central problems in model-based testing are addressed: the problem of how to formally specify coverage criteria, and the problem of how to generate a test suite from a formal timed system model, such that the test suite satisfies a given coverage criterion. We use model checking techniques to explore the state-space of a model until a set of traces is found that together satisfy the coverage criterion. A key observation is that a coverage criterion can be viewed as consisting of a set of items, which we call coverage items. Each coverage item can be treated as a separate reachability problem. Based on our view of coverage items we define a language, in the form of parameterized observer automata, to formally describe coverage criteria. We show that the language is expressive enough to describe a variety of common coverage criteria described in the literature. Two algorithms for test case generation with observer automata are presented. The first algorithm returns a trace that satisfies all coverage items with a minimum cost. We use this algorithm to generate a test suite with minimal execution time. The second algorithm explores only states that may increase the already found set of coverage items. This algorithm works well together with observer automata. The developed techniques have been implemented in the tool CoVer. The tool has been used in a case study together with Ericsson where a WAP gateway has been tested. The case study shows that the techniques have industrial strength.
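The view of a coverage criterion as a set of independently reachable coverage items can be sketched in a few lines. In the hypothetical example below, an observer records which specification edges a suite of traces visits; the edges and traces are invented, and the real CoVer tool works on timed automata models rather than plain state names.

```python
# Illustrative sketch: edge coverage expressed as a set of coverage items,
# each of which is reached independently by some trace in the suite.

SPEC_EDGES = {("idle", "req"), ("req", "grant"), ("req", "deny")}  # coverage items

def covered_items(trace):
    """Observer collecting the edges (consecutive state pairs) a trace visits."""
    return {edge for edge in zip(trace, trace[1:]) if edge in SPEC_EDGES}

test_suite = [
    ["idle", "req", "grant"],   # reaches two coverage items
    ["idle", "req", "deny"],    # reaches the remaining one
]

reached = set().union(*(covered_items(t) for t in test_suite))
print(reached == SPEC_EDGES)    # True: the suite satisfies edge coverage
```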
46

A symbolic approach for the verification and the test of service choreographies

Nguyễn, Huu Nghia (Hữu Nghĩa) 31 October 2013 (has links) (PDF)
Service-oriented engineering is an emerging software development paradigm for distributed collaborative applications. Such an application is made up of several entities abstracted as services, each of them being for example a Web application, a Web service, or even a human. The services can be developed independently and are composed to achieve common requirements through interactions among them. Service choreographies define such requirements from a global perspective, based on interactions among a set of participants. This thesis aims to formalize the related problems and to develop a framework with which service choreographies can be developed correctly, for both top-down and bottom-up approaches. It consists in analyzing the relation between a choreography specification and a choreography implementation at both the model level and the real implementation level. In particular, it addresses service composition/decomposition design, verification, and the testing of choreography implementations. The first key point of our framework is the support of value-passing among services by using symbolic techniques and an SMT solver. It overcomes the false negatives and state-space explosion issues caused by abstracting or limiting the data domain of value-passing in existing approaches. The second key point is the black-box passive testing of choreography implementations. It requires neither access to source code nor making the implementation unavailable during the testing process. Our framework is fully implemented in our toolchain, which can be downloaded or used online at http://schora.lri.fr.
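One very reduced way to picture the specification/implementation relation for a choreography is to ask whether an observed conversation follows the globally specified interaction order. The sketch below checks only a prefix of a single fixed order; it is a hypothetical simplification and says nothing about the symbolic, value-passing analysis the thesis actually performs.

```python
# Illustrative sketch: does an observed conversation follow the interaction
# order fixed by a (toy) choreography specification?

CHOREOGRAPHY = ["buyer->seller:quote_request",
                "seller->buyer:quote",
                "buyer->seller:accept"]

def respects_order(observed, spec):
    """True if `observed` is a prefix of the specified interaction order."""
    position = 0
    for interaction in observed:
        if position < len(spec) and interaction == spec[position]:
            position += 1
        else:
            return False            # unexpected or out-of-order interaction
    return True

print(respects_order(CHOREOGRAPHY[:2], CHOREOGRAPHY))          # True
print(respects_order(["buyer->seller:accept"], CHOREOGRAPHY))  # False
```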
48

Verificação de conformidade entre diagramas de sequência UML e código Java / Verification of conformance between UML sequence diagrams and Java code

RABELO JÚNIOR, Sebastião Estefânio Pinto. 02 September 2018 (has links)
Currently, sequence diagrams are the most popular UML diagrams used to describe behavioral aspects of software systems. On the other hand, Java is one of the most widely used object-oriented languages in the world. Despite that, there is no systematic approach to support verification of conformance between a behavioral design and the implemented source code. In this work, we propose an approach to verify this conformance. The use of this approach will help developers, architects, and engineers to keep the software documentation updated. Its usage allows the development team and managers to detect behavioral design implementation defects. We also present the tool support built for our approach using Model Driven Architecture (MDA) and a preliminary evaluation of this work.
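The essence of such a conformance check can be pictured as comparing the message order drawn in a sequence diagram with the call order actually recorded from the running code. The sketch below is hypothetical: the diagram, the trace, and the class/method names are invented, and the thesis's actual approach relies on MDA-generated conformance tests rather than this direct comparison.

```python
# Illustrative sketch: compare the ordered messages of a (toy) sequence
# diagram with a call trace recorded from the running implementation.

diagram = [  # (caller, callee, method) in the order the diagram draws them
    ("Cart", "PricingService", "computeTotal"),
    ("Cart", "PaymentGateway", "charge"),
]

observed_trace = [
    ("Cart", "PaymentGateway", "charge"),        # executed before pricing
    ("Cart", "PricingService", "computeTotal"),
]

def conformance_violations(diagram, trace):
    """Report diagram messages missing or out of order in the observed trace."""
    violations, cursor = [], 0
    for message in diagram:
        try:
            cursor = trace.index(message, cursor) + 1
        except ValueError:
            violations.append(message)
    return violations

print(conformance_violations(diagram, observed_trace))
# [('Cart', 'PaymentGateway', 'charge')]: observed earlier than the diagram allows
```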
49

Checagem de conformidade arquitetural na modernização orientada a arquitetura / [Architectural conformance checking in architecture-driven modernization]

Chagas, Fernando Bezerra 03 March 2016 (has links)
Architecture-Driven Modernization (ADM) is a model-based initiative for standardizing reengineering processes. Its most important meta-model is KDM (Knowledge Discovery Metamodel), a platform- and language-independent ISO standard. An important step in Architecture-Driven Modernization is Architectural Conformance Checking (ACC), whose goal is to identify violations between the Planned Architecture (PA) and the Current Architecture (CA) of a system. Although there are ACC approaches that act on source code or proprietary models, there is none for systems represented in KDM. This absence hinders the dissemination of ADM and increases the interest in research that investigates the suitability of KDM in this context. Therefore, in this work we present ArchKDM, a KDM-based ACC approach that relies exclusively on the KDM meta-model to represent i) the legacy system under analysis; ii) the PA; iii) the CA; and iv) the violations between them. ArchKDM is composed of three tool-supported steps: 1) specifying the planned architecture; 2) extracting the current architecture; and 3) performing the checking. Our goal is to investigate the suitability of KDM as the main representation in all ACC steps, as well as to deliver an ACC approach in the ADM context. We evaluated steps 2 and 3 of the approach using two real-world systems, and the results showed no false positives or false negatives.
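The planned-versus-current comparison at the heart of ACC can be illustrated independently of KDM. In the hypothetical sketch below, the planned architecture is a set of allowed module dependencies and the current architecture is the set of dependencies extracted from the system; anything present in the code but absent from the plan is a divergence. Module names and rules are invented for the example.

```python
# Illustrative sketch of architectural conformance checking: compare allowed
# (planned) module dependencies with the dependencies extracted from code.

PLANNED = {               # planned architecture: module -> modules it may use
    "view":       {"controller"},
    "controller": {"model"},
    "model":      set(),
}

CURRENT = {               # current architecture, e.g. extracted from a KDM model
    ("view", "controller"),
    ("controller", "model"),
    ("view", "model"),     # not planned: a divergence
}

def violations(planned, current):
    """Divergences: dependencies present in the code but absent from the plan."""
    return sorted(dep for dep in current if dep[1] not in planned.get(dep[0], set()))

print(violations(PLANNED, CURRENT))   # [('view', 'model')]
```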
50

Test de conformité de contrôleurs logiques spécifiés en grafcet / Conformance test of logic controllers from Grafcet specification

Provost, Julien 08 July 2011 (has links)
The work presented in this PhD thesis deals with the generation and execution of test sequences for the conformance testing of logic controllers. In this work, Grafcet (IEC 60848 (2002)), a graphical specification language used in industry, has been selected as the specification model. The logic controllers mainly considered are Programmable Logic Controllers (PLCs). In order to validate the execution of conformance tests of critical control systems, this thesis presents: - A formalization of the Grafcet specification language. To apply the usual verification and validation methods, the behaviour must be expressed through formal models. However, in industry, the models used to describe functional specifications are chosen for their expressive power and usability, but these models rarely have a formal semantics. - A study of test sequence execution and an analysis of the verdicts obtained when several logical inputs are changed simultaneously. A series of experiments made it possible to quantify, for different configurations of the implementation under test, the rate of erroneous verdicts due to these simultaneous changes. - A definition of the SIC-testability criterion of an implementation. This criterion, determined from the Grafcet specification, defines the ability of an implementation to be tested without any erroneous verdict. The automatic generation of test sequences that minimize the risk of erroneous verdicts is then studied.
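The erroneous verdicts discussed above arise when a test step flips several PLC inputs at once and the controller samples them part-way through the change. A common workaround, sketched hypothetically below, is to decompose each multi-input test step into a sequence of single-input-change (SIC) steps; the input vectors are invented and the decomposition order is arbitrary.

```python
# Illustrative sketch: split a test step that changes several logical inputs
# simultaneously into single-input-change (SIC) steps, one flipped bit each.

def sic_steps(current, target):
    """Yield intermediate input vectors, changing one input at a time."""
    step = dict(current)
    for name in sorted(k for k in target if target[k] != current.get(k)):
        step[name] = target[name]      # flip exactly one input
        yield dict(step)

before = {"start": 0, "mode_auto": 0, "reset": 1}
after  = {"start": 1, "mode_auto": 1, "reset": 0}

for vector in sic_steps(before, after):
    print(vector)
# Three SIC steps instead of one simultaneous change of three inputs, so the
# order in which the controller perceives the changes is fully controlled.
```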
