1

Abstract satisfaction

Haller, Leopold Carl Robert January 2013 (has links)
This dissertation shows that satisfiability procedures are abstract interpreters. This insight provides a unified view of program analysis and satisfiability solving and enables technology transfer between the two fields. The framework underlying these developments provides systematic recipes that show how intuition from satisfiability solvers can be lifted to program analyzers, how approximation techniques from program analyzers can be integrated into satisfiability procedures and how program analyzers and satisfiability solvers can be combined. Based on this work, we have developed new tools for checking program correctness and for solving satisfiability of quantifier-free first-order formulas. These tools outperform existing approaches. We introduce abstract satisfaction, an algebraic framework for applying abstract interpretation to obtain sound, but potentially incomplete satisfiability procedures. The framework allows the operation of satisfiability procedures to be understood in terms of fixed point computations involving deduction and abduction transformers on lattices. It also enables satisfiability solving and program correctness to be viewed as the same algebraic problem. Using abstract satisfaction, we show that a number of satisfiability procedures can be understood as abstract interpreters, including Boolean constraint propagation, the DPLL and CDCL algorithms, Stålmarck's procedure, the DPLL(T) framework and solvers based on congruence closure and the Bellman-Ford algorithm. Our work leads to a novel understanding of satisfiability architectures as refinement procedures for abstract analyses and allows us to relate these procedures to independent developments in program analysis. We use this perspective to develop Abstract Conflict-Driven Clause Learning (ACDCL), a rigorous, lattice-based generalization of CDCL, the central algorithm of modern satisfiability research. The ACDCL framework provides a solution to the open problem of lifting CDCL to new problem domains and can be instantiated over many lattices that occur in practice. We provide soundness and completeness arguments for ACDCL that apply to all such instantiations. We evaluate the effectiveness of ACDCL by investigating two practical instantiations: FP-ACDCL, a satisfiability procedure for the first-order theory of floating point arithmetic, and CDFPL, an interval-based program analyzer that uses CDCL-style learning to improve the precision of a program analysis. FP-ACDCL is faster than competing approaches in 80% of our benchmarks and it is faster by more than an order of magnitude in 60% of the benchmarks. Out of 33 safe programs, CDFPL proves 16 more programs correct than a mature interval analysis tool and can conclusively determine the presence of errors in 24 unsafe benchmarks. Compared to bounded model checking, CDFPL is on average at least 260 times faster on our benchmark set.
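To make the opening claim concrete, the following is a minimal sketch, assuming a DIMACS-style clause encoding (positive and negative integers as literals), of Boolean constraint propagation viewed as a fixed point computation of a deduction transformer over the partial-assignment lattice. It is an illustration of the general idea only, not code from the dissertation.

```python
# Unit propagation as a monotone deduction transformer iterated to a fixed point
# over the lattice of partial assignments (per-variable values: 0, 1, or unknown).
TOP = "?"              # unknown: the top of the per-variable lattice
CONFLICT = "conflict"  # bottom element: no satisfying completion exists

def bcp(clauses, assignment):
    """Refine a partial assignment {var: 0/1/TOP} until no clause adds information."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment[abs(l)] == (1 if l > 0 else 0) for l in clause):
                continue  # clause already satisfied under the partial assignment
            unassigned = [l for l in clause if assignment[abs(l)] == TOP]
            if not unassigned:
                return CONFLICT           # every literal is false
            if len(unassigned) == 1:      # unit clause: deduce the forced value
                l = unassigned[0]
                assignment[abs(l)] = 1 if l > 0 else 0
                changed = True
    return assignment

# (x1) and (not x1 or x2): propagation deduces x1 = 1, then x2 = 1.
print(bcp([[1], [-1, 2]], {1: TOP, 2: TOP}))
```

Replacing the per-variable domain {0, 1, unknown} with a richer lattice, intervals for example, gives the flavour of the generalization the abstract describes.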
2

Abstraction discovery and refinement for model checking by symbolic trajectory evaluation

Adams, Sara Elisabeth January 2014 (has links)
This dissertation documents two contributions to automating the formal verification of hardware – particularly memory-intensive circuits – by Symbolic Trajectory Evaluation (STE), a model checking technique based on symbolic simulation over abstract sets of states. The contributions focus on improvements to the use of BDD-based STE, which uses binary decision diagrams internally. We introduce a solution to one of the major hurdles in using STE: finding suitable abstractions. Our work has produced the first known algorithm that addresses this problem by automatically discovering good, non-trivial abstractions. These abstractions are computed from the specification, and essentially encode partial input combinations sufficient for determining the specification's output value. They can then be used to verify whether the hardware model meets its specification using a technique based on and significantly extending previous work by Melham and Jones [2]. Moreover, we prove that our algorithm delivers correct results by construction. We demonstrate that the abstractions produced by our algorithm can greatly reduce verification costs with three example hardware designs, typical of the kind of problems faced by the semiconductor design industry. We further propose a refinement method for abstraction schemes when over-abstraction occurs, i.e., when the abstraction hides too much information of the original design to determine whether it meets its specification. The refinement algorithm we present is based on previous work by Chockler et al. [3], which selects refinement candidates by approximating which abstracted input is likely the biggest cause of the abstraction being unsuitable. We extend this work substantially, concentrating on three aspects. First, we suggest how the approach can also work for much more general abstraction schemes. This enables refining any abstraction allowed in STE, rather than just a subset. Second, Chockler et al. describe how to refine an abstraction once a refinement candidate has been identified. We present three additional variants of refining the abstraction. Third, the refinement at its core depends on evaluating circuit logic gates. The previous work offered solutions for NOT- and AND-gates. We propose a general approach to evaluating arbitrary logic gates, which improves the selection process of refinement candidates. We show the effectiveness of our work by automatically refining an abstraction for a content-addressable memory that exhibits over-abstraction, and by evaluating some common logic gates. These two contributions can be used independently to help automate hardware verification by STE, but they also complement each other. To show this, we combine both algorithms to create a fully automatic abstraction discovery and refinement loop. The only inputs required are the hardware design and the specification, which the design should meet. While only small circuits could be verified completely automatically, this clearly shows that our two contributions allow the construction of a verification framework that does not require any user interaction.
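As a hedged illustration of "partial input combinations sufficient for determining the specification's output value", the sketch below enumerates, for a toy 2-to-1 multiplexer specification, the minimal partial input assignments that already fix the output. The specification, variable names and brute-force enumeration are invented for the example and are not taken from the dissertation.

```python
from itertools import combinations, product

def mux(env):
    """Toy specification: a 2-to-1 multiplexer, out = a if sel else b."""
    return env["a"] if env["sel"] else env["b"]

def determining_assignments(spec, inputs):
    """Minimal partial assignments under which every completion gives the same output."""
    found = []
    for r in range(1, len(inputs) + 1):
        for names in combinations(inputs, r):
            for values in product([0, 1], repeat=r):
                partial = dict(zip(names, values))
                rest = [v for v in inputs if v not in partial]
                outputs = {spec({**partial, **dict(zip(rest, completion))})
                           for completion in product([0, 1], repeat=len(rest))}
                # keep only determining assignments with no smaller one already found
                if len(outputs) == 1 and not any(f.items() <= partial.items() for f in found):
                    found.append(partial)
    return found

for p in determining_assignments(mux, ["sel", "a", "b"]):
    print(p)
# e.g. {'sel': 1, 'a': 0}, {'sel': 0, 'b': 1}, {'a': 0, 'b': 0}, ...
```

For a real design such enumeration is hopeless, which is why the abstract emphasises computing the abstractions directly from the specification.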
3

Proposta de integração entre tecnologias adaptativas e algoritmos genéticos. / Proposal for integration of adaptive technology and genetic algorithms.

Lopes, Victor Dias 03 April 2009 (has links)
This work is an initial study of the integration of two computer engineering areas: adaptive technologies and genetic algorithms. To that end, genetic algorithms were applied to the inference of adaptive automata. Several techniques were studied and proposed for the implementation of the algorithm, always seeking increasingly satisfactory results. Both technologies, genetic algorithms and adaptive technology, are strongly adaptive in character, yet they differ considerably in how they are implemented and executed. The inferences proposed in this work were carried out successfully, so the techniques described can be employed in tools that assist designers of such devices. Such tools may prove useful given the complexity involved in developing an adaptive automaton. By applying genetic algorithms in this way, and by observing how the automata evolved during the experiments, we believe we gained a better understanding of the structure and operation of adaptive automata and of how these two important technologies can be combined.
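Since the abstract describes applying a genetic algorithm to infer automata, here is a minimal, hypothetical sketch of the general recipe: individuals encode a transition table and a set of accepting states, fitness is agreement with a labeled sample, and evolution proceeds by selection and mutation. Plain deterministic finite automata and the toy target language ("contains at least one b") stand in for the adaptive automata studied in the thesis; none of this code is from the work itself.

```python
import random

ALPHABET = "ab"
N_STATES = 4

# Labeled sample over {a, b}: accept exactly the strings containing at least
# one 'b' (a toy target language, not taken from the thesis).
SAMPLES = [("", False), ("a", False), ("b", True), ("aa", False),
           ("ab", True), ("ba", True), ("aab", True), ("aaa", False)]

def random_individual():
    """An individual is a DFA: a transition table plus a set of accepting states."""
    table = {(q, s): random.randrange(N_STATES) for q in range(N_STATES) for s in ALPHABET}
    accepting = {q for q in range(N_STATES) if random.random() < 0.5}
    return table, accepting

def accepts(individual, word):
    table, accepting = individual
    state = 0
    for symbol in word:
        state = table[(state, symbol)]
    return state in accepting

def fitness(individual):
    """Fraction of the labeled sample classified correctly."""
    return sum(accepts(individual, w) == label for w, label in SAMPLES) / len(SAMPLES)

def mutate(individual):
    table, accepting = dict(individual[0]), set(individual[1])
    key = random.choice(list(table))
    table[key] = random.randrange(N_STATES)       # rewire one transition
    if random.random() < 0.3:                     # occasionally toggle one accepting state
        accepting ^= {random.randrange(N_STATES)}
    return table, accepting

def evolve(pop_size=30, generations=200):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == 1.0:
            break
        parents = population[: pop_size // 2]     # truncation selection
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

best = evolve()
print("best fitness:", fitness(best))
```

The thesis evolves adaptive automata, whose transitions can modify the automaton itself, so the real encoding and operators are considerably richer than this toy loop.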
4

Knowledge building in software developer communities

Zagalsky, Alexey 07 September 2018 (has links)
Software development has become a cognitive and collaborative knowledge-based endeavor where developers and organizations, faced with a variety of challenges and an increased demand for extensive knowledge support, push the boundaries of existing tools and work practices. Researchers and industry professionals have spent years studying collaborative work and communication media; however, the landscape of social media is rapidly changing. Thus, instead of trying to model the use of specific technologies and communication media, I seek to model the knowledge-building process itself. Doing so allows us to understand not only the use of specific tools and communication media, but whole ecosystems of technologies and their impact on software development and knowledge work, revealing aspects unique to individual tools as well as aspects of the combination of technologies. In this dissertation, I describe the empirical studies I conducted to understand social and communication media use in software development and knowledge curation within developer communities. An important part of the thesis is an additional qualitative meta-synthesis of these studies. My meta-analysis has led to a model of software development as a knowledge-building process and to a theoretical framework: I describe this newly formed framework and how it is grounded in empirical work, and demonstrate how my primary studies led to its creation. My conceptualization of knowledge building within software development and the proposed framework provide the research community with the means to pursue a deeper understanding of software development and contemporary knowledge work. I believe that this framework can serve as a basis for a theory of knowledge building in software development, shedding light on knowledge flow, knowledge productivity, and knowledge management.
5

Model Checking Systems with Replicated Components using CSP

Mazur, Tomasz Krzysztof January 2011 (has links)
The Parameterised Model Checking Problem (PMCP) asks whether an implementation Impl(t) satisfies a specification Spec(t) for all instantiations of parameter t. In general, t can determine numerous entities: the number of processes used in a network, the type of data, the capacities of buffers, etc. The main theme of this thesis is automation of uniform verification of a subclass of PMCP with the parameter of the first kind, using techniques based on counter abstraction. Counter abstraction works by counting how many, rather than which, node processes are in a given state: for nodes with k local states, an abstract state (c(1), ..., c(k)) models a global state where c(i) processes are in the i-th state. We then use a threshold function z to cap the values of each counter. If for some i, counter c(i) reaches its threshold z(i), then this is interpreted as there being z(i) or more nodes in the i-th state. The addition of thresholds makes abstract models independent of the instantiation of the parameter. We adapt standard counter abstraction techniques to concurrent reactive systems modelled using the CSP process algebra. We demonstrate how to produce abstract models of systems that do not use node identifiers (i.e. where all nodes are indistinguishable). Every such abstraction is, by construction, refined by all instantiations of the implementation. If the abstract model satisfies the specification, then a positive answer to the particular uniform verification problem can be deduced. We show that by adding node identifiers we make the uniform verification problem undecidable. We demonstrate a sound abstraction method that extends standard counter abstraction techniques to systems that make full use of node identifiers (in specifications and implementations). However, on its own, the method is not enough to give the answer to verification problems for all parameter instantiations. This issue has led us to the development of a type reduction theory, which, for a given verification problem, establishes a function phi that maps all (sufficiently large) instantiations T of the parameter to some fixed type T and allows us to deduce that if Spec(T) is refined by phi(Impl(T)), then Spec(T) is refined by Impl(T). We can then combine this with our extended counter abstraction techniques and conclude that if the abstract model satisfies Spec(T), then the answer to the uniform verification problem is positive. We develop a symbolic operational semantics for CSP processes that satisfy certain normality requirements and we provide a set of translation rules that allow us to concretise symbolic transition graphs. The type reduction theory relies heavily on these results. One of the main advantages of our symbolic operational semantics and the type reduction theory is their generality, which makes them applicable in other settings and allows the theory to be combined with abstraction methods other than those used in this thesis. Finally, we present TomCAT, a tool that automates the construction of counter abstraction models and we demonstrate how our results apply in practice.
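A minimal sketch of the counter abstraction just described, with hypothetical encodings chosen only for the example (local states numbered 0..k-1, the threshold function z given as a Python callable):

```python
from collections import Counter

def counter_abstraction(global_state, k, z):
    """Map a global state (list of node local states, each in range(k)) to its
    abstract counter vector (c(1), ..., c(k)), capping each counter at its
    threshold z(i); a capped counter reads 'z(i) or more nodes in state i'."""
    counts = Counter(global_state)
    return tuple(min(counts[i], z(i)) for i in range(k))

# Example: nodes with 3 local states and a threshold of 2 for every state.
# Concrete states with 3 or with 2 nodes in local state 0 collapse to the
# same abstract state, which is what makes the abstraction parameter-independent.
print(counter_abstraction([0, 0, 0, 1, 2], k=3, z=lambda i: 2))   # -> (2, 1, 1)
print(counter_abstraction([0, 0, 1, 2],    k=3, z=lambda i: 2))   # -> (2, 1, 1)
```

Because every concrete state with z(i) or more nodes in state i maps to the same abstract state, the abstract model no longer depends on how many nodes were instantiated.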
6

Synthesis and alternating automata over real time

Jenkins, Mark Daniel January 2012 (has links)
Alternating timed automata are a powerful extension of classical Alur-Dill timed automata that are closed under all Boolean operations. They have played a key role, among others, in providing verification algorithms for prominent specification formalisms such as Metric Temporal Logic. Unfortunately, when interpreted over an infinite dense time domain (such as the reals), alternating timed automata have an undecidable language emptiness problem. In this thesis we consider restrictions on this model that restore the decidability of the language emptiness problem. We consider the restricted class of safety alternating timed automata, which can encode a corresponding Safety fragment of Metric Temporal Logic. This thesis connects these two formalisms with insertion channel machines, a model of faulty communication, and demonstrates that the three formalisms are interreducible. We thus prove a non-elementary lower bound for the language emptiness problem for 1-clock safety alternating timed automata and further obtain a new proof of decidability for this problem. Complementing the restriction to safety properties, we consider interpreting the automata over bounded dense time domains. We prove that the time-bounded language emptiness problem is decidable but non-elementary for unrestricted alternating timed automata. The language emptiness problem for alternating timed automata is a special case of a much more general and abstract logical problem: Church's synthesis problem. Given a logical specification S(I,O), Church's problem is to determine whether there exists an operator F that implements the specification in the sense that S(I,F(I)) holds for all inputs I. It is a classical result that the synthesis problem is decidable in the case that the specification and implementation are given in monadic second-order logic over the naturals. We prove that this decidability extends to MSO over the reals with order and furthermore to MSO over every fixed bounded interval of the reals with order and the +1 relation.
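For reference, the quantifier structure of Church's problem as described above can be written as follows (a restatement of the prose; S, I and F are as in the abstract):

```latex
% Church's synthesis problem for a specification S(I, O):
% does some operator F satisfy the specification on every input?
\[
  \exists F \;\forall I .\; S\bigl(I, F(I)\bigr)
\]
```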
7

Verification of asynchronous concurrency and the shaped stack constraint

Kochems, Jonathan Antonius January 2014 (has links)
In this dissertation, we study the verification of concurrent programs written in the programming language Erlang using infinite-state model-checking. Erlang is a widely used, higher order, dynamically typed, call-by-value functional language with algebraic data types and pattern-matching. It is further augmented with support for actor concurrency, i.e. asynchronous message passing and dynamic process creation. With decidable model-checking in mind, we identify actor communicating systems (ACS) as a suitable target model for an abstract interpretation of Erlang. ACS model a dynamic network of finite-state processes that communicate over a fixed, finite number of unordered, unbounded channels. Thanks to being equivalent to Petri nets, ACS enjoy good algorithmic properties. We develop a verification procedure that extracts a sound abstract model, in the form of an ACS, from a given Erlang program; the resulting ACS simulates the operational semantics of the input. Using this abstract model, we can conservatively verify coverability properties of the input program, i.e. a weak form of safety properties, with a Petri net model-checker. We have implemented this procedure in our tool Soter, which is the first sound verification tool for Erlang programs using infinite-state model-checking. In our experiments, we find that Soter is accurate enough to verify a range of interesting and non-trivial benchmarks. Even though ACS coverability is EXPSPACE-complete, Soter's analysis of these verification problems is surprisingly quick. In order to improve the precision of our verification procedure with respect to recursion, we investigate an extension of ACS that allows pushdown processes: asynchronously communicating pushdown systems (ACPS). ACPS that satisfy the empty-stack constraint (a pushdown process may receive only when its stack is empty) are a popular subclass of ACPS with good decision and complexity properties. In the context of Erlang, the empty-stack constraint is unfortunately not realistic. We introduce a relaxation of the empty-stack constraint for ACPS called the shaped stack constraint. Stacks that fit the shape constraint may reach arbitrary heights. Further, a process may execute any communication action (be it process creation, message send or retrieval) whether or not its stack is empty. We prove that coverability for shaped ACPS, i.e. ACPS that satisfy the shaped constraint, reduces to the decidable coverability problem for well-structured transition systems (WSTS). Thus, shaped ACPS enable the modelling and verification of a larger class of message passing programs. We establish a close connection between shaped ACPS and a novel extension of Petri nets: nets with nested coloured tokens (NNCT). Tokens in NNCT are of two types: simple and complex. Complex tokens carry an arbitrary number of coloured tokens. The rules of a NNCT can synchronise complex and simple tokens, inject coloured tokens into a complex token, and eject all tokens of a specified set of active colours to predefined places. We show that the coverability problem for NNCT is TOWER-complete, a new complexity class for non-elementary decision problems introduced by Schmitz. To prove TOWER membership, we devise a geometrically inspired version of the Rackoff technique, and we obtain TOWER hardness by adapting Stockmeyer's ruler construction to NNCT.
To our knowledge, NNCT is the first extension of Petri nets (belonging to the class of nets with an infinite set of token types) that is proven to have primitive recursive coverability. This result implies TOWER-completeness of coverability for ACPS that satisfy the shaped stack constraint.
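To make the ACS and Petri net connection above concrete, here is a small, hypothetical sketch in which process local states and (channel, message) pairs are Petri-net places, a configuration is a multiset of tokens, and each rule fires by removing and adding tokens. The place names and rules are invented for the illustration and are not taken from the dissertation or from Soter.

```python
from collections import Counter

# Hypothetical client/server ACS encoded as a Petri-net-style vector addition system.
RULES = [
    # (consumed, produced): multisets of places
    ({"init": 1},                      {"waiting": 1, ("ch", "req"): 1}),  # client sends a request
    ({"server": 1, ("ch", "req"): 1},  {"server": 1, ("ch", "ack"): 1}),   # server answers it
    ({"waiting": 1, ("ch", "ack"): 1}, {"done": 1}),                       # client receives the answer
    ({"init": 1},                      {"init": 2}),                       # a client spawns another client
]

def fire(config, rule):
    """Fire a rule if it is enabled, returning the successor configuration (or None)."""
    consumed, produced = rule
    if any(config[place] < n for place, n in consumed.items()):
        return None
    successor = Counter(config)
    successor.subtract(consumed)
    successor.update(produced)
    return +successor   # unary + drops zero counts

config = Counter({"init": 2, "server": 1})
config = fire(config, RULES[0])   # one client sends a request
config = fire(config, RULES[1])   # the server acknowledges it
print(fire(config, RULES[2]))     # that client finishes
```

Coverability then asks whether some configuration with, say, a token on a designated error place can be reached; Soter discharges such queries with a Petri net model-checker, as described above.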
