About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Moral knowledge - assessment of a perceptual paradigm

Sandøe, Peter January 1988 (has links)
No description available.
2

Exploiting structure for scalable software verification

Babić, Domagoj 11 1900 (has links)
Software bugs are expensive. Recent estimates by the US National Institute of Standards and Technology claim that the cost of software bugs to the US economy alone is approximately 60 billion USD annually. As society becomes increasingly software-dependent, bugs also reduce our productivity and threaten our safety and security. Decreasing these direct and indirect costs represents a significant research challenge as well as an opportunity for businesses. Automatic software bug-finding and verification tools have a potential to completely revolutionize the software engineering industry by improving reliability and decreasing development costs. Since software analysis is in general undecidable, automatic tools have to use various abstractions to make the analysis computationally tractable. Abstraction is a double-edged sword: coarse abstractions, in general, yield easier verification, but also less precise results. This thesis focuses on exploiting the structure of software for abstracting away irrelevant behavior. Programmers tend to organize code into objects and functions, which effectively represent natural abstraction boundaries. Humans use such structural abstractions to simplify their mental models of software and for constructing informal explanations of why a piece of code should work. A natural question to ask is: How can automatic bug-finding tools exploit the same natural abstractions? This thesis offers possible answers. More specifically, I present three novel ways to exploit structure at three different steps of the software analysis process. First, I show how symbolic execution can preserve the data-flow dependencies of the original code while constructing compact symbolic representations of programs. Second, I propose structural abstraction, which exploits the structure preserved by the symbolic execution. Structural abstraction solves a long-standing open problem --- scalable interprocedural path- and context-sensitive program analysis. Finally, I present an automatic tuning approach that exploits the fine-grained structural properties of software (namely, data- and control-dependency) for faster property checking. This novel approach resulted in a 500-fold speedup over the best previous techniques. Automatic tuning not only redefined the limits of automatic software analysis tools, but also has already found its way into other domains (like model checking), demonstrating the generality and applicability of this idea.
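As a rough illustration of the first contribution described in this abstract, building compact symbolic representations while preserving data-flow dependencies, the sketch below (our own minimal example, not the tool developed in the thesis; all names are hypothetical) symbolically executes straight-line assignments and hash-conses expression nodes, so shared subexpressions appear once in a DAG rather than being duplicated in a tree:

```python
# Minimal symbolic execution with hash-consing (illustrative sketch only;
# class and function names are hypothetical, not from the thesis).

class SymExec:
    def __init__(self):
        self.env = {}     # program variable -> symbolic expression node
        self.nodes = {}   # structural key -> unique shared node

    def mk(self, op, *args):
        key = (op,) + args
        return self.nodes.setdefault(key, key)   # reuse node if already built

    def var(self, name):
        return self.env.setdefault(name, self.mk("input", name))

    def assign(self, target, op, *operands):
        args = tuple(self.var(x) if isinstance(x, str) else self.mk("const", x)
                     for x in operands)
        self.env[target] = self.mk(op, *args)

# Symbolically execute:  t = x + y;  a = t * t;  b = a + 1
se = SymExec()
se.assign("t", "+", "x", "y")
se.assign("a", "*", "t", "t")
se.assign("b", "+", "a", 1)
print(len(se.nodes))   # 6 shared nodes instead of a duplicated expression tree
```

Because identical subterms map to the same node, downstream queries over b reuse the representation already built for a; the structural abstraction described in the abstract then exploits this kind of preserved structure at function boundaries.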
3

Schematic calculi for the analysis of decision procedures

Tushkanova, Elena 19 July 2013 (has links) (PDF)
In this thesis we address problems related to the verification of software-based systems. We are mostly interested in the (safe) design of decision procedures used in verification. In addition, we also consider a modularity problem for a modeling language used in the Why verification platform.

Many verification problems can be reduced to a satisfiability problem modulo theories (SMT). In order to build satisfiability procedures, Armando et al. proposed in 2001 an approach based on rewriting. This approach uses a general calculus for equational reasoning named paramodulation. In general, a fair and exhaustive application of the rules of the paramodulation calculus (PC) leads to a semi-decision procedure that halts on unsatisfiable inputs (the empty clause is then generated) but may diverge on satisfiable ones. Fortunately, it may also terminate for some theories of interest in verification, and thus it becomes a decision procedure. To reason about the paramodulation calculus, a schematic paramodulation calculus (SPC) has been studied, notably to automatically prove decidability of single theories and of their combinations. The advantage of SPC is that if it halts for one given abstract input, then PC halts for all the corresponding concrete inputs. More generally, SPC is an automated tool to check properties of PC such as termination, stable infiniteness and deduction completeness.

A major contribution of this thesis is a prototyping environment for designing and verifying decision procedures. This environment, based on the theoretical studies, is the first implementation of the schematic paramodulation calculus. It has been implemented from scratch on the firm basis provided by the Maude system, which is based on rewriting logic. We show that this prototype is very useful for deriving decidability and combinability of theories of practical interest in verification. It helps in testing new saturation strategies and in experimenting with new extensions of the original (schematic) paramodulation calculus.

This environment has been applied to the design of a schematic paramodulation calculus dedicated to the theory of Integer Offsets. This contribution is the first extension of the notion of schematic paramodulation to a built-in theory. This study has led to new automatic proof techniques that are different from those performed manually in the literature. The assumptions needed to apply our proof techniques are easy to satisfy for equational theories with counting operators. We illustrate our theoretical contribution on theories representing extensions of classical data structures such as lists and records.

We have also addressed the problem of modular specification of generic Java classes and methods. We propose extensions to the Krakatoa Modeling Language, a part of the Why platform for proving that a Java or C program is a correct implementation of some specification. The key features are the introduction of parametricity both for types and for theories, and an instantiation relation between theories. The proposed extensions are illustrated on two significant examples: the specification of the generic method for sorting arrays and of a generic hash map.

Both problems considered in this thesis are related to SMT solvers. Firstly, decision procedures are at the core of SMT solvers. Secondly, the Why platform extracts verification conditions from a source program annotated with specifications, and then transmits them to SMT solvers or proof assistants to check the program's correctness.
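For readers unfamiliar with the calculus referred to above, the core paramodulation inference rule is commonly presented as follows (a textbook formulation, not necessarily the exact variant used in the thesis); here $\sigma$ is a most general unifier and $L[l']$ is a literal containing a non-variable subterm $l'$:

```latex
\[
\frac{l \approx r \;\lor\; C \qquad\quad L[l'] \;\lor\; D}
     {\bigl(L[r] \;\lor\; C \;\lor\; D\bigr)\sigma}
\qquad \sigma = \mathrm{mgu}(l, l')
\]
```

A fair and exhaustive saturation with such rules derives the empty clause from any unsatisfiable input; the schematic calculus discussed above works on abstracted clauses, so one terminating abstract saturation certifies termination of the concrete calculus on all corresponding concrete inputs.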
4

A Decision Procedure for the WSkS Logic

Fiedor, Tomáš January 2014 (has links)
Various types of logics are often used as a means for the formal specification of systems. Weak monadic second-order logic of k successors (WSkS) is one of them; although it has considerable expressive power, it is still decidable. Even though the complexity of testing satisfiability of a WSkS formula is not even in the ELEMENTARY class, there are approaches based on deterministic automata, implemented e.g. in the MONA tool, that efficiently solve a restricted class of practical examples but do not work for others. This work extends the class of practically solvable examples by exploiting recently developed techniques for efficient manipulation of nondeterministic automata (such as testing language universality with an antichain-based approach) and proposes a new decision procedure for WSkS built on nondeterministic automata. The procedure has been implemented and, compared to the MONA tool, achieves results that are in some cases orders of magnitude better.
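As a general illustration of the antichain technique mentioned in this abstract (a standard algorithm sketch, not the thesis's implementation), universality of a nondeterministic finite automaton can be checked by exploring macrostates of the subset construction while keeping only subset-minimal ones; a reachable macrostate containing no accepting state witnesses a rejected word:

```python
from collections import deque

def nfa_universal(alphabet, delta, init, accepting):
    """Antichain-based NFA universality check (illustrative sketch).

    delta maps (state, symbol) to an iterable of successor states.
    Returns True iff the NFA accepts every word over the alphabet.
    """
    def post(macro, a):
        return frozenset(q for s in macro for q in delta.get((s, a), ()))

    antichain = []                     # subset-minimal macrostates seen so far
    work = deque()

    def add(macro):
        # Discard 'macro' if a smaller macrostate is already known;
        # otherwise insert it and drop any known supersets.
        if any(m <= macro for m in antichain):
            return False
        antichain[:] = [m for m in antichain if not macro <= m]
        antichain.append(macro)
        return True

    start = frozenset(init)
    add(start)
    work.append(start)
    while work:
        macro = work.popleft()
        if not (macro & frozenset(accepting)):   # rejecting macrostate reached:
            return False                         # some word is not accepted
        for a in alphabet:
            succ = post(macro, a)
            if add(succ):
                work.append(succ)
    return True

# Example: a one-state NFA accepting every word over {a, b}.
delta = {("q0", "a"): {"q0"}, ("q0", "b"): {"q0"}}
print(nfa_universal("ab", delta, {"q0"}, {"q0"}))   # True
```

In the WSkS setting the underlying automata are tree automata rather than word automata, but the same subsumption idea is what makes working directly with nondeterministic automata feasible.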
5

Strengthening the heart of an SMT-solver: Design and implementation of efficient decision procedures

Iguernelala, Mohamed 10 June 2013 (has links) (PDF)
This thesis tackles the problem of automatically proving the validity of mathematical formulas generated by program verification tools. In particular, it focuses on Satisfiability Modulo Theories (SMT): a young research topic that has seen great advances during the last decade. The solvers of this family have various applications in hardware design, program verification, model checking, etc. SMT solvers offer a good compromise between expressiveness and efficiency. They rely on a tight cooperation between a SAT solver and a combination of decision procedures for specific theories, such as the free theory of equality with uninterpreted symbols, linear arithmetic over integers and rationals, or the theory of arrays.

This thesis aims at improving the efficiency and the expressiveness of the Alt-Ergo SMT solver. For that, we designed a new decision procedure for the theory of linear integer arithmetic. This procedure is inspired by Fourier-Motzkin's method, but it uses a rational simplex to perform computations in practice. We have also designed a new combination framework, capable of reasoning in the union of the free theory of equality, the AC theory of associative and commutative symbols, and an arbitrary signature-disjoint Shostak theory. This framework is a modular and non-intrusive extension of the ground AC completion procedure with the given Shostak theory. In addition, we have extended Alt-Ergo with existing decision procedures to integrate additional interesting theories, such as the theory of enumerated data types and the theory of arrays. Finally, we have explored preprocessing techniques for formula simplification as well as the enhancement of Alt-Ergo's SAT solver.
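To illustrate the Fourier-Motzkin idea the abstract refers to (a generic sketch of the classical method, not Alt-Ergo's actual rational-simplex implementation), one elimination step pairs every lower bound on a variable with every upper bound:

```python
from fractions import Fraction

def fm_eliminate(constraints, var):
    """One Fourier-Motzkin elimination step (illustrative sketch).

    Each constraint is (coeffs, bound), read as  sum(coeffs[x] * x) <= bound.
    Returns a system over the remaining variables that is satisfiable over
    the rationals iff the input system is.
    """
    lowers, uppers, rest = [], [], []
    for coeffs, b in constraints:
        a = Fraction(coeffs.get(var, 0))
        if a > 0:
            uppers.append((coeffs, Fraction(b), a))   # var <= (b - rest) / a
        elif a < 0:
            lowers.append((coeffs, Fraction(b), a))   # var >= (b - rest) / a
        else:
            rest.append((coeffs, b))
    for lc, lb, la in lowers:            # pair every lower bound ...
        for uc, ub, ua in uppers:        # ... with every upper bound
            combo = {}
            for x in set(lc) | set(uc):
                if x != var:
                    combo[x] = (Fraction(lc.get(x, 0)) / -la
                                + Fraction(uc.get(x, 0)) / ua)
            rest.append((combo, lb / -la + ub / ua))
    return rest

# Example: x - y <= 0,  -x <= -3,  y <= 2  (i.e. 3 <= x <= y <= 2: unsat).
system = [({"x": 1, "y": -1}, 0), ({"x": -1}, -3), ({"y": 1}, 2)]
print(fm_eliminate(fm_eliminate(system, "x"), "y"))   # yields 0 <= -1
```

The number of constraints can blow up quadratically at each step, which is one reason the procedure described above keeps the Fourier-Motzkin flavour for deriving consequences but performs the actual computations with a simplex over the rationals.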
6

Program analysis with interpolants

Weissenbacher, Georg January 2010 (has links)
This dissertation discusses novel techniques for interpolation-based software model checking, an approximate method which uses Craig interpolation to compute invariants of programs. Our work addresses two aspects of program analyses based on model checking: verification (the construction of correctness proofs for programs) and falsification (the detection of counterexamples that violate the specification). In Hoare's calculus, a proof of correctness comprises assertions which establish that a program adheres to its specification. The principal challenge is to derive appropriate assertions and loop invariants. Contemporary software verification tools use Craig interpolation (as opposed to traditional predicate transformers such as the weakest precondition) to derive approximate assertions. The performance of the model checker is contingent on the Craig interpolants computed. We present novel interpolation techniques which provide the following advantages over existing methods. Firstly, the resulting interpolants are sound with respect to the bit-level semantics of programs, which is an improvement over interpolation systems that use linear arithmetic over the reals to approximate bit-vector arithmetic and/or do not support bit-level operations. Secondly, our interpolation systems afford us a choice of interpolants and enable us to fine-tune their logical strength and structure. In contrast, existing procedures are limited to a single ad-hoc choice of an interpolant. Interpolation-based verification tools are typically forced to refine an initial approximation repeatedly in order to achieve the accuracy required to establish or refute the correctness of a program. The detection of a counterexample containing a repetitive construct may necessitate one refinement step (involving the computation of additional interpolants) for each iteration of the loop. We present a heuristic that aims to avoid the repeated and computationally expensive construction of interpolants, thus enabling the detection of deeply buried defects such as buffer overflows. Finally, we present an implementation of our techniques and evaluate them on a set of standardised device driver and buffer overflow benchmarks.
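For readers unfamiliar with Craig interpolation, the following tiny example (ours, not taken from the dissertation) shows the shape of the objects involved: given a jointly unsatisfiable pair of formulas, an interpolant is expressed only over their shared symbols and plays the role of the approximate assertion propagated by the model checker.

```latex
% A and B are jointly unsatisfiable; y is their only shared variable.
\[
A \;=\; (x < 0) \wedge (y = x), \qquad B \;=\; (y > 0).
\]
% An interpolant I satisfies: A => I,  I /\ B is unsat,  vars(I) ⊆ vars(A) ∩ vars(B).
\[
I \;=\; (y < 0): \qquad A \Rightarrow I, \qquad I \wedge B \models \bot .
\]
```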
7

Procédures de décision pour des logiques modales d'actions, de ressources et de concurrence / Decision procedures for modal logics of actions, resources and concurrency

Boudou, Joseph 15 September 2016 (has links)
The concepts of action and resource are ubiquitous in computer science. The main characteristic of an action is to change the current state of the modeled system. An action may be the execution of an instruction in a program, the learning of a new fact, a concrete act of an autonomous agent, a spoken word or a planned task. The main characteristic of resources is to be divisible, for instance in order to be shared. Resources may be memory cells in a computer, performing agents, different meanings of a phrase, time intervals or access rights. Together, actions and resources often constitute the temporal and spatial dimensions of a modeled system. Consider for instance the instructions of a computer executed at memory cells, or a set of cooperating agents. We observe that in these cases an interesting modeling of concurrency arises from the combination of actions and resources: concurrent actions are actions performed simultaneously on disjoint parts of the available resources.

Modal logics have been successful in modeling both actions and resources. The relational semantics of a unary modality is a binary relation which gives access to another state from the current state; hence unary modalities are convenient for modeling actions. Similarly, the relational semantics of a binary modality is a ternary relation which gives access to two states from the current state. By interpreting these two states as substates of the current state, binary modalities allow states to be divided; hence binary modalities are convenient for modeling resources.

In this thesis, we study modal logics used to reason about actions, resources and concurrency. Specifically, we analyze the decidability and complexity of the satisfiability problem of these logics, i.e. deciding whether a given formula can be true in some model. We provide decision procedures to prove the decidability and state the complexity of these problems. Namely, we study modal logics with a binary modality used to reason about resources. We are particularly interested in the associativity property of the binary modality. This property is desirable since the separation of resources is usually associative too, but the associativity of a binary modality generally makes the logic undecidable. We propose in this thesis to constrain the valuation of propositional variables to make modal logics with an associative binary modality decidable.

The main part of the thesis is devoted to the study of variants of Propositional Dynamic Logic (PDL). These logics feature an infinite set of unary modalities representing actions, structured by operators such as sequential composition, iteration and non-deterministic choice. We first study branching-time variants of PDL and prove that the satisfiability problems of these logics have the same complexity as the corresponding branching-time temporal logics. Then we thoroughly study extensions of PDL with an operator for parallel composition of actions, called separating parallel composition, based on the semantics of binary modalities. This operator allows reasoning about resources in addition to actions, and the combination of actions and resources provides a convenient expression of concurrency. In particular, these logics can express situations of cooperation where some actions can only be executed in parallel with other actions. Finally, our main contribution is to prove that the complexity of the satisfiability problem of a practically useful variant of PDL with separating parallel composition is the same as that of plain PDL.
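To fix notation for readers new to PDL, the standard relational clauses for the program operators are recalled below, together with one common way of writing a clause for separating parallel composition; the latter is an assumption on our part (using a ternary separation relation R on states), not necessarily the exact definition adopted in the thesis.

```latex
% Standard PDL program constructs (relational semantics).
\[
R_{\alpha;\beta} = R_\alpha \circ R_\beta, \qquad
R_{\alpha\cup\beta} = R_\alpha \cup R_\beta, \qquad
R_{\alpha^{*}} = R_\alpha^{*} \ (\text{reflexive--transitive closure}).
\]
% Hedged clause for separating parallel composition: the current state is split
% by R into two substates that run alpha and beta independently, and the
% resulting substates are recombined into the final state.
\[
w \, R_{\alpha \parallel \beta} \, w' \iff
\exists u_1, u_2, v_1, v_2 :\;
R(w, u_1, u_2) \wedge u_1 R_\alpha v_1 \wedge u_2 R_\beta v_2 \wedge R(w', v_1, v_2).
\]
```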
