1

The algebra of open and interconnected systems

Fong, Brendan. January 2016.
Herein we develop category-theoretic tools for understanding network-style diagrammatic languages. The archetypal network-style diagrammatic language is that of electric circuits; other examples include signal flow graphs, Markov processes, automata, Petri nets, chemical reaction networks, and so on. The key feature is that the language comprises a number of components with multiple (input/output) terminals, each possibly labelled with some type, that may then be connected together along these terminals to form a larger network. The components form hyperedges between labelled vertices, and so a diagram in this language forms a hypergraph. We formalise the compositional structure by introducing the notion of a hypergraph category. Network-style diagrammatic languages and their semantics thus form hypergraph categories, and semantic interpretation gives a hypergraph functor. The first part of this thesis develops the theory of hypergraph categories. In particular, we introduce the tools of decorated cospans and corelations. Decorated cospans allow straightforward construction of hypergraph categories from diagrammatic languages: the inputs, outputs, and their composition are modelled by the cospans, while the 'decorations' specify the components themselves. Not all hypergraph categories, however, can be constructed through decorated cospans. Decorated corelations are a more powerful version that permits construction of all hypergraph categories and hypergraph functors. They are often useful for constructing the semantic categories of diagrammatic languages and functors from diagrams to their semantics. To illustrate these principles, the second part of this thesis details applications to linear time-invariant dynamical systems and passive linear networks.
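For orientation, the central gadget can be written down compactly (standard definitions, sketched here as a reading aid rather than as the thesis's full development): a cospan from $X$ to $Y$ is a pair of maps into a common apex,

$$X \xrightarrow{\;i\;} N \xleftarrow{\;o\;} Y,$$

and two cospans $X \to N \leftarrow Y$ and $Y \to M \leftarrow Z$ compose by taking the pushout $N +_Y M$ over the shared foot $Y$. A decorated cospan additionally equips the apex $N$ with a decoration - for circuits, the components wired to the terminals in $N$ - so that the cospan legs handle interconnection while the decoration carries the components.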
2

Models and theories of lambda calculus

Manzonetto, Giulio. 18 February 2008.
In this thesis we are mainly interested in the principal semantics of the λ-calculus (that is, Scott's continuous semantics, the stable semantics, and the strongly stable semantics), but we also introduce and study two new semantics: the relational semantics and the indecomposable semantics. Models of the pure λ-calculus can be defined either as reflexive objects in Cartesian closed categories (categorical models) or as combinatory algebras satisfying Curry's five axioms and the Meyer-Scott axiom (λ-models). Concerning categorical models, we show that every categorical model can be presented as a λ-model, even when the underlying ccc (Cartesian closed category) does not have enough points, and we give sufficient conditions for a categorical model living in an arbitrary cpo-enriched ccc to have H as its equational theory. We construct a categorical model living in a ccc of sets and relations (relational semantics) that satisfies these conditions. Moreover, we show that the associated λ-model enjoys algebraic properties that make it suitable for modelling non-deterministic extensions of the λ-calculus. Concerning combinatory algebras, we show that they satisfy a generalization of the Stone Representation Theorem stating that every combinatory algebra is isomorphic to a weak Boolean product of directly indecomposable combinatory algebras. We study the semantics of the λ-calculus whose models are directly indecomposable as combinatory algebras (the indecomposable semantics); in particular we prove that this semantics is general enough to include, on the one hand, the three principal semantics and, on the other, the term models of all semi-sensible λ-theories. However, we also show that it is largely incomplete. Finally, we study the question of the existence of a non-syntactic model of the λ-calculus belonging to the principal semantics and having an r.e. (recursively enumerable) equational or inequational theory. This question is a natural generalization of the problem of Honsell and Ronchi Della Rocca (open for more than twenty years) concerning the existence of a continuous model of λβ or λβη. We introduce an adequate notion of effective models of the λ-calculus, which covers in particular all the models that have been introduced individually in the literature, and we prove that the inequational theory of an effective model is never r.e.; consequently its equational theory cannot be λβ or λβη. We also show that the equational theory of an effective model living in the stable or strongly stable semantics is never r.e. Concerning Scott's continuous semantics, we prove that the inequational theory of a graph model is never r.e. and that there exist many effective graph models whose equational theory is not r.e.
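For orientation (standard definitions from the combinatory-algebra literature, not specific to this thesis): a combinatory algebra is an applicative structure with distinguished elements $k$ and $s$ satisfying

$$k\,x\,y = x, \qquad s\,x\,y\,z = (x\,z)(y\,z),$$

and the Meyer-Scott axiom is the weak-extensionality condition, stated with $\varepsilon = s(k\,i)$ where $i = s\,k\,k$ (so that $\varepsilon\,a\,b = a\,b$):

$$\forall x\,(a\,x = b\,x) \;\Longrightarrow\; \varepsilon\,a = \varepsilon\,b.$$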
3

Logics of Communication and Knowledge

Sietsma, Floor. 12 December 2012.
The goal of this dissertation is to give a logical representation of the knowledge dynamics that take place during communication. I present a number of different logical frameworks for a number of different scenarios, ranging from an email conversation where all information that is sent is considered to be true, to a game of Liar's Dice where lying is expected of the players. In Chapter 3, I present a framework for modeling the knowledge of agents who exchange messages, using Dynamic Epistemic Logic. This framework uses Kripke models to represent the agents' knowledge in a static situation, and action models to update these Kripke models when the situation changes. Because the models are supposed to be finite, and all messages are represented explicitly in the model, the messages that are considered possible by the agents are limited to a finite set. This framework is useful in a situation in which there are a number of rounds, in each of which a finite set of new messages becomes available to the agents. These messages can gradually be added to the model. The framework presented in Chapter 4 is of a more general nature. It models a setting where agents communicate with messages over a specific network in accordance with a certain protocol. This framework is very flexible because the nature of communicative events and the observational power of the agents can be adapted to the situation at hand. It combines properties of the Dynamic Epistemic Logic approach with the perspective of Interpreted Systems. In Chapters 5 and 6 I focus on email communication specifically. I first study the existence of common knowledge in a group of agents who communicate via emails. Unlike in the approach presented in Chapter 3, all possible emails are represented in the model, which is therefore of infinite size. I prove a number of properties of finite states in this infinite model, and show that common knowledge of an email with BCC recipients is rare. Apart from common knowledge, I consider two new kinds of knowledge: potential and definitive knowledge. These two types of knowledge make a distinction between the assumption that every agent immediately reads every email he receives, and the assumption that every agent has only read the emails he replied to or forwarded. I also present a method to do model checking, even though the model is of infinite size. Chapter 7 is a study of the properties of action models, which are used to model communicative events. I define a notion of action emulation that signifies when two canonical action models are equivalent. Because every action model has an equivalent canonical action model which can be computed, this gives a general method to determine action model equivalence. In Chapter 8 I move from knowledge to belief. I use the same Kripke models as for knowledge, only without the assumption that all relations are equivalence relations. I propose a different assumption, namely that the relations are linked. I also give a number of updates of these models that preserve this property, representing communicative events. Finally, Chapter 9 gives different perspectives on the issue of lying. It includes a complete logic of manipulative updating, which can be used to represent the effects of lying in a group of agents. I also analyze a game of Liar's Dice and implement this scenario in the model checker DEMO. Furthermore, I show that in a game where lying is considered normal, a lie is no longer a lie: because the agents who hear the lie do not believe it, no false belief is created.
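As a rough illustration of the Kripke-model-plus-action-model machinery described above, here is a minimal sketch of the standard DEL product update (illustrative names and toy interfaces, not the dissertation's actual framework):

    from itertools import product

    class KripkeModel:
        """Worlds, per-agent accessibility relations, and a valuation."""
        def __init__(self, worlds, relations, valuation):
            self.worlds = worlds        # e.g. {"w0", "w1"}
            self.relations = relations  # agent -> set of (world, world) pairs
            self.valuation = valuation  # world -> set of atoms true there

    class ActionModel:
        """Actions with preconditions, per-agent relations on actions."""
        def __init__(self, actions, relations, precondition):
            self.actions = actions            # e.g. {"send", "skip"}
            self.relations = relations        # agent -> set of (action, action) pairs
            self.precondition = precondition  # action -> predicate on worlds

    def product_update(m, a):
        """Keep the (world, action) pairs whose precondition holds; a pair
        sees another iff both the world and action components do."""
        worlds = {(w, act) for w, act in product(m.worlds, a.actions)
                  if a.precondition[act](w)}
        relations = {
            ag: {(p, q) for p, q in product(worlds, worlds)
                 if (p[0], q[0]) in m.relations[ag]
                 and (p[1], q[1]) in a.relations[ag]}
            for ag in m.relations}
        valuation = {p: m.valuation[p[0]] for p in worlds}
        return KripkeModel(worlds, relations, valuation)

A public announcement of a formula is the special case of a one-action model whose single precondition is that formula; a private email corresponds, roughly, to an action model in which only the recipients can distinguish the 'send' action from a no-op.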
4

A Generic Approach for Automated Verification of Product Line Models

Mazo, Raul. 24 November 2011.
This thesis explores the subject of automatic verification of product line models. The approach is based on the hypothesis that, to automatically verify product line models, they should first be transformed into a language that makes them computable. In this thesis, product line models are transformed into constraint (logic) programs, then verified against a typology of verification criteria. The typology enumerates, classifies and formalizes a collection of generic verification criteria, i.e. criteria that can be applied (with or without adaptation) to any product line formalism. The typology distinguishes two categories of criteria: criteria that deal with the formalism in which models are represented, and criteria that are formalism-independent. To identify defects of the first kind, the thesis proposes a conformance checking approach directly related to verification of the abstract syntactic aspects of a model. To identify defects of the second kind, the thesis proposes a domain-specific verification approach. An optimal algorithm is specified and implemented as a constraint logic program for each criterion in the typology. These can be used independently, or in combination, to verify individual product line models. The thesis also supports the verification of multiple product line models through an integration approach, and proposes a series of integration strategies that can be applied before verification, just as for individual models. The product line verification approach proposed in this thesis is generic in the sense that it can be reused for any kind of product line model that instantiates the generic metamodel on which it was developed. It is general in the sense that it supports the verification of a comprehensive collection of criteria defined in the typology. This approach was implemented in a prototype tool that supports the specification, transformation, integration, configuration, analysis and verification of product line models via constraint (logic) programming. A benchmark gathering a corpus of 54 product line models was developed, then used in a series of experiments. The experiments showed that (i) the implementation of the domain-specific verification approach is fast and scalable to product line models of up to 2000 artefacts; (ii) the implementation of the conformance checking approach is fast and scalable to product line models of up to 10000 artefacts; and (iii) both approaches are correct and useful for industrial-size models.
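To give a flavour of the transformation step, here is a hypothetical toy example (illustrative only, not the thesis's actual tool chain): a small feature model read as boolean constraints and checked for 'voidness', i.e. whether it admits any valid configuration.

    from itertools import product

    # Toy feature model: car has a mandatory subfeature engine; gas and
    # electric form an alternative (xor) group under engine.
    features = ["car", "engine", "gas", "electric"]

    def valid(cfg):
        """cfg maps feature name -> bool; encodes the model's constraints."""
        return (cfg["car"]                                  # root always selected
                and (not cfg["car"] or cfg["engine"])       # mandatory subfeature
                and cfg["gas"] + cfg["electric"] == cfg["engine"])  # xor group

    # Brute force is fine for toy models; the thesis transforms models into
    # constraint (logic) programs precisely so that realistic ones scale.
    configs = [dict(zip(features, bits))
               for bits in product([False, True], repeat=len(features))]
    print(sum(map(valid, configs)), "valid configuration(s)")  # 0 would mean void

A void-model check of this kind is one instance of the formalism-independent criteria mentioned above; conformance checking, by contrast, inspects the model against its metamodel before any such semantic analysis.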
5

Développement modulaire de théories et gestion de l'espace de nom pour l'assistant de preuve Coq.

Soubiran, Elie. 27 September 2010.
This thesis presents work on the module system of the Coq proof assistant.
6

Complexité Implicite de Lambda-Calculs Concurrents

Madet, Antoine. 06 December 2012.
Controlling the resource consumption of computer programs is of paramount importance, not only for performance reasons, but also for security concerns, for instance when mobile or embedded systems have only limited amounts of resources at their disposal. In this thesis, we develop static criteria for controlling the resource consumption of higher-order concurrent programs. Our starting point is the framework of Light Logics, which has been studied as a means of controlling the complexity of higher-order functional programs through the proofs-as-programs correspondence. The contribution of this thesis is to extend this framework to higher-order concurrent programs. More generally, this thesis belongs to the field of implicit computational complexity, which seeks to characterize complexity classes by logical principles or language restrictions. The criteria we propose are purely syntactic and are developed gradually so as to control the computation time of programs ever more finely: we first show how to guarantee the termination of programs (finite time), then how to guarantee termination in elementary time, and finally how to guarantee termination in polynomial time. We also introduce type systems such that well-typed programs terminate in bounded time and return values. Finally, we show that these type systems capture interesting concurrent programs that iterate functions producing side effects on inductive data structures. In the last part, we study an alternative, semantic method for controlling the resource consumption of higher-order imperative programs. This method is based on the quantitative realizability of Dal Lago and Hofmann and yields several complexity bounds in a uniform way. This last part is joint work with Aloïs Brunel.
7

Development and verification of probability logics and logical frameworks

Maksimovic, Petar. 15 October 2013.
The research for this thesis has followed two main paths: that of probability logics and that of type systems and logical frameworks, bringing them together through interactive theorem proving. With the development of computer technology and the need to capture real-world dynamics, situations, and problems, reasoning under uncertainty has become one of the more important research topics of today, and probability logics are among the tools for formalizing this kind of knowledge. Given that probability logics, serving as decision-making or decision-support systems, often form a basis for expert systems applied in fields such as game theory or medicine, their correct functioning is of great importance, and formal verification of their properties adds a further level of security to the design process. On the other hand, in the field of logical frameworks and interactive theorem proving, attention has been directed towards a more natural way of encoding formal systems whose derivation rules are subject to side conditions. Such side conditions are either rather difficult or impossible to encode naively in the Edinburgh Logical Framework LF, or any other type-theory-based logical framework, due to their inherent limitations, or because the formal systems in question need to access the derivation context, the structure of the derivation itself, or other structures and mechanisms not available at the object level. The first part of the thesis deals with the development and formal verification of probability logics. First, we introduce a Probability Logic with Conditional Operators - LPCP - together with its syntax, semantics, and a sound and strongly complete axiomatic system featuring an infinitary inference rule. We prove the obtained formalism decidable, and extend it so as to represent evidence, making it the first propositional axiomatization of reasoning about evidence. Next, we show how to encode the probability logics LQI and LQnI in the proof assistant Coq. Both of these logics extend classical logic with modal-like probability operators, and both feature an infinitary inference rule; LQI allows iterations of probability operators, while LQnI does not. We proceed to formally verify their key properties - soundness, strong completeness, and non-compactness. In this way, we formally justify the use of probabilistic SAT-solvers for checking consistency-related questions. In the second part of the thesis, we present LFP - a Logical Framework with External Predicates - introducing a mechanism for locking and unlocking types and terms into LF that allows the use of external oracles. We prove that LFP satisfies all of the main meta-theoretical properties (strong normalization, confluence, subject reduction, decidability of type checking). We develop a corresponding canonical framework, allowing for easy proofs of encoding adequacy. We provide a number of encodings - the simple untyped lambda-calculus with a call-by-value reduction strategy, the Design-by-Contract paradigm, a small imperative language with Hoare Logic, Modal Logics in Hilbert and Natural Deduction style, and Non-Commutative Linear Logic (encoded for the first time in an LF-like framework) - illustrating that in LFP we can encode side conditions on the application of rules elegantly, and achieve a separation between derivation and computation, resulting in cleaner and more readable proofs. We believe that the results presented in this thesis can serve as a foundation for fruitful future research.
On the one hand, the obtained formal correctness proofs add a further level of security to expert systems built using the verified logics, and pave the way for the formal verification of other probability logics. On the other hand, there is room for further improvement, extension, and deeper analysis of the LFP framework, as well as for building a prototype interactive theorem prover based on LFP and discovering its place in the world of proof assistants.
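For context, logics in this family are typically axiomatized with an archimedean-style infinitary rule of roughly the following shape (an illustration from the probability-logic literature; the exact rules of LPCP, LQI, and LQnI may differ):

$$\frac{\{\,A \to P_{\geq s-\frac{1}{k}}\,\alpha \;:\; k \in \mathbb{N},\; k \geq 1/s\,\}}{A \to P_{\geq s}\,\alpha}$$

A rule with countably many premises is what makes strong completeness attainable for these logics even though, as verified in Coq here, they are not compact.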
8

Développement et Vérification des Logiques Probabilistes et des Cadres Logiques

Maksimovic, Petar. 15 October 2013.
We present a Probability Logic with Conditional Operators - LPCP - together with its syntax, semantics, and a sound and strongly complete axiomatization featuring an infinitary inference rule. We prove LPCP decidable and extend it so that it can represent evidence, thereby obtaining the first propositional axiomatization of reasoning about evidence. We encode the probability logics LPP1Q and LPP2Q in the Coq proof assistant and formally verify their main properties: soundness, strong completeness, and non-compactness. Both logics extend classical logic with probability operators and feature an infinitary inference rule; LPP1Q allows iterations of probability operators, whereas LPP2Q does not. We formally justify the use of probabilistic SAT-solvers for checking consistency-related questions. We present LFP, a Logical Framework with External Predicates, introducing a mechanism for locking and unlocking types and terms in LF that permits the use of external oracles. We show that LFP satisfies all the main meta-theoretical properties, and we develop a corresponding canonical framework that allows adequacy proofs. We provide various encodings - the untyped λ-calculus with a call-by-value reduction strategy, Design-by-Contract, an imperative language with Hoare Logic, Modal Logics, and Non-Commutative Linear Logic - showing that in LFP side conditions on the application of typing rules can be encoded easily, achieving a separation between verification and computation and yielding clearer, more readable proofs.
9

Optimizing and implementing repair programs for consistent query answering in databases

Caniupán, Mónica. January 1900.
Thesis (Ph.D.) - Carleton University, 2007. Includes bibliographical references (p. 220-226). Also available in electronic format on the Internet.
10

PrologPF : parallel logic and functions on the Delphi machine

Lewis, Ian. January 1998.
PrologPF is a parallelising compiler targeting a distributed system of general-purpose workstations connected by a relatively low-performance network. The source language extends standard Prolog with the integration of higher-order functions. The execution of a compiled PrologPF program proceeds in a similar manner to standard Prolog, but uses oracles in one of two modes. An oracle represents the sequence of clauses used to reach a given point in the problem search tree, and the same PrologPF executable can be used to build oracles, or to follow oracles previously generated. The parallelisation strategy used by PrologPF proceeds in two phases, which this research shows can be interleaved. An initial phase searches the problem tree to a limited depth, recording the discovered incomplete paths. In the second phase these paths are allocated to the available processors in the network. Each processor follows its assigned paths and fully searches the referenced subtree, sending solutions back to a control processor. This research investigates the use of the technique with a one-time partitioning of the problem and no further scheduling communication, and with the recursive application of the partitioning technique to effect dynamic work reassignment. For a problem requiring all solutions to be found, execution completes when all the distributed processors have completed the search of their assigned subtrees. If one solution is required, the execution of all the path processors is terminated when the control processor receives the first solution. The presence of the extra-logical Prolog predicate cut in the user program conflicts with the use of oracles to represent valid open subtrees. PrologPF promotes the use of higher-order functional programming as an alternative to the use of cut. The combined language shows that functional support can be added as a consistent extension to standard Prolog.
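To make the oracle mechanism concrete, here is a schematic sketch (with illustrative toy interfaces `children` and `is_solution`; not PrologPF itself):

    def build_oracles(node, children, depth, path=()):
        """Phase one: explore to a limited depth and record each open
        path as an oracle, i.e. the sequence of choice indices taken."""
        if depth == 0:
            return [path]
        oracles = []
        for i, child in enumerate(children(node)):
            oracles.extend(build_oracles(child, children, depth - 1, path + (i,)))
        return oracles

    def follow_oracle(root, children, is_solution, oracle):
        """Phase two, run on one worker: replay the oracle deterministically,
        then fully search the referenced subtree, collecting solutions."""
        node = root
        for i in oracle:
            node = children(node)[i]
        solutions, stack = [], [node]
        while stack:
            n = stack.pop()
            if is_solution(n):
                solutions.append(n)
            stack.extend(children(n))
        return solutions

In the real system a single executable runs in both modes, and recursively reapplying phase one inside a worker's subtree is what yields the dynamic work reassignment described above.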
