51 |
Jeux graphiques et théorie de la démonstration / Graphical games and proof theory / Hatat, Florian 23 October 2013 (has links)
This work is a contribution to game semantics for programming languages. We describe new methods used to define a game semantics for a lambda-calculus with continuations. Game semantics have been widely used to provide models for functional programming languages with references, using call-by-name or call-by-value, and for various fragments of linear logic. Yet some parts of these semantics remain highly subtle. This work deals mainly with the notion of innocence and with the combinatorics involved in composing innocent strategies, giving each an interpretation that relies on standard categorical constructions. We reformulate innocence in terms of Boolean presheaves over a given category of views. For this purpose we enrich the class of plays, adding morphisms between plays that go beyond the usual prefix ordering. We show how to compute the global behaviour of a strategy, i.e., its behaviour on every play, from the class of views it accepts, by taking a right Kan extension. Our composition of innocent strategies relies on the usual categorical notions of factorisation systems and polynomial functors. In our semantics, the interaction between two strategies is itself a strategy, whose internal moves must be hidden by a cut-elimination process; this step is given by a weakened version of factorisation systems. The composition of strategies itself draws on the theory of polynomial functors, which yields an almost systematic proof method for the essential properties, such as associativity of composition and the correctness of the semantics.
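In the usual Boolean-presheaf reading (a sketch of the standard formula only, not the thesis's exact construction, which uses the richer morphisms between plays just mentioned), a strategy $\sigma$ given on views extends along the inclusion $j$ of views into plays by the right Kan extension

$$(\mathrm{Ran}_{j}\,\sigma)(p) \;=\; \bigwedge_{f\,:\,p \to j(v)} \sigma(v),$$

so a play $p$ is accepted exactly when every view it maps to is accepted.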
|
52 |
Transformations for proof-graphs with cycle treatment augmented via geometric perspective techniques / Vaz Alves, Gleifer 31 January 2009 (has links)
This work rests on two fundamental concerns: (i) the study of normalization procedures for proof systems, especially for classical logic in natural deduction; and (ii) the investigation of geometric-perspective techniques applied to proof-theoretical properties. The specific motivation of the work lies mainly in the analysis of those works devoted to defining normalization techniques through geometric-perspective mechanisms, where geometric-perspective techniques bring the use of graphical and/or topological frameworks to represent formal proof systems and their properties. Accordingly, the first part of the document presents the use of topological techniques and frameworks to establish properties such as soundness criteria and the normalization of proof systems, while the second part is initially directed at describing some normalization approaches, mainly for classical logic in natural deduction. The remainder of the second part is devoted to the main goal of the work, i.e., developing a normalization procedure for the full set of operators of N-Graphs with the help of some geometric-perspective techniques. (The geometric-perspective techniques applied to the normalization of N-Graphs make no use of topological frameworks.) N-Graphs are a multiple-conclusion natural deduction proof system for classical propositional logic. N-Graphs have both logical and structural rules, cyclic structures are allowed, and derivations are represented as directed graphs. Indeed, the main feature of the normalization procedure presented here is a complete treatment of cyclic structures: classes of valid cycles, a soundness criterion, their properties and a dedicated algorithm for normalizing cycles in N-Graphs are all defined, built with the help of graphical frameworks. Moreover, the normalization mechanism is able to handle the different roles played by the ?/> operators. In addition, a direct proof of weak normalization for N-Graphs is presented, along with the subformula and separation properties.
|
53 |
Vérification de typage pour le lambda-Pi-Calcul Modulo : théorie et pratique / Typechecking in the lambda-Pi-Calculus Modulo : Theory and Practice / Saillard, Ronan 25 September 2015 (has links)
Automatic proof checking is about using a computer to check the validity of proofs of mathematical statements. Since this verification is purely computational, it offers a high degree of confidence. Therefore, it is particularly useful for checking that a critical piece of software, i.e., one whose malfunction may result in death or serious injury to people, loss or severe damage to equipment, or environmental harm, corresponds exactly to its specification. DEDUKTI is such a proof checker. It implements a type system, the lambda-Pi-Calculus Modulo, which is an extension of the dependently typed lambda-calculus with first-order rewrite rules. Through the Curry-Howard correspondence, DEDUKTI implements both a powerful programming language and an expressive logical system. Furthermore, this language is particularly well suited for encoding other proof systems. For instance, we can import into DEDUKTI theorems proved using other tools such as COQ, HOL or ZENON, a first step towards interoperability between these systems. The lambda-Pi-Calculus Modulo is a very expressive language. On the other hand, some fundamental properties, such as subject reduction (i.e., the stability of typing under reduction) and uniqueness of types, are not guaranteed in general and depend on the rewrite rules considered. Yet these properties are necessary for guaranteeing the coherence of the proof system, and also for proving the soundness and completeness of the type-checking algorithms implemented in DEDUKTI. Unfortunately, they are undecidable. In this thesis, we design new decidable criteria for subject reduction and uniqueness of types, so that they can be implemented in DEDUKTI. For this purpose, we give a new definition of the lambda-Pi-Calculus Modulo that accounts for the iterative addition of rewrite rules by making them explicit in the typing context. A detailed study of this new system shows that the problems of subject reduction and uniqueness of types can be reduced to two simpler properties, which we call product compatibility and well-typedness of rewrite rules. We therefore study these two properties separately and give effective sufficient conditions for them to hold. These ideas have been implemented in DEDUKTI, greatly increasing its generality and reliability.
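The key typing rule of the lambda-Pi-Calculus Modulo, in one standard presentation (a sketch following Cousineau and Dowek, not necessarily DEDUKTI's exact formulation), is conversion modulo both beta-reduction and the declared rewrite rules $\mathcal{R}$:

$$\frac{\Gamma \vdash t : A \qquad \Gamma \vdash B : s \qquad A \equiv_{\beta\mathcal{R}} B}{\Gamma \vdash t : B}$$

where $\equiv_{\beta\mathcal{R}}$ is the congruence generated by beta and $\mathcal{R}$; subject reduction and uniqueness of types hinge on which rules $\mathcal{R}$ contains.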
|
54 |
[en] AN EXPERIMENTAL APPROACH ON MINIMAL IMPLICATIONAL NATURAL DEDUCTION PROOFS COMPRESSION / [pt] UMA ABORDAGEM EXPERIMENTAL SOBRE A COMPRESSÃO DE PROVAS EM DEDUÇÃO NATURAL MINIMAL IMPLICACIONAL / JOSE FLAVIO CAVALCANTE BARROS JUNIOR 26 March 2020 (has links)
[en] The size of formal proofs has important theoretical implications in computational complexity theory. The problem of determining whether a formula of Intuitionistic Propositional Logic, or of the purely implicational fragment of Minimal Logic (M⊃), is a tautology is PSPACE-complete. For any propositional logic with a natural deduction system satisfying the subformula principle, the tautology problem is in PSPACE. Whether every tautology in M⊃ admits a proof of polynomially bounded size is related to whether NP = PSPACE. Proof compression techniques reported in the literature take two main approaches: generating proofs that are already compressed, and compressing an already generated proof. Proposed by Gordeev and Haeusler (6), Horizontal Compression is a compression technique for natural deduction proofs in M⊃ that uses directed graphs to represent proofs. Given the proof of an M⊃ tautology, which may be exponentially large with respect to the size of its conclusion, the goal of Horizontal Compression is that the compressed proof be of size polynomially bounded in the size of the conclusion. Our work presents the first implementation of Horizontal Compression, together with the first results of running the technique on proofs of M⊃ tautologies, and compares the compression rates obtained with those of traditional data-compression techniques.
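The abstract does not spell out the compression algorithm itself. As a rough illustration of the general idea of turning a tree-shaped proof into a directed graph by sharing repeated subproofs, here is a minimal hash-consing sketch in Python; the names Proof, compress and tree_size are hypothetical, and Horizontal Compression proper works differently, collapsing repeated formula occurrences level by level rather than whole subtrees:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class Proof:
        # A node: its conclusion plus the subproofs of its premises.
        # frozen=True gives structural equality and hashing for free.
        conclusion: str
        premises: Tuple["Proof", ...] = ()

    def tree_size(node: Proof) -> int:
        # Size of the derivation read as a tree (shared subproofs counted repeatedly).
        return 1 + sum(tree_size(p) for p in node.premises)

    def compress(root: Proof) -> dict:
        # Assign one identifier per structurally distinct subproof; the result
        # enumerates the nodes of the corresponding directed acyclic graph.
        table: dict = {}
        def visit(node: Proof) -> int:
            if node not in table:
                for p in node.premises:
                    visit(p)
                table[node] = len(table)
            return table[node]
        visit(root)
        return table

    axiom = Proof("A -> A")
    lemma = Proof("B", (axiom, axiom))
    root = Proof("C", (lemma, lemma))
    print(tree_size(root))      # 7: the tree repeats the lemma and the axiom
    print(len(compress(root)))  # 3: the graph shares them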
|
55 |
Impact of exploration in a dynamic geometry environment on students' concept of proof / Lee, Man-sang, Arthur, 李文生. January 1996 (has links)
published_or_final_version / Education / Master / Master of Education
|
56 |
Graphical representation of canonical proof : two case studies / Heijltjes, Willem Bernard January 2012 (has links)
An interesting problem in proof theory is to find representations of proof that do not distinguish between proofs that are ‘morally’ the same. For many logics, the presentation of proofs in a traditional formalism, such as Gentzen’s sequent calculus, introduces artificial syntactic structure called ‘bureaucracy’; e.g., an arbitrary ordering of freely permutable inferences. A proof system that is free of bureaucracy is called canonical for a logic. In this dissertation two canonical proof systems are presented, for two logics: a notion of proof nets for additive linear logic with units, and ‘classical proof forests’, a graphical formalism for first-order classical logic. Additive linear logic (or sum–product logic) is the fragment of linear logic consisting of linear implication between formulae constructed only from atomic formulae and the additive connectives and units. Up to an equational theory over proofs, the logic describes categories in which finite products and coproducts occur freely. A notion of proof nets for additive linear logic is presented, providing canonical graphical representations of the categorical morphisms and constituting a tractable decision procedure for this equational theory. From existing proof nets for additive linear logic without units by Hughes and Van Glabbeek (modified to include the units naively), canonical proof nets are obtained by a simple graph rewriting algorithm called saturation. Main technical contributions are the substantial correctness proof of the saturation algorithm, and a correctness criterion for saturated nets. Classical proof forests are a canonical, graphical proof formalism for first-order classical logic. Related to Herbrand’s Theorem and backtracking games in the style of Coquand, the forests assign witnessing information to quantifiers in a structurally minimal way, reducing a first-order sentence to a decidable propositional one. A similar formalism ‘expansion tree proofs’ was presented by Miller, but not given a method of composition. The present treatment adds a notion of cut, and investigates the possibility of composing forests via cut-elimination. Cut-reduction steps take the form of a rewrite relation that arises from the structure of the forests in a natural way. Yet reductions are intricate, and initially not well-behaved: from perfectly ordinary cuts, reduction may reach unnaturally configured cuts that may not be reduced. Cut-elimination is shown using a modified version of the rewrite relation, inspired by the game-theoretic interpretation of the forests, for which weak normalisation is shown, and strong normalisation is conjectured. In addition, by a more intricate argument, weak normalisation is also shown for the original reduction relation.
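To see the kind of witnessing information such formalisms record, take the standard ‘drinker’ example (textbook material, not drawn from the dissertation): the classically valid sentence $\exists x\,(D(x) \to \forall y\,D(y))$, after Skolemizing $y$ with a function $f$, needs only two instances of its matrix to become a propositional tautology,

$$(D(c) \to D(f(c))) \;\vee\; (D(f(c)) \to D(f(f(c)))).$$

An expansion tree or proof forest records exactly such a finite set of witnessing terms for each quantifier.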
|
57 |
Čtyřhodnotová sémantika klasické a intuicionistické logiky / A Four-Valued Kripke Semantics for Classical and Intuitionistic Logic / Přenosil, Adam January 2013 (has links)
The thesis introduces a logic which combines intuitionistic implication with de Morgan negation in a way which conservatively extends both classical and intuitionistic logic. This logic is the intuitionistic counterpart of the four-valued Belnap-Dunn logic. In relation to this logic, we study de Morgan algebras and their expansions, in particular their expansion with a constant representing inconsistency. We prove a duality for such algebras extending the Priestley duality. We also introduce a weak notion of modal algebra and prove a duality for such algebras. We then define analytic sequent calculi for various logics of de Morgan negation.
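For background (standard Belnap-Dunn material, not specific to the thesis): the four values are true, false, both and neither, and de Morgan negation acts by

$$\neg\mathbf{t} = \mathbf{f}, \quad \neg\mathbf{f} = \mathbf{t}, \quad \neg\mathbf{b} = \mathbf{b}, \quad \neg\mathbf{n} = \mathbf{n},$$

swapping the two classical values while fixing the inconsistent value $\mathbf{b}$ and the gap value $\mathbf{n}$.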
|
58 |
[en] MULTIPLE SUCCEDENT SEQUENT CALCULUS FOR INTUITIONISTIC FIRST-ORDER LOGIC / [pt] CÁLCULO DE SEQÜENTES DE SUCEDENTE MÚLTIPLO PARA LÓGICA INTUICIONISTA DE PRIMEIRA ORDEM / MARIA FERNANDA PALLARES COLOMAR 08 January 2008 (has links)
[en] The first sequent calculus was presented by Gerhard Gentzen in the thirties. In this kind of system, the difference between intuitionistic logic and classical logic is based on the cardinality of the succedent. Traditionally, the multiple succedent was considered the element that preserved the classical aspect of the system, while intuitionistic sequents could have at most one formula in the succedent. In the following decades, several multiple-succedent intuitionistic calculi were presented that diminished this strong restriction on the cardinality of the succedent. In the 1990s, the cardinality restriction was replaced by a dependency relation between the formula occurrences in the antecedent and in the succedent of a sequent, in order to ensure the intuitionistic character of the system. We review the intuitionistic sequent systems and some of their applications. We present a version of the system FIL (given for the propositional case by De Paiva and Pereira) for first-order intuitionistic logic, proving that it is sound, complete and satisfies the cut-elimination theorem.
|
59 |
Investigações em semânticas construtivas / Investigations on proof-theoretic semantics / Oliveira, Hermogenes Hebert Pereira 14 February 2014 (has links)
Proof-theoretic Semantics provides a new approach to the semantics of logical
constants. It has compelling philosophical motivations which are rooted deeply
in the philosophy of language and the philosophy of mathematics. We investigate
this new approach of logical semantics and its perspective on logical validity in the
light of its own philosophical aspirations, especially as represented by the work
of Dummett (1991). Among our findings, we single out the validity of Peirce’s
rule with respect to a justification procedure based on the introduction rules for
the propositional logical constants. This is an undesirable outcome since Peirce’s
rule is not considered to be constructively acceptable. On the other hand, we also
establish the invalidity of the same inference rule with respect to a justification
procedure based on the elimination rules for the propositional logical constants.
We comment on the implications of this scenario to Dummett’s philosophical programme
and to proof-theoretic semantics in general.
|
60 |
Réalisabilité classique : nouveaux outils et applications / Classical realizability : new tools and applications / Geoffroy, Guillaume 29 March 2019 (has links)
Jean-Louis Krivine's classical realizability defines, from any given model of computation and any given model of set theory, a new model of set theory called the realizability model, in a way similar to forcing. Each realizability model is equipped with a characteristic Boolean algebra $\gimel 2$ (gimel 2), whose structure encodes important information about the properties of the realizability model. For instance, forcing models are precisely the realizability models in which $\gimel 2$ is the Boolean algebra with two elements. This document defines new tools for studying realizability models and exploits them to derive new results. One such result is that, as far as first-order logic is concerned, the theory of Boolean algebras with at least two elements is complete for $\gimel 2$, meaning that for each Boolean algebra B with at least two elements, there exists a realizability model in which $\gimel 2$ is elementarily equivalent to B. Next, two results show that $\gimel 2$ can be used as a tool to study denotational models of programming languages: each takes a particular denotational model and classifies its degrees of parallelism using $\gimel 2$. Moving to set theory, another result generalizes Jean-Louis Krivine's technique of realizing the axiom of dependent choices using the instruction quote to stronger forms of choice. Finally, a last result, which is joint work with Laura Fontanella, complements the previous one by adapting the countable antichain condition from forcing to classical realizability, which seems to open a new, promising approach to the problem of realizing the full axiom of choice.
|