31 |
Environment Analysis of Higher-Order Languages. Might, Matthew Brendon, 29 June 2007.
Any analysis of higher-order languages must grapple with the
tri-faceted nature of lambda. In one construct, the fundamental
control, environment and data structures of a language meet and
intertwine. With the control facet tamed nearly two decades ago, this
work brings the environment facet to heel, defining the environment
problem and developing its solution: environment analysis. Environment
analysis allows a compiler to reason about the equivalence of
environments, i.e., name-to-value mappings, that arise during a
program's execution. In this dissertation, two different
techniques, abstract counting and abstract frame strings, make this
possible. A third technique, abstract garbage collection, makes both
of these techniques more precise and, counter to intuition, often
faster as well. An array of optimizations and even deeper analyses
which depend upon environment analysis provide motivation for this
work.
In an abstract interpretation, a single abstract entity represents a
set of concrete entities. When the entities under scrutiny are
bindings (single name-to-value mappings, the atoms of environment), then
determining when the equality of two abstract bindings implies the
equality of their concrete counterparts is the crux of environment
analysis. Abstract counting does this by tracking the size of
represented sets, looking for singletons, in order to apply the
following principle:
If {x} = {y}, then x = y.
Abstract frame strings enable environmental reasoning by statically
tracking the possible stack change between the births of two
environments; when this change is effectively empty, the environments
are equivalent. Abstract garbage collection improves precision by
intermittently removing unreachable environment structure during
abstract interpretation.
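As a concrete illustration of the counting idea, here is a minimal OCaml sketch of a toy abstract-counting domain; the domain and all names are illustrative assumptions, not Might's implementation.

```ocaml
(* Toy abstract-counting domain: each abstract binding carries a
   conservative count of the concrete bindings it represents,
   saturating at Many. *)
type count = Zero | One | Many

(* Merging two abstract states: counts for the same abstract binding
   add, saturating at Many. *)
let join (a : count) (b : count) : count =
  match a, b with
  | Zero, c | c, Zero -> c
  | _ -> Many                   (* One + One is already "many" *)

(* Allocating a fresh concrete binding on top of an abstract one. *)
let bump : count -> count = function
  | Zero -> One
  | One | Many -> Many

(* The singleton principle: two references to the same abstract binding
   must denote the same concrete binding exactly when that abstract
   binding stands for a singleton set: if {x} = {y}, then x = y. *)
let must_equal (b1 : int) (b2 : int) (count_of : int -> count) : bool =
  b1 = b2 && count_of b1 = One
```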
|
32 |
Aditivní dvojice v kvantitativní teorii typů / Additive Pairs in Quantitative Type Theory. Svoboda, Tomáš, January 2021.
Both dependent types and linear types have desirable properties. Dependent types can express functional dependencies between inputs and outputs, while linear types offer control over the use of computational resources. Combining these two systems has been difficult because of their different treatments of the presence of variables in contexts. Quantitative Type Theory (QTT) combines dependent types and linear types by using a semiring to track the kind of use of every resource. We extend QTT with the additive pair and additive unit types, express the complete QTT rules in bidirectional form, and then present our interpreter for a simple language based on QTT.
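To make the semiring idea concrete, the sketch below shows the common {0, 1, ω} usage semiring with which QTT is often instantiated; the names are illustrative assumptions, not taken from the thesis's interpreter. Additive pairs are precisely the pairs whose two components share the same resources, so their usages are combined pointwise rather than summed by the addition below.

```ocaml
(* The {0, 1, omega} usage semiring often used with QTT:
   Zero = erased (type-level only), One = used exactly once,
   Omega = unrestricted use. Illustrative sketch only. *)
type usage = Zero | One | Omega

(* Semiring addition: total usage of a variable across two subterms. *)
let add (a : usage) (b : usage) : usage =
  match a, b with
  | Zero, u | u, Zero -> u
  | _ -> Omega                  (* 1 + 1 collapses to omega here *)

(* Semiring multiplication: scaling a subterm's usages by how often
   the surrounding context consumes the subterm itself. *)
let mul (a : usage) (b : usage) : usage =
  match a, b with
  | Zero, _ | _, Zero -> Zero
  | One, u | u, One -> u
  | Omega, Omega -> Omega
```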
|
33 |
Higher-order semantics for quantum programming languages with classical control. Atzemoglou, George Philip, January 2012.
This thesis studies the categorical formalisation of quantum computing, through the prism of type theory, in a three-tier process. The first stage of our investigation involves the creation of the dagger lambda calculus, a lambda calculus for dagger compact categories. Our second contribution lifts the expressive power of the dagger lambda calculus to that of a quantum programming language by adding classical control, in the form of complementary classical structures and dualisers. Finally, our third contribution demonstrates how our lambda calculus can be applied to various well-known problems in quantum computation: quantum key distribution, the quantum Fourier transform, and the teleportation protocol.
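For readers unfamiliar with the target structure, the following OCaml module signature roughly sketches what a dagger compact category provides; the encoding is an invented illustration, and the categorical laws it must satisfy are of course not enforced by the signature.

```ocaml
(* Rough sketch of the interface of a dagger compact category.
   Objects are modelled as OCaml types, morphisms as an abstract
   hom type; the laws (associativity, dagger involutivity, snake
   equations, ...) are not captured. *)
module type DAGGER_COMPACT = sig
  type ('a, 'b) hom                           (* morphisms a -> b *)
  type 'a dual                                (* dual object a*   *)
  val id      : ('a, 'a) hom
  val compose : ('a, 'b) hom -> ('b, 'c) hom -> ('a, 'c) hom
  val tensor  : ('a, 'b) hom -> ('c, 'd) hom -> ('a * 'c, 'b * 'd) hom
  val dagger  : ('a, 'b) hom -> ('b, 'a) hom  (* reverses a process *)
  val unit    : (unit, 'a dual * 'a) hom      (* eta: entangled-pair preparation *)
  val counit  : ('a * 'a dual, unit) hom      (* epsilon: the matching projection *)
end
```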
|
34 |
Modélisation logique de la langue et grammaires catégorielles abstraites / Logic modeling of language and Abstract Categorial Grammars. Pompigne, Florent, 11 December 2013.
This thesis focuses on the modelling of the syntax and the syntax-semantics interface of sentences, and investigates how the overgeneration caused by the treatment of long-distance dependencies with higher-order types can be controlled at the level of derivation structures. For this purpose, we study the possibility of extending the type system of Abstract Categorial Grammars with the constructions of disjoint sum, cartesian product and dependent product, which allow syntactic categories to be labelled with feature structures. We first prove that the calculus resulting from this extension enjoys the properties of confluence and normalization, making it possible to identify beta-equivalent terms in the grammatical formalism. We further reduce the same problem for beta-eta-equivalence to a set of initial hypotheses. We then show how these feature structures can be applied to the control of long-distance dependencies, through the examples of case constraints, extraction islands for overt and covert movements, and multiple interrogative extractions, and we discuss the relevance of placing these controls on the derivation structures.
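As a rough picture of the extension just described, the following OCaml datatype sketches a type language combining the ACG core with the three added constructions; the constructors and the encoding of feature structures are invented for illustration.

```ocaml
(* Hypothetical rendering of the extended ACG type language:
   atomic categories decorated with features, linear implication,
   and the three added constructions. *)
type feature = string * string          (* e.g. ("case", "nominative") *)

type ty =
  | Atom  of string * feature list      (* feature-labelled category *)
  | Lolli of ty * ty                    (* implication of the ACG core *)
  | Sum   of ty * ty                    (* disjoint sum *)
  | Prod  of ty * ty                    (* cartesian product *)
  | Dep   of string * (string -> ty)    (* dependent product, schematic *)
```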
|
35 |
Accessibilité des référents en sémantique du discours / Accessibility of Referents in Discourse Semantics. Qian, Sai, 7 November 2014.
This thesis has its roots in the standard Montagovian and dynamic semantic traditions. Its subject is the conditions under which a noun phrase may act as the antecedent of an anaphoric expression. The work deals with the accessibility of discourse referents in a formal system of dynamic semantics. The framework used is the one proposed by de Groote, Type Theoretic Dynamic Logic (TTDL), because it follows the Montagovian tradition and makes use only of standard mathematical and logical tools, which allows compositionality to be maintained. We extend the coverage of TTDL to cases that are naturally problematic for classical dynamic semantic theories: specifically, anaphora under double negation and under modality. An adaptation is defined for each case and, finally, an integration of the various solutions is proposed, which shows the flexibility of TTDL.
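The following OCaml fragment gives a schematic picture of the continuation-based setting in which TTDL works, and of why negation classically blocks accessibility; it is a simplified sketch under invented types, not de Groote's actual definitions.

```ocaml
(* Dynamic propositions take the current context of discourse
   referents and a continuation. *)
type entity = string
type context = entity list                      (* accessible referents *)
type dprop = context -> (context -> bool) -> bool

(* "Someone ...": introduce a referent, extend the context, continue. *)
let dexists (domain : entity list) (p : entity -> dprop) : dprop =
  fun ctx k -> List.exists (fun x -> p x (x :: ctx) k) domain

(* Negation runs its body with a trivial continuation, so referents
   introduced inside do not survive outside: the classical
   inaccessibility that the thesis extends TTDL to lift under
   double negation. *)
let dnot (p : dprop) : dprop =
  fun ctx k -> not (p ctx (fun _ -> true)) && k ctx
```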
|
36 |
Mise au point d'un formalisme syntaxique de haut niveau pour le traitement automatique des langues / A high-level syntactic formalism for natural language processing. Kirman, Jerome, 4 December 2015.
The goal of computational linguistics is to provide a formal account of linguistic knowledge and to produce algorithmic tools for natural language processing. Often this is done in a so-called generative framework, where grammars describe sets of valid sentences by iteratively applying some set of rewrite rules. Another approach, based on model theory, instead describes grammaticality as a set of well-formedness logical constraints, relying on deep links between logic and automata in order to produce efficient parsers. This thesis favours the latter approach. Making use of several existing results in theoretical computer science, we propose a tool for linguistic description that is both expressive and designed to facilitate grammar engineering. It first tackles the abstract structure of sentences, providing a logical language based on the lexical properties of words in order to concisely describe the set of grammatically valid sentences. It then draws the link between these abstract structures and their representations (both in syntax and semantics) through the use of linearization rules that rely on logic and the lambda calculus. To validate this proposal, we use it to model various linguistic phenomena, with a specific focus on languages that exhibit free word order (that is, which allow some of their words or phrases to be reordered without affecting meaning) and on their algorithmic complexity.
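The contrast between the two approaches can be made concrete with a toy sketch: in the model-theoretic view, a grammar is a conjunction of well-formedness constraints checked over a candidate structure rather than a set of rewrite rules. The tree type and the single constraint below are invented illustrations.

```ocaml
(* Candidate syntactic structures. *)
type tree = Leaf of string | Node of string * tree list

(* All nodes of a tree, root included. *)
let rec nodes (t : tree) : tree list =
  match t with
  | Leaf _ -> [ t ]
  | Node (_, children) -> t :: List.concat_map nodes children

(* A constraint is a predicate on nodes; a grammar is a conjunction. *)
let at_most_binary : tree -> bool = function
  | Node (_, children) -> List.length children <= 2
  | Leaf _ -> true

let grammatical (constraints : (tree -> bool) list) (t : tree) : bool =
  List.for_all (fun c -> List.for_all c (nodes t)) constraints
```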
|
37 |
Intersection types and resource control in the intuitionistic sequent lambda calculus. Ivetić, Jelena, 9 October 2013.
This thesis studies computational interpretations of the intuitionistic sequent calculus with implicit and explicit structural rules, with a focus on systems with intersection types. The contributions of the thesis are grouped into three parts. In the first part, intersection types are introduced into the lambda Gentzen calculus. The second part presents an extension of the lambda Gentzen calculus to a term calculus with resource control, i.e. with explicit operators for contraction and weakening, together with an appropriate intersection type assignment system that characterises strong normalisation in the proposed calculus. In the third part, both previously studied calculi are integrated into one framework by introducing the notion of the resource control cube.
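As a rough picture of the objects involved, the following OCaml datatypes sketch intersection types and terms with explicit resource-control operators; the constructor names are invented, not the thesis's notation.

```ocaml
(* Types with intersection. *)
type ty =
  | TVar  of string
  | Arrow of ty * ty
  | Inter of ty * ty        (* a term inhabiting both types at once *)

(* Terms with explicit weakening and contraction. *)
type term =
  | Var   of string
  | Lam   of string * term
  | App   of term * term
  | Weak  of string * term  (* discard the unused resource x *)
  | Contr of string * string * string * term
      (* Contr (x, x1, x2, t): duplicate x as x1 and x2 inside t *)
```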
|
38 |
Fundamentação computacional da matemática intervalar / Computational foundations of interval mathematics. Acioly, Benedito Melo, January 1991.
Interval mathematics rests on two fundamental concepts: the inclusion-monotonicity property of its arithmetic and a Hausdorff topology defined on the set of intervals. Inclusion-monotonicity has proven a useful tool for designing interval algorithms, whereas the Hausdorff topology fails to reflect the logical features of that property, thereby blocking the construction of a logic whose model would be the interval structure equipped with this topology. Such a logic would be necessary to found interval mathematics as a theory of algorithms for real analysis. This thesis shows that the failure to build such a foundation stems from an incompatibility between the inclusion-monotonicity property and the Hausdorff topology. Given this observation, the Hausdorff topology is set aside and another topology is adopted, the Scott topology, which is compatible with the property in the sense that every result obtained using the logic, that is, using inclusion-monotonicity, can also be obtained using the topological tools, and conversely.
The theory that results from replacing the Hausdorff topology by the Scott topology has two fundamental features. The resulting interval functional analysis enjoys most of the interesting properties of real analysis, thus overcoming the shortcomings of earlier interval analysis. The elaboration of the inclusion-monotonicity property makes it possible to construct a geometric logic and a lambda theory whose model is this new interval mathematics. From this logic and lambda theory, a constructive theory similar to Martin-Löf's type theory is then developed, which allows one to reason about programs of this mathematics and hence to check the correctness of interval mathematics programs automatically. The new approach assumes only the concept of rational number, beyond, of course, the language of set theory. The interval system is built analogously to the real number system by generalizing the concept of Dedekind cut; the construction yields an ordered system called a quasi-field, in contrast to the real numbers, which form an algebraic system, the field of real numbers. Thus, in the interval system the order is intrinsic to the system, unlike the real number system, whose order is not part of its algebra. Finally, the logic of this new interval mathematics is a categorical logic: every result obtained for basic domains also applies to their cartesian products, disjoint unions, function spaces, and so on. This considerably simplifies the theory; one example is the definition of the derivative in the new interval mathematics, a concept still not well defined in classical interval theory.
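The first of the two pillars, the inclusion-monotonicity of interval arithmetic, is easy to state concretely; the minimal OCaml sketch below ignores outward rounding and is only an illustration.

```ocaml
(* Closed intervals of floats. *)
type interval = { lo : float; hi : float }

(* Interval addition (outward rounding omitted). *)
let add (a : interval) (b : interval) : interval =
  { lo = a.lo +. b.lo; hi = a.hi +. b.hi }

(* Set inclusion of intervals. *)
let contained_in (a : interval) (b : interval) : bool =
  b.lo <= a.lo && a.hi <= b.hi

(* Inclusion monotonicity: if x is contained in x', then x + y is
   contained in x' + y, and symmetrically in the other argument. *)
let () =
  let x  = { lo = 1.0; hi = 2.0 }
  and x' = { lo = 0.0; hi = 3.0 }
  and y  = { lo = -1.0; hi = 1.0 } in
  assert (contained_in x x');
  assert (contained_in (add x y) (add x' y))
```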
|
40 |
Automated verification of termination certificates / Vérification automatisée de certificats de terminaison. Ly, Kim Quyen, 9 October 2014.
Making sure that a computer program behaves as expected, especially in critical applications (health, transport, energy, communications, etc.), is more and more important, all the more so since computer programs become ever more ubiquitous and essential to the functioning of modern societies. But how can one check that a program behaves as expected, in particular when the range of its inputs is very large or potentially infinite? Stating precisely what the behaviour of a program is first requires a formal logical language. However, as Gödel showed, in any formal system rich enough for arithmetic there are valid formulas that cannot be proved; hence no program can decide whether an arbitrary property is true or false. It is nevertheless possible to write a program that checks the correctness of a proof, and this work uses precisely such a program, Coq, to formally verify the correctness of another program. Specifically, we explain the development of a new, faster and formally proved version of Rainbow, based on the extraction mechanism of Coq.
The previous version of Rainbow verified a CPF file in two steps. First, it used a non-certified OCaml program to translate a CPF file into a Coq script, using the Coq libraries on rewriting theory and termination, CoLoR and Coccinelle. Second, it called Coq to check the correctness of the script. This approach is interesting because it provides a way to reuse in Coq termination proofs generated by tools external to Coq; it is also the approach followed by CiME3. However, it suffers from a number of deficiencies. First, because functions in Coq are interpreted, computation is much slower than with programs written in a standard programming language and compiled into binary code. Second, because the translation from CPF to Coq is not certified, it may contain errors and lead either to the rejection of valid certificates or to the acceptance of wrong ones. To solve the latter problem, one needs to define, and formally prove the correctness of, a function checking whether a certificate is valid. To solve the former, one needs to compile this function to binary code. The present work shows how to solve both problems by using the Coq proof assistant and its extraction mechanism to the OCaml programming language: data structures and functions defined in Coq can be translated to OCaml and then compiled to binary code by the OCaml compiler. A similar approach was first initiated in CeTA, using the Isabelle proof assistant and the Haskell language.
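The shape of the resulting pipeline can be sketched as follows; every name below is hypothetical (this is neither Rainbow's certificate syntax nor its interface), and the point is only that the extracted checker is ordinary OCaml that the compiler turns into fast binary code. In Coq one would define the checker together with a soundness theorem such as: forall c, check c = true -> terminating (system_of c).

```ocaml
(* Hypothetical certificate AST, standing in for the CPF syntax. *)
type cert =
  | PolyInterpretation of (string * int) list
  | DependencyPairs of cert

(* Hypothetical extracted checker: a total boolean function whose
   soundness was proved once and for all in Coq before extraction. *)
let rec check (c : cert) : bool =
  match c with
  | PolyInterpretation coeffs ->
      (* a real checker would verify the ordering constraints here *)
      List.for_all (fun (_, k) -> k >= 0) coeffs
  | DependencyPairs sub -> check sub

let () =
  if check (PolyInterpretation [ ("f", 2); ("g", 0) ])
  then print_endline "CERTIFIED"
  else print_endline "REJECTED"
```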
|