1

Normalisation & equivalence in proof theory & type theory

Lengrand, Stéphane. January 2007.
Thesis (Ph.D.) - University of St Andrews, April 2007.
2

Fully Generic Programming Over Closed Universes of Inductive-Recursive Types

Diehl, Larry. 06 June 2017.
Dependently typed programming languages allow the type system to express arbitrary propositions of intuitionistic logic, thanks to the Curry-Howard isomorphism. Taking full advantage of this type system requires defining more types than usual, in order to encode logical correctness criteria into the definitions of datatypes. While an abundance of specialized types helps ensure correctness, it comes at the cost of needing to redefine common functions for each specialized type. This dissertation attacks the problem of code reuse in dependently typed languages. Our solution is to write generic functions, which can be applied to any datatype. Such a generic function can be applied to datatypes that were defined at the time the generic function was written, but it can also be applied to any datatype defined in the future. Our solution builds upon previous work on generic programming within dependently typed programming. Type theory supports generic programming using a construction known as a universe. A universe can be considered a model of a programming language, such that writing functions over it models writing generic programs in that language. Historically, there has been a trade-off between the expressive power of the modeled programming language and the kinds of generic functions that can be written in it. Our dissertation shows that no such trade-off is necessary: we can write future-proof generic functions in a model of a dependently typed programming language with a rich collection of types.
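To make the universe idea concrete, here is a minimal sketch in Haskell; it is an illustration written for this listing, not code from the dissertation, which works in a dependently typed setting with a far richer, inductive-recursive universe. A closed universe is modeled as a datatype of type codes, and a function written once over the codes (here, a generic equality) applies to every type the universe describes.

```haskell
{-# LANGUAGE GADTs #-}

-- A minimal sketch of a closed universe: type codes plus generic
-- functions defined once over the codes. (Names Code and geq are
-- ours, chosen for illustration.)
data Code a where
  CUnit :: Code ()
  CBool :: Code Bool
  CPair :: Code a -> Code b -> Code (a, b)
  CList :: Code a -> Code [a]

-- Generic equality: one definition covering every code, instead of
-- a separate equality function per datatype.
geq :: Code a -> a -> a -> Bool
geq CUnit       ()       ()       = True
geq CBool       x        y        = x == y
geq (CPair a b) (x1, y1) (x2, y2) = geq a x1 x2 && geq b y1 y2
geq (CList a)   xs       ys       =
  length xs == length ys && and (zipWith (geq a) xs ys)

main :: IO ()
main = print (geq (CList (CPair CBool CUnit)) [(True, ())] [(True, ())])
```

The dissertation's point is that this style of generic programming can be carried out over a closed universe rich enough to model a dependently typed language itself, so the generic functions remain applicable to datatypes defined later.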
3

The Nax Language: Unifying Functional Programming and Logical Reasoning in a Language based on Mendler-style Recursion Schemes and Term-indexed Types

Ahn, Ki Yung. 16 December 2014.
Two major applications of lambda calculi in computer science are functional programming languages and mechanized reasoning systems (or proof assistants). According to the Curry-Howard correspondence, it is possible, in principle, to design a unified language based on a typed lambda calculus for both logical reasoning and programming. However, the different requirements of programming languages and reasoning systems make it difficult to design a unified language that serves both. Programming languages usually extend lambda calculi with programming-friendly features (e.g., recursive datatypes, general recursion) to gain the flexibility to model various computations, at the cost of logical consistency. Logical reasoning systems usually extend lambda calculi with logic-friendly features (e.g., induction principles, dependent types) for paradox-free inference over fine-grained properties, while being more restrictive in modeling computations. In this dissertation, we design and implement a language called Nax that embraces the benefits of both. Nax accepts all recursive datatypes, allowing the same flexibility in defining recursive datatypes as in functional languages. Nax supports a number of Mendler-style recursion schemes that can express various kinds of recursive computations while guaranteeing termination. Nax provides term-indexed types for specifying fine-grained properties. In addition, Nax supports a conservative extension of Hindley-Milner type inference. The theoretical contributions of this dissertation include theories for Mendler-style recursion schemes and term-indexed types, which we developed to establish the strong normalization and logical consistency of Nax.
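As a rough illustration of the Mendler style mentioned above, here is a small Haskell sketch (not Nax code; the names Mu, mcata, and NatF are ours). The algebra passed to mcata receives the recursive call only as an abstract function, which is the restriction that lets Mendler-style schemes control recursion; Nax enforces the resulting termination guarantee in the language itself, whereas plain Haskell does not.

```haskell
{-# LANGUAGE RankNTypes #-}

-- Least fixed point of a base functor.
newtype Mu f = In (f (Mu f))

-- Mendler-style catamorphism: the algebra phi gets the recursive
-- call only as an abstract function (r -> a), so it cannot recurse
-- on anything other than the subterms it is handed.
mcata :: (forall r. (r -> a) -> f r -> a) -> Mu f -> a
mcata phi (In x) = phi (mcata phi) x

-- Natural numbers as a fixed point of a base functor.
data NatF r = Zero | Succ r

toInt :: Mu NatF -> Int
toInt = mcata (\rec n -> case n of
                 Zero   -> 0
                 Succ m -> 1 + rec m)

main :: IO ()
main = print (toInt (In (Succ (In (Succ (In Zero))))))  -- prints 2
```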
4

Normalisation & equivalence in proof theory & type theory

Lengrand, Stéphane J. E. January 2006.
At the heart of the connections between Proof Theory and Type Theory, the Curry-Howard correspondence provides proof-terms with computational features and equational theories, i.e. notions of normalisation and equivalence. This dissertation contributes to extending this framework in the directions of proof-theoretic formalisms (such as sequent calculus) that are appealing for logical purposes like proof-search, of powerful systems beyond propositional logic such as type theories, and of classical (rather than intuitionistic) reasoning.

Part I, entitled Proof-terms for Intuitionistic Implicational Logic, uses rewriting techniques on proof-terms for natural deduction (Lambda-calculus) and sequent calculus, and investigates normalisation and cut-elimination, with call-by-name and call-by-value semantics. In particular, it introduces proof-term calculi for multiplicative natural deduction and for the depth-bounded sequent calculus G4. The former gives rise to the calculus Lambdalxr with explicit substitutions, weakenings and contractions, which refines the Lambda-calculus and Beta-reduction and preserves strong normalisation with a full notion of composition of substitutions. The latter gives new insight into cut-elimination in G4.

Part II, entitled Type Theory in Sequent Calculus, develops a theory of Pure Type Sequent Calculi (PTSC): sequent calculi that are equivalent (with respect to provability and normalisation) to Pure Type Systems but better suited for proof-search, in connection with proof-assistant tactics and proof-term enumeration algorithms.

Part III, entitled Towards Classical Logic, presents some approaches to classical type theory. In particular, it develops a sequent calculus for a classical version of System F_omega. Beyond such a type theory, the notion of equivalence of classical proofs becomes crucial; using such a notion, based on parallel rewriting in the Calculus of Structures, we compute canonical representatives of equivalent proofs.
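For readers less familiar with the Curry-Howard reading that this abstract builds on, the following is a standard sketch (not taken from the thesis): the implication rules of natural deduction annotated with proof-terms, and beta-reduction as the associated notion of normalisation, i.e. the elimination of a detour where an implication is introduced and immediately eliminated.

```latex
% Standard Curry-Howard sketch: implication introduction and
% elimination annotated with proof-terms, and beta-reduction as
% proof normalisation.
\[
\frac{\Gamma,\, x : A \vdash t : B}{\Gamma \vdash \lambda x.\,t : A \to B}
\qquad
\frac{\Gamma \vdash t : A \to B \qquad \Gamma \vdash u : A}{\Gamma \vdash t\,u : B}
\qquad
(\lambda x.\,t)\,u \;\longrightarrow_{\beta}\; t[u/x]
\]
```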
