11.
Analyse phonographématique de l'Arabe en vue d'applications informatiques. Chelyah, Hassane. January 1994.
Thesis (doctoral)--Université de Paris VII, 1994. Description based on print version record. Includes bibliographical references (p. [179]-185).
12.
Lambda-kalkul jako nástroj pro metaprogramování v C++ / λ-calculus as a Tool for Metaprogramming in C++. Šefl, Vít. January 2016.
The template system of C++ is expressive enough to allow the programmer to write programs that are evaluated at compile time. This can be exploited, for example, in generic programming. However, such programs are often hard to write, read, and maintain. We introduce a simple translation from the lambda calculus into C++ templates and show how it can be used to simplify C++ metaprograms. This variation of the lambda calculus is then extended with a Hindley-Milner type system and various other features (Haskell-like syntax, user-defined data types, tools for interaction with existing C++ template code, and so on). We then build a compiler capable of transforming programs written in this language into C++ template metaprograms.
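To give a flavour of the translation idea, here is a minimal sketch of Church-style booleans as C++ template metaprograms (illustrative only, not code from the thesis; the names True, False and If are assumptions), with "application" carried out entirely at compile time:

    // Minimal sketch: lambda-calculus-style terms as C++ templates (illustrative only).
    #include <type_traits>

    // Church-style booleans: a boolean is a "function" selecting one of its two arguments.
    struct True  { template <typename T, typename F> using apply = T; };
    struct False { template <typename T, typename F> using apply = F; };

    // Compile-time "application" of a boolean to two branches.
    template <typename B, typename T, typename F>
    using If = typename B::template apply<T, F>;

    // Evaluated entirely during compilation.
    static_assert(std::is_same<If<True, int, double>, int>::value, "True selects the first branch");
    static_assert(std::is_same<If<False, int, double>, double>::value, "False selects the second branch");

    int main() { return 0; }

The same pattern scales to abstractions and applications in general, which is what a systematic translation from a small functional front end would automate.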
13.
Lambda Calculus for Binary Security and Analysis. Staursky, Joseph N. 30 September 2021.
No description available.
14.
Normalisation & equivalence in proof theory & type theory. Lengrand, Stéphane J. E. January 2006.
At the heart of the connections between Proof Theory and Type Theory, the Curry-Howard correspondence provides proof-terms with computational features and equational theories, i.e. notions of normalisation and equivalence. This dissertation contributes to extending its framework in the directions of proof-theoretic formalisms (such as sequent calculus) that are appealing for logical purposes like proof-search, powerful systems beyond propositional logic such as type theories, and classical (rather than intuitionistic) reasoning. Part I is entitled Proof-terms for Intuitionistic Implicational Logic. Its contributions use rewriting techniques on proof-terms for natural deduction (Lambda-calculus) and sequent calculus, and investigate normalisation and cut-elimination, with call-by-name and call-by-value semantics. In particular, it introduces proof-term calculi for multiplicative natural deduction and for the depth-bounded sequent calculus G4. The former gives rise to the calculus Lambdalxr with explicit substitutions, weakenings and contractions, which refines the Lambda-calculus and Beta-reduction, and preserves strong normalisation with a full notion of composition of substitutions. The latter gives new insight into cut-elimination in G4. Part II, entitled Type Theory in Sequent Calculus, develops a theory of Pure Type Sequent Calculi (PTSC), which are sequent calculi that are equivalent (with respect to provability and normalisation) to Pure Type Systems but better suited for proof-search, in connection with proof-assistant tactics and proof-term enumeration algorithms. Part III, entitled Towards Classical Logic, presents some approaches to classical type theory. In particular, it develops a sequent calculus for a classical version of System F_omega. Beyond such a type theory, the notion of equivalence of classical proofs becomes crucial and, with such a notion based on parallel rewriting in the Calculus of Structures, we compute canonical representatives of equivalent proofs.
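For orientation, the correspondence in its simplest textbook form (standard material, not specific to this thesis) reads the intuitionistic theorem A → (B → A) as the type of the proof-term λx.λy.x, and identifies proof normalisation with β-reduction:

\[
  \lambda x^{A}.\,\lambda y^{B}.\,x \;:\; A \rightarrow (B \rightarrow A),
  \qquad
  (\lambda x.\,M)\,N \;\longrightarrow_{\beta}\; M[x := N].
\]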
15 |
Une étude combinatoire du lambda-calcul avec ressources uniforme / A combinatory study of uniforme resource lambda-calculusMidez, Jean baptiste 15 December 2014 (has links)
The resource lambda-calculus is a variant of the lambda-calculus based on linearity: resource lambda-terms are to lambda-terms what polynomials are to real functions, namely multilinear approximations. In particular, reductions in the resource lambda-calculus can be viewed as approximations of beta-reductions, but the linearity constraint has important consequences, notably the strong normalisation of resource reduction. So to speak, beta-reduction is obtained by passing to the limit of the resource reductions that approximate it. This thesis studies the very rich combinatorial aspects of the resource lambda-calculus. We first define precisely the notion of resource reduction associated with a beta-reduction: given a lambda-term t, an approximant s of t, and a beta-reduction of t to t', we associate with s a resource reduction (called gamma-reduction) that reduces the "same" redexes as the beta-reduction of t and produces a set S' of approximants of t'. This definition yields a slightly more intuitive proof of one of the fundamental theorems of the theory, and also allows it to be generalised. We then study the "family" relations between resource terms, the central question being to characterise the resource terms that are reducts of a same term. This central and difficult problem is not fully resolved, but the thesis presents several preliminary results and lays the foundations of a theory aimed at resolving it.
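As a schematic picture of the "multilinear approximation" analogy (the usual Taylor-expansion view from the resource-calculus literature, not a formula quoted from the thesis), an ordinary application is approximated by resource applications in which the argument is supplied as a finite multiset of copies, each summand being a strongly normalising multilinear term:

\[
  M\,N \;\approx\; \sum_{n \ge 0} \frac{1}{n!}\; M\,[\underbrace{N, \dots, N}_{n\ \text{copies}}].
\]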
16.
The safe lambda calculus. Blum, William. January 2009.
We consider a syntactic restriction for higher-order grammars called safety that constrains occurrences of variables in the production rules according to their type-theoretic order. We transpose and generalize this restriction to the setting of the simply-typed lambda calculus, giving rise to what we call the safe lambda calculus. We analyze its expressivity and obtain a result in the same vein as Schwichtenberg's 1976 characterization of the simply-typed lambda calculus: the numeric functions representable in the safe lambda calculus are exactly the multivariate polynomials; thus the conditional is not definable. We also give a similar characterization for representable word functions. We then examine the complexity of deciding beta-eta equality of two safe simply-typed terms and show that this problem is PSPACE-hard. The safety restriction is then extended to other applied lambda calculi featuring recursion and references, such as PCF and Idealized Algol (IA for short). The next contribution concerns game semantics. We introduce a new concrete presentation of this semantics using the theory of traversals. It is shown that the revealed game denotation of a term can be computed by traversing some souped-up version of the term's abstract syntax tree using adequately defined traversal rules. Based on this presentation, and via syntactic reasoning, we obtain a game-semantic interpretation of safety: the strategy denotations of safe lambda-terms satisfy a property called P-incremental justification, which says that the player's moves are always justified by the last pending opponent's move of greater order occurring in the player's view. Next we look at models of the safe lambda calculus. We show that these are precisely captured by Incremental Closed Categories. A game model is constructed and is shown to be fully abstract for safe IA. Further, it is effectively presentable: two terms are equivalent just if they have the same set of complete O-incrementally justified plays, where O-incremental justification is defined as the dual of P-incremental justification. Finally, we study safety from the point of view of algorithmic game semantics. We observe that in the third-order fragment of IA, the addition of unsafe contexts is conservative for observational equivalence. This implies that all the upper complexity bounds known for the lower-order fragments of IA also hold for the safe fragment; we show that the lower bounds remain the same as well. At order 4, observational equivalence is known to be undecidable for IA. We conjecture that for the order-4 safe fragment of IA, the problem is reducible to the DPDA-equivalence problem and is thus decidable.
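For instance (a standard example recalled informally here, not an excerpt from the thesis), Church-numeral multiplication denotes the polynomial n × m and can be written with a single block of abstractions in a way that meets the safety constraint, whereas a conditional on Church numerals cannot be so written:

\[
  \mathsf{mult} \;=\; \lambda n\,m\,f.\; n\,(m\,f) \;:\; \mathsf{Nat} \rightarrow \mathsf{Nat} \rightarrow \mathsf{Nat},
  \qquad
  \mathsf{Nat} \;=\; (o \rightarrow o) \rightarrow o \rightarrow o.
\]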
17.
Higher-order queries and applications. Vu, Quoc Huy. January 2012.
Higher-order transformations are ubiquitous within data management. In relational databases, higher-order queries appear in numerous aspects including query rewriting and query specification. In XML databases, higher-order functions are natural due to the close connection of XML query languages with functional programming. The thesis investigates higher-order query languages that combine higher-order transformations with ordinary database query languages. We define higher-order query languages based on Relational Algebra, Monad Algebra, and XQuery. The thesis also studies basic problems for these query languages including evaluation, containment, and type inference. We show that even though evaluating these higher-order query languages is non-elementary, there are subclasses that are polynomially reducible to evaluation for ordinary query languages. Our theoretical analysis is complemented by an implementation of the languages, our Higher-Order Mapping Evaluation System (HOMES). The system integrates querying and query transformation in a single higher-order query language. It allows users to write queries that integrate and combine query transformations. The system is implemented on top of traditional database management systems. The evaluation algorithm is optimized by a combination of subquery caching techniques from relational and XML databases and sharing detection schemes from functional programming.
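As a loose analogy in ordinary code (an illustrative sketch only; the types and names below are assumptions and do not reflect the HOMES implementation), a query can be modelled as a function from relations to relations, and a higher-order query as a function that takes such a query and returns a rewritten one:

    // Illustrative sketch of first-order vs. higher-order queries (not the HOMES API).
    #include <functional>
    #include <iostream>
    #include <string>
    #include <vector>

    using Row      = std::vector<std::string>;
    using Relation = std::vector<Row>;
    using Query    = std::function<Relation(const Relation&)>;   // ordinary (first-order) query

    // A first-order query: select rows whose first column equals a given value.
    Query select_first_eq(const std::string& value) {
        return [value](const Relation& r) {
            Relation out;
            for (const auto& row : r)
                if (!row.empty() && row[0] == value) out.push_back(row);
            return out;
        };
    }

    // A higher-order query: a transformation that rewrites a query into another query,
    // here by post-composing it with a fixed filter.
    std::function<Query(Query)> then_filter(const std::string& value) {
        return [value](Query q) {
            return Query([value, q](const Relation& r) {
                return select_first_eq(value)(q(r));
            });
        };
    }

    int main() {
        Relation r = {{"a", "1"}, {"b", "2"}, {"a", "3"}};
        Query q = then_filter("a")(select_first_eq("a"));   // rewrite the query, then evaluate it
        std::cout << q(r).size() << '\n';                    // prints 2
    }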
18.
Du typage vectoriel / On vectorial typing. Diaz Caro, Alejandro. 23 September 2011.
The objective of this thesis is to develop a type theory for the linear-algebraic λ-calculus, an extension of the λ-calculus motivated by quantum computing. This algebraic extension encompasses all the terms of the λ-calculus together with their linear combinations, so if t and r are terms, then α.t + β.r is also a term, with α and β scalars taken from a given ring. The key idea and challenge of this thesis is to introduce a type system where the types, in the same way as the terms, form a vector space, thereby exposing the structure of the normal form of a term. This thesis presents the system Lineal, as well as three intermediate systems that are interesting in their own right: Scalar, Additive, and λCA, each with its proofs of subject reduction and strong normalisation.
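Schematically (and not in the thesis's exact notation), the guiding idea is that scalars and sums on terms are mirrored by scalars and sums on types, so that a typing judgement records the vectorial shape of a term's normal form:

\[
  \frac{\Gamma \vdash t : T \qquad \Gamma \vdash r : R}
       {\Gamma \vdash \alpha.t + \beta.r \;:\; \alpha.T + \beta.R}
\]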
19.
Development and verification of probability logics and logical frameworks. Maksimovic, Petar. 15 October 2013.
The research for this thesis has followed two main paths: that of probability logics and that of type systems and logical frameworks, bringing them together through interactive theorem proving. With the development of computer technology and the need to capture real-world dynamics, situations, and problems, reasoning under uncertainty has become one of the more important research topics of today, and probability logics are among the tools for formalizing this kind of knowledge. Given that probability logics, serving as decision-making or decision-support systems, often form a basis for expert systems that find their application in fields such as game theory or medicine, their correct functioning is of great importance, and formal verification of their properties would add an additional level of security to the design process. On the other hand, in the field of logical frameworks and interactive theorem proving, attention has been directed towards a more natural way of encoding formal systems whose derivation rules are subject to side conditions that are either rather difficult or impossible to encode naively in the Edinburgh Logical Framework LF, or in any other type-theory-based logical framework, due to their inherent limitations, or to the fact that the formal systems in question need to access the derivation context, the structure of the derivation itself, or other structures and mechanisms not available at the object level. The first part of the thesis deals with the development and formal verification of probability logics. First, we introduce a Probability Logic with Conditional Operators - LPCP - together with its syntax, semantics, and a sound and strongly complete axiomatic system featuring an infinitary inference rule. We prove the obtained formalism decidable, and extend it so as to represent evidence, making it the first propositional axiomatization of reasoning about evidence. Next, we show how to encode the probability logics LQI and LQnI in the proof assistant Coq. Both of these logics extend classical logic with modal-like probability operators, and both feature an infinitary inference rule; LQI allows iterations of probability operators, while LQnI does not. We proceed to formally verify their key properties: soundness, strong completeness, and non-compactness. In this way, we formally justify the use of probabilistic SAT-solvers for checking consistency-related questions. In the second part of the thesis, we present LFP - a Logical Framework with External Predicates - introducing a mechanism for locking and unlocking types and terms in LF, allowing the use of external oracles. We prove that LFP satisfies all of the main meta-theoretical properties (strong normalization, confluence, subject reduction, decidability of type checking). We develop a corresponding canonical framework, allowing for easy proofs of encoding adequacy. We provide a number of encodings - the simple untyped lambda-calculus with a Call-by-Value reduction strategy, the Design-by-Contract paradigm, a small imperative language with Hoare Logic, Modal Logics in Hilbert and Natural Deduction style, and Non-Commutative Linear Logic (encoded for the first time in an LF-like framework) - illustrating that in LFP we can encode side-conditions on the application of rules elegantly, and achieve a separation between derivation and computation, resulting in cleaner and more readable proofs. We believe that the results presented in this thesis can serve as a foundation for fruitful future research.
On the one hand, the obtained formal correctness proofs add an additional level of security to the construction of expert systems based on the verified logics, and pave the way for further formal verification of other probability logics. On the other hand, there is room for further improvement, extension, and deeper analysis of the LFP framework, as well as for building a prototype interactive theorem prover based on LFP and discovering its place in the world of proof assistants.
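To give a flavour of the notation in this family of logics (sketched informally, not in the thesis's exact syntax), the modal-like probability operators attach probability bounds to formulas, and the conditional operators constrain conditional probabilities:

\[
  P_{\ge s}\,\alpha
  \qquad\text{and}\qquad
  CP_{\ge s}(\alpha \mid \beta),
\]

read, respectively, as "the probability of α is at least s" and "the conditional probability of α given β is at least s".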
20.
Développement et Vérification des Logiques Probabilistes et des Cadres Logiques. Maksimovic, Petar. 15 October 2013.
We present a Probability Logic with Conditional Operators - LPCP - with its syntax, semantics, and a sound and strongly complete axiomatization that includes an infinitary inference rule. We prove that LPCP is decidable and extend it so that it can represent evidence, thus creating the first propositional axiomatization of evidence-based reasoning. We encode the probability logics LPP1^Q and LPP2^Q in the proof assistant Coq and formally verify their main properties: soundness, strong completeness, and non-compactness. Both logics extend classical logic with probability operators and feature an infinitary inference rule; LPP1^Q allows iterations of the probability operators, whereas LPP2^Q does not. We thereby formally justify the use of probabilistic SAT solvers for checking consistency-related questions. We present LFP, a Logical Framework with External Predicates, introducing a mechanism for locking and unlocking types and terms in LF and allowing the use of external oracles. We show that LFP satisfies all the main properties and develop a corresponding canonical framework, which makes it possible to prove the adequacy of encodings. We provide various encodings - the untyped λ-calculus with a call-by-value reduction strategy, Design-by-Contract programming, an imperative language with Hoare Logic, Modal Logics, and Non-Commutative Linear Logic - showing that in LFP side-conditions on the application of typing rules can easily be encoded, achieving a separation between verification and computation and yielding clearer, more readable proofs.