1

The partial lambda calculus

Moggi, Eugenio, January 1988
This thesis investigates various formal systems for reasoning about partial functions or partial elements, with particular emphasis on lambda calculi for partial functions. Beeson's (intuitionistic) logic of partial terms (LPT) is taken as the basic formal system and some of its metamathematical properties are established (for later application). Three different flavours of Scott's logic of partial elements (LPE) are considered and it is shown that they are conservative extensions of LPT. This result, we argue, corroborates the choice of LPT as the basic formal system. Variants of LPT are introduced for reasoning about partial terms with a restriction operator ↾ (LPT+↾), monotonic partial functions (monLPT), λ-terms (the λ_p-calculus) and λY-terms (the λ_pμY-calculus). The expressive powers of some (in)equational fragments are compared in LPT and its variants. Two equational formal systems are related to some of the logics above: Obtulowicz's p-equational logic is related to LPT+↾ and Plotkin's λ_v-calculus is related to one flavour of LPE. The deductive powers of LPT and its variants are compared, using various techniques (among them logical relations). The main conclusion drawn from this comparison is that there are four different lambda calculi for partial functions, given by the choices intuitionistic or classical, and partial or monotonic partial functions. An (in)equational presentation of the intuitionistic lambda calculus for (monotonic) partial functions is given as an extension of p-equational logic. We conjecture that there is no equational presentation of the classical λ_p-calculus. Via a special kind of diamond property, the (in)equational formal system is characterized in terms of β-reduction for partial functions and some decidability problems are solved.
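For readers unfamiliar with this setting, here is a minimal illustration, in LaTeX notation, of the kind of restricted β-rule such calculi employ; it is a generic sketch rather than a quotation from the thesis. Writing t↓ for "t denotes", abstractions always denote, while β-conversion is only asserted when the argument does:

    (\lambda x.\,M)\downarrow
    N\downarrow \;\Rightarrow\; (\lambda x.\,M)\,N = M[N/x] \qquad (\text{partial } \beta)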
2

Type assignment in the lambda-calculus: Syntax and semantics

Ben-Yelles, C. B., January 1979
No description available.
3

A new program for combinatory reduction and abstraction

Deshpande, Sushant (University of Lethbridge, Faculty of Arts and Science), January 2009
Even though the lambda calculus (λ-calculus) and combinatory logic (CL) appear to be equivalent, they are not: as yet there is no reduction in CL that corresponds exactly to β-reduction in the λ-calculus. There are three proposals, but they all have a few problems, one of which is the lack of a complete characterization of the CL-terms corresponding to λ-terms in β-normal form. Finding such a characterization for any of the three proposals appears to require a large number of examples, which are tedious and time-consuming to develop by hand. For this reason, a computer program to perform reductions and abstractions of CL-terms would be useful. This thesis describes an attempt to write such a program. The program does not yet work for the three proposals, but it does work for βη-strong reduction. Coding it turned out to be much harder than anticipated. Dr. Robin Cockett developed a semantic translation that helped in coding the program, but this translation needs to be extended to all three proposals to obtain the program originally desired, and that requires substantial further research.
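To make the λ-to-CL correspondence concrete, the following Haskell sketch implements the classic S/K/I bracket abstraction together with weak reduction of the resulting combinatory terms. It illustrates the general technique only; it is not the program developed in the thesis and does not implement βη-strong reduction or any of the three proposals.

    -- Illustrative only: translate lambda terms to S/K/I combinators via the
    -- classic bracket abstraction, then normalize by weak (leftmost) reduction.
    data Lam = LVar String | LApp Lam Lam | LAbs String Lam

    data CL = S | K | I | CVar String | CApp CL CL
      deriving (Eq, Show)

    -- Translate a lambda term into combinatory logic.
    toCL :: Lam -> CL
    toCL (LVar x)   = CVar x
    toCL (LApp m n) = CApp (toCL m) (toCL n)
    toCL (LAbs x m) = abstract x (toCL m)

    -- Bracket abstraction [x]M (no optimizations).
    abstract :: String -> CL -> CL
    abstract x (CVar y) | x == y        = I
    abstract x m | not (freeIn x m)     = CApp K m
    abstract x (CApp m n)               = CApp (CApp S (abstract x m)) (abstract x n)
    abstract _ m                        = m   -- unreachable for well-formed input

    freeIn :: String -> CL -> Bool
    freeIn x (CVar y)   = x == y
    freeIn x (CApp m n) = freeIn x m || freeIn x n
    freeIn _ _          = False

    -- One leftmost weak-reduction step, if any redex exists.
    step :: CL -> Maybe CL
    step (CApp I x)                   = Just x
    step (CApp (CApp K x) _)          = Just x
    step (CApp (CApp (CApp S f) g) x) = Just (CApp (CApp f x) (CApp g x))
    step (CApp m n)                   = case step m of
                                          Just m' -> Just (CApp m' n)
                                          Nothing -> fmap (CApp m) (step n)
    step _                            = Nothing

    -- Reduce to weak normal form.
    normalize :: CL -> CL
    normalize t = maybe t normalize (step t)

    main :: IO ()
    main = do
      let k' = toCL (LAbs "x" (LAbs "y" (LVar "x")))             -- yields S (K K) I
      print (normalize (CApp (CApp k' (CVar "a")) (CVar "b")))   -- prints CVar "a"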
4

Structured representation of composite software changes

Chabra, Aarti, 13 December 2011
In a software development cycle, programs go through many iterations. Identifying and understanding program changes is a tedious but necessary task for programmers, especially when software is developed in a collaborative environment. Existing tools used by programmers either fail to find structural differences or report them only as atomic changes, such as updates of individual syntax-tree nodes. Programmers frequently use program restructuring techniques, such as refactorings, that are composed of several individual atomic changes. Current version differencing tools miss these high-level changes and report just the set of individual atomic changes. When a large number of refactorings are performed, the number of reported atomic changes becomes very large, making the program differences very difficult to understand. This problem can be addressed by reporting program differences as composite changes, saving programmers the effort of navigating through the individual atomic changes. This thesis proposes a methodology for analyzing the atomic changes reported by existing version differencing tools in order to infer composite changes. First, we illustrate the different approaches that can be used to represent object-language program differences using a variation representation. Next, we present the process of composite change inference from the structured representation of atomic changes. This process relies on patterns that specify the expected structure of an expression corresponding to each composite change to be inferred; the information in these patterns is then used to design the change inference algorithm. The composite changes inferred from a given expression are annotated in the expression, allowing the changes to be reported as desired. / Graduation date: 2012
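As a purely hypothetical illustration of pattern-based composite change inference (the data types and the pattern below are ours, not the representation used in the thesis), here is a Haskell sketch that collapses a consistent set of atomic updates into a single composite rename:

    -- Hypothetical atomic changes reported by a differencing tool.
    data Atomic
      = UpdateDef String String   -- a definition renamed: old name, new name
      | UpdateRef String String   -- a reference updated: old name, new name
      deriving (Eq, Show)

    -- A composite change inferred from a pattern over atomic changes.
    data Composite = Rename String String
      deriving (Eq, Show)

    -- If exactly one definition was renamed and every reference was updated
    -- consistently, report one composite Rename instead of the atomic changes.
    inferRename :: [Atomic] -> Maybe Composite
    inferRename changes =
      case [ (old, new) | UpdateDef old new <- changes ] of
        [(old, new)] | all (consistent old new) changes -> Just (Rename old new)
        _                                               -> Nothing
      where
        consistent old new (UpdateRef o n) = o == old && n == new
        consistent _   _   (UpdateDef _ _) = True

    main :: IO ()
    main = print (inferRename
      [ UpdateDef "size" "length"
      , UpdateRef "size" "length"
      , UpdateRef "size" "length" ])   -- prints Just (Rename "size" "length")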
5

Fexprs as the basis of Lisp function application; or, $vau: the ultimate abstraction

Shutt, John N., 1 September 2010
"Abstraction creates custom programming languages that facilitate programming for specific problem domains. It is traditionally partitioned according to a two-phase model of program evaluation, into syntactic abstraction enacted at translation time, and semantic abstraction enacted at run time. Abstractions pigeon-holed into one phase cannot interact freely with those in the other, since they are required to occur at logically distinct times. Fexprs are a Lisp device that subsumes the capabilities of syntactic abstraction, but is enacted at run-time, thus eliminating the phase barrier between abstractions. Lisps of recent decades have avoided fexprs because of semantic ill-behavedness that accompanied fexprs in the dynamically scoped Lisps of the 1960s and 70s. This dissertation contends that the severe difficulties attendant on fexprs in the past are not essential, and can be overcome by judicious coordination with other elements of language design. In particular, fexprs can form the basis for a simple, well-behaved Scheme-like language, subsuming traditional abstractions without a multi-phase model of evaluation. The thesis is supported by a new Scheme-like language called Kernel, created for this work, in which each Scheme-style procedure consists of a wrapper that induces evaluation of operands, around a fexpr that acts on the resulting arguments. This arrangement enables Kernel to use a simple direct style of selectively evaluating subexpressions, in place of most Lisps' indirect quasiquotation style of selectively suppressing subexpression evaluation. The semantics of Kernel are treated through a new family of formal calculi, introduced here, called vau calculi. Vau calculi use direct subexpression-evaluation style to extend lambda calculus, eliminating a long-standing incompatibility between lambda calculus and fexprs that would otherwise trivialize their equational theories. The impure vau calculi introduce non-functional binding constructs and unconventional forms of substitution. This strategy avoids a difficulty of Felleisen's lambda-v-CS calculus, which modeled impure control and state using a partially non-compatible reduction relation, and therefore only approximated the Church-Rosser and Plotkin's Correspondence Theorems. The strategy here is supported by an abstract class of Regular Substitutive Reduction Systems, generalizing Klop's Regular Combinatory Reduction Systems."
6

Models and theories of lambda calculus

Manzonetto, Giulio, 18 February 2008
This thesis is mainly concerned with the principal semantics of the λ-calculus (namely Scott's continuous semantics, the stable semantics, and the strongly stable semantics), but it also introduces and studies two new semantics: the relational semantics and the indecomposable semantics. Models of the pure λ-calculus can be defined either as reflexive objects in Cartesian closed categories (categorical models) or as combinatory algebras satisfying Curry's five axioms and the Meyer-Scott axiom (λ-models). Concerning categorical models, we show that every categorical model can be presented as a λ-model, even when the underlying ccc (Cartesian closed category) does not have enough points, and we give sufficient conditions for a categorical model living in an arbitrary cpo-enriched ccc to have H as its equational theory. We construct a categorical model living in a ccc of sets and relations (the relational semantics) that satisfies these conditions. Moreover, we show that the associated λ-model enjoys algebraic properties that make it suitable for modelling non-deterministic extensions of the λ-calculus. Concerning combinatory algebras, we show that they satisfy a generalization of the Stone Representation Theorem, stating that every combinatory algebra is isomorphic to a weak Boolean product of directly indecomposable combinatory algebras. We study the semantics of the λ-calculus whose models are directly indecomposable as combinatory algebras (the indecomposable semantics); in particular, we prove that this semantics is general enough to include the three principal semantics on the one hand, and the term models of all semi-sensible λ-theories on the other. However, we also show that it is largely incomplete. Finally, we study the question of the existence of a non-syntactic model of the λ-calculus belonging to the principal semantics and having a recursively enumerable (r.e.) equational or inequational theory. This question is a natural generalization of the problem of Honsell and Ronchi Della Rocca (open for more than twenty years) concerning the existence of a continuous model of λβ or λβη. We introduce a suitable notion of effective model of the λ-calculus, which covers in particular all the models that have been introduced individually in the literature, and we prove that the inequational theory of an effective model is never r.e.; consequently its equational theory cannot be λβ or λβη. We also show that the equational theory of an effective model living in the stable or strongly stable semantics is never r.e. Concerning Scott's continuous semantics, we prove that the inequational theory of a graph model is never r.e. and that there exist many effective graph models whose equational theory is not r.e.
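As brief background (not taken from the thesis, and omitting Curry's five axioms themselves), the combinatory-algebra equations and one common formulation of the Meyer-Scott axiom mentioned above can be recalled in LaTeX notation as:

    k\,x\,y = x \qquad s\,x\,y\,z = x\,z\,(y\,z)
    \forall x\,(a\,x = b\,x) \;\Rightarrow\; \mathbf{1}\,a = \mathbf{1}\,b, \qquad \text{where } \mathbf{1} \equiv s\,(k\,i),\; i \equiv s\,k\,k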
7

Encoding XQuery using System F

Xia, Yun, January 2005
Since the World Wide Web Consortium (W3C) has recommended XQuery as the standard XML query language, interest in using existing relational technology to query XML data has increased dramatically. The most significant challenge of the relational approach is how to fully support XQuery semantics in the XQuery-to-SQL translation. To eliminate the implicit semantics of XQuery, an XQuery fragment must be defined with simple syntax and explicit semantics; XQ is proposed as such a fragment for expressing XML queries.

In this thesis, XQ is investigated intensively. It is encoded in System F, a second-order lambda calculus with considerable expressive power and a strong normalization property. Since XML data is defined in System F as inductive data types (XML tree and XML forest), all basic XML operators of XQ have been successfully encoded. The semantics of XQ are also represented in System F, where XQ's semantic environment is encoded by an Environment data type with the corresponding operators. The successful encoding of XQ in System F ensures the termination of XQ query evaluation.

Moreover, an extension of XQ with a new tree operator Xtree and a vertical Vfor clause is proposed in this thesis to express some undefinable XQ queries. It is demonstrated that this extension still allows XQ to retain its XQ-to-SQL translation property, which ensures polynomial evaluation time complexity, and its System F encodability, which ensures the termination of query evaluation.
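To make "inductive data types in System F" concrete, here, in LaTeX notation, is the standard impredicative (Böhm-Berarducci-style) encoding of lists and of label-rose trees; it illustrates the general technique and is not necessarily the exact XML tree/forest encoding used in the thesis (Label stands for the type of element names):

    \mathsf{List}\,A \;\equiv\; \forall X.\; X \to (A \to X \to X) \to X
    \mathsf{nil} \;\equiv\; \Lambda X.\, \lambda n{:}X.\, \lambda c{:}A \to X \to X.\; n
    \mathsf{cons} \;\equiv\; \lambda a{:}A.\, \lambda \ell{:}\mathsf{List}\,A.\, \Lambda X.\, \lambda n{:}X.\, \lambda c{:}A \to X \to X.\; c\,a\,(\ell\,X\,n\,c)
    \mathsf{Tree} \;\equiv\; \forall X.\; (\mathsf{Label} \to \mathsf{List}\,X \to X) \to X

Because System F is strongly normalizing, any query expressed as a fold over such encoded data is guaranteed to terminate, which is the property the abstract appeals to.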
8

Analyse phonographématique de l'Arabe en vue d'applications informatiques (Phono-graphemic analysis of Arabic for computer applications)

Chelyah, Hassane, January 1994
Thesis (doctoral)--Université de Paris VII, 1994. / Includes bibliographical references (p. [179]-185). Also issued in print.
9

Normalisation & equivalence in proof theory & type theory

Lengrand, Stéphane, January 2007
Thesis (Ph.D.) - University of St Andrews, April 2007.
