81

The principle of predicate exchangeability in pure inductive logic

Kliess, Malte Sebastian January 2014 (has links)
We investigate the Principle of Predicate Exchangeability in the framework of Pure Inductive Logic. While this principle was known to Rudolf Carnap, who initiated research in Inductive Logic, it has been somewhat neglected in the past. After providing the framework of Pure Inductive Logic, we will show Representation Theorems for probability functions satisfying Predicate Exchangeability, filling the gap in the list of Representation Theorems for functions satisfying certain rational principles. We then introduce a new principle, called the Principle of Strong Predicate Exchangeability, which is weaker than the well-known Principle of Atom Exchangeability but stronger than Predicate Exchangeability, and give examples of functions that satisfy this principle. Finally, we extend the framework of Inductive Logic to Second Order languages, which allows for increasing a rational agent’s expressive strength. We introduce Wilmers’ Principle, a rational principle that rational agents might want to adopt in this extended framework, and give a representation theorem for this principle.
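For orientation, the principle itself has a concise statement in the standard framework of pure inductive logic (given here as background, not quoted from the thesis): a probability function w on the sentences of a predicate language satisfies Predicate Exchangeability when its values are unchanged by swapping predicate symbols of the same arity.

```latex
% Predicate Exchangeability (Px), stated for a probability function w on
% the sentences of a language with predicate symbols P_1, P_2, ...:
% w is invariant under transposing two predicate symbols of the same arity.
\[
  w(\theta) \;=\; w\!\left(\theta[P_i / P_j,\; P_j / P_i]\right)
  \qquad \text{for all sentences } \theta \text{ and all such } i, j .
\]
```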
82

Automated discovery of inductive lemmas

Johansson, Moa January 2009 (has links)
The discovery of unknown lemmas, case-splits and other so-called eureka steps is a challenging problem for automated theorem proving and has generally been assumed to require user intervention. This thesis is mainly concerned with the automated discovery of inductive lemmas. We have explored two approaches based on failure recovery and theory formation, with the aim of improving automation of first- and higher-order inductive proofs in the IsaPlanner system. We have implemented a lemma speculation critic which attempts to find a missing lemma using information from a failed proof-attempt. However, we found few proofs for which this critic was applicable and successful. We have also developed a program for inductive theory formation, which we call IsaCoSy. IsaCoSy was evaluated on different inductive theories about natural numbers, lists and binary trees, and found to successfully produce many relevant theorems and lemmas. Using a background theory produced by IsaCoSy, it was possible for IsaPlanner to automatically prove more new theorems than with lemma speculation. In addition to the lemma discovery techniques, we also implemented an automated technique for case-analysis. This allows IsaPlanner to deal with proofs involving conditionals, expressed as if- or case-statements.
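As an illustration of the kind of lemma such techniques aim to discover (an illustrative example, not one drawn from the thesis), proving that reversing a list twice is the identity gets stuck in the inductive step until an auxiliary fact about reversal and append is found:

```haskell
-- A classic example of a "missing" inductive lemma (illustrative only).
-- Proving   rev (rev xs) == xs   by induction on xs gets stuck in the
-- step case unless one first discovers the auxiliary lemma
--   rev (xs ++ [x]) == x : rev xs
-- which is exactly the kind of fact a lemma-discovery tool must find.

rev :: [a] -> [a]
rev []       = []
rev (x : xs) = rev xs ++ [x]

-- QuickCheck-style statements of the goal and the auxiliary lemma.
prop_revRev :: [Int] -> Bool
prop_revRev xs = rev (rev xs) == xs

prop_revSnoc :: Int -> [Int] -> Bool
prop_revSnoc x xs = rev (xs ++ [x]) == x : rev xs
```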
83

Canonical extensions of bounded lattices and natural duality for default bilattices

Craig, Andrew Philip Knott January 2012 (has links)
This thesis presents results concerning canonical extensions of bounded lattices and natural dualities for quasivarieties of default bilattices. Part I is dedicated to canonical extensions, while Part II focuses on natural duality for default bilattices. A canonical extension of a lattice-based algebra consists of a completion of the underlying lattice and extensions of the additional operations to the completion. Canonical extensions find rich application in providing an algebraic method for obtaining relational semantics for non-classical logics. Part I gives a new construction of the canonical extension of a bounded lattice. The construction is done via successive applications of functors and thus provides an elegant exposition of the fact that the canonical extension is functorial. Many existing constructions are described via representation and duality theorems. We demonstrate precisely how our new formulation relates to existing constructions as well as proving new results about complete lattices constructed from graphs. Part I ends with an analysis of the untopologised structures used in two methods of construction of canonical extensions of bounded lattices: the untopologised graphs used in our new construction, and the so-called `intermediate structure'. We provide sufficient conditions for the intermediate structure to be a lattice and, for the case of finite lattices, identify when the dual graph is not a minimal representation of the lattice. Part II applies techniques from natural duality theory to obtain dualities for quasivarieties of bilattices used in default logic. Bilattices are doubly-ordered algebraic structures which find application in reasoning about inconsistent and incomplete information. This account is the first attempt to provide dualities or representations when there is little interaction required between the two orders. Our investigations begin by using computer programs to calculate dualities for specific examples, before using purely theoretical techniques to obtain dualities for more general cases. The results obtained are extremely revealing, demonstrating how one of the lattice orders from the original algebra is encoded in the dual structure. We conclude Part II by describing a new class of default bilattices. These provide an alternative way of interpreting contradictory information. We obtain dualities for two newly-described quasivarieties and provide insights into how these dual structures relate to previously described classes of dual structures for bilattices.
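For background, the canonical extension is usually characterised abstractly by a density and a compactness condition; the sketch below states this standard characterisation and is not a summary of the new construction given in the thesis.

```latex
% The canonical extension e : L -> L^\delta of a bounded lattice L is a
% completion of L characterised (up to isomorphism) by:
% (density)     every element of L^\delta is both a join of meets and a
%               meet of joins of elements of e[L];
% (compactness) for all S, T \subseteq L,
\[
  \bigwedge e[S] \le \bigvee e[T]
  \;\Longrightarrow\;
  \bigwedge e[S'] \le \bigvee e[T']
  \;\text{ for some finite } S' \subseteq S,\ T' \subseteq T .
\]
```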
84

Development and evaluation of computer-aided assessment in discrete and decision mathematics

Zaczek, Kinga January 2015 (has links)
This thesis describes the development of Computer-Aided Assessment questions for elementary discrete and decision mathematics at the school/university interface, stressing the pedagogy behind the questions’ design and the development of a methodology for assessing their efficacy in improving students’ engagement and perceptions, as well as their exam results. The questions give instant and detailed feedback and hence are valuable as diagnostic, formative or summative tools. A total of 275 questions were designed and coded for five topics (numbers, sets, logic, linear programming and graph theory) commonly taught to students of mathematics, computer science, engineering and management. Pedagogy and programming problems with authoring questions were resolved and are discussed in specific topic contexts and beyond. The delivery of robust and valid objective questions, even within the constraints of CAA, is therefore feasible. Different question types, together with rich feedback comprising text, equations and diagrams and allowing random parameters to produce millions of realisations at run time, can give CAA an important role in teaching mathematics at this level. Questionnaires identified that CAA was generally popular with students, with the vast majority seeing CAA not only as assessment but also as a learning resource. To test the impact of CAA on students’ learning, an analysis of the exam scripts quantified its effect on class means and standard deviations. This also identified common student errors, which fed into the question design and editing processes by providing evidence-based mal-rules. Four easily-identified indicators (correctly-written remainders, conversion of binary/octal/hexadecimal numbers, use of correct set notation {…} and consistent layout of truth tables) were examined in student exam scripts to find out if the CAA helps students to improve examination answers. The CAA answer files also provided the questions’ facilities and discriminations, potentially giving teachers specific information on which to base and develop their teaching and assessment strategies. We conclude that CAA is a successful tool for the formative/summative assessment of mathematics at this level and has a positive effect on students’ learning.
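A minimal sketch of the general mechanism described above, namely random parameters generated at run time plus a distractor built from an evidence-based mal-rule, might look as follows. The data type, function names and the particular mal-rule (reporting a negative remainder) are illustrative assumptions, not taken from the thesis.

```haskell
import System.Random (randomRIO)

-- A minimal sketch of a parameterised CAA question on remainders.
-- The distractor encodes one plausible mal-rule: reporting a negative
-- remainder instead of one in the range 0 .. b-1.
data Question = Question
  { prompt     :: String
  , correct    :: Int
  , distractor :: Int
  } deriving Show

remainderQuestion :: IO Question
remainderQuestion = do
  a <- randomRIO (-50, -10)           -- a negative dividend provokes the error
  b <- randomRIO (3, 12)
  let right = a `mod` b                  -- remainder in 0 .. b-1
      wrong = negate (abs a `mod` b)     -- mal-rule: negative remainder
  if right == wrong                      -- b divides a: distractor collapses
    then remainderQuestion               -- re-draw the parameters
    else pure Question
           { prompt = "What is the remainder when " ++ show a
                      ++ " is divided by " ++ show b ++ "?"
           , correct = right
           , distractor = wrong
           }
```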
85

Advanced mathematics and deductive reasoning skills : testing the Theory of Formal Discipline

Attridge, Nina January 2013 (has links)
This thesis investigates the Theory of Formal Discipline (TFD): the idea that studying mathematics develops general reasoning skills. This belief has been held since the time of Plato (2003/375 B.C.), and has been cited in recent policy reports (Smith, 2004; Walport, 2010) as an argument for why mathematics should hold a privileged place in the UK's National Curriculum. However, there is no rigorous research evidence that justifies the claim. The research presented in this thesis aims to address this shortcoming. Two questions are addressed in the investigation of the TFD: is studying advanced mathematics associated with development in reasoning skills, and if so, what might be the mechanism of this development? The primary type of reasoning measured is conditional inference validation (e.g. 'if p then q; not p; therefore not q'). In two longitudinal studies it is shown that the conditional reasoning behaviour of mathematics students at AS level and undergraduate level does change over time, but that it does not become straightforwardly more normative. Instead, mathematics students reason more in line with the 'defective' interpretation of the conditional, under which they assume p and reason about q. This leads to the assumption that not-p cases are irrelevant, which results in the rejection of two commonly-endorsed invalid inferences, but also in the rejection of the valid modus tollens inference. Mathematics students did not change in their reasoning behaviour on a thematic syllogisms task or a thematic version of the conditional inference task. Next, it is shown that mathematics students reason significantly less in line with a defective interpretation of the conditional when it is phrased 'p only if q' compared to when it is phrased 'if p then q', despite the two forms being logically equivalent. This suggests that their performance is determined by linguistic features rather than the underlying logic. The final two studies investigated the heuristic and algorithmic levels of Stanovich's (2009a) tri-process model of cognition as potential mechanisms of the change in conditional reasoning skills. It is shown that mathematicians' defective interpretation of the conditional stems in part from heuristic level processing and in part from effortful processing, and that the executive function skills of inhibition and shifting at the algorithmic level are correlated with its adoption. It is suggested that studying mathematics regularly exposes students to implicit 'if then' statements where they are expected to assume p and reason about q, and that this encourages them to adopt a defective interpretation of conditionals. It is concluded that the TFD is not supported by the evidence; while mathematics does seem to develop abstract conditional reasoning skills, the result is not more normative reasoning.
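For reference, the four inference forms standardly used in conditional-inference tasks, and their status under the defective interpretation discussed above, can be summarised as follows (a standard summary given for background, not an excerpt from the thesis):

```latex
% The four inference forms for a conditional "if p then q":
\[
\begin{array}{llll}
  \text{Modus ponens (MP):}                  & p \to q,\; p      & \therefore\; q      & \text{valid}\\
  \text{Modus tollens (MT):}                 & p \to q,\; \neg q & \therefore\; \neg p & \text{valid}\\
  \text{Denial of the antecedent (DA):}      & p \to q,\; \neg p & \therefore\; \neg q & \text{invalid}\\
  \text{Affirmation of the consequent (AC):} & p \to q,\; q      & \therefore\; p      & \text{invalid}
\end{array}
\]
% Under the defective interpretation, not-p cases are irrelevant, so the
% invalid DA and AC are (correctly) rejected, but the valid MT is
% rejected as well.
```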
86

Fuzzy land cover change detection and validation : a comparison of fuzzy and Boolean analyses in Tripoli City, Libya

Khmag, Abdulhakim Emhemad January 2013 (has links)
This research extends fuzzy methods to consider the fuzzy validation of fuzzy land cover data at the sub-pixel level. The study analyses the relationships between fuzzy memberships generated by field survey and those generated from the classification of remotely sensed data. In so doing it examines the variations in the relationship between observed and predicted fuzzy land cover classes. This research applies three land cover classification techniques: fuzzy sets, fuzzy c-means and Boolean classification, and develops three models to determine fuzzy land cover change. The first model is dependent on fuzzy object change. The second model depends on the sub-pixel change through a fuzzy change matrix, for both fuzzy sets and fuzzy c-means, to compute the fuzzy change, fuzzy loss and fuzzy gain. The third model is a Boolean change model which evaluates change on a pixel-by-pixel basis. The results show that using a fuzzy change analysis presents a subtle way of mapping a heterogeneous area with common mixed pixels. Furthermore, the results show that the fuzzy change matrix gives more detail and information about land cover change and is more appropriate than fuzzy object change because it deals with sub-pixel change. Finally, the research has found that a fuzzy error matrix is more suitable than an error matrix for soft classification validation because it can compare the membership from the field with the classified image. From this research there arise some important points:
• Fuzzy methodologies have the ability to define the uncertainties associated with describing the phenomenon itself and the ability to take into consideration the effect of mixed pixels.
• This research compared fuzzy sets and fuzzy c-means, and found the fuzzy set is more suitable than fuzzy c-means, because the latter suffers from some disadvantages, chiefly that the sum of membership values of a data point in all the clusters must be one, so the algorithm has difficulty in handling outlying points.
• This research validates fuzzy classifications by determining the fuzzy memberships in the field and comparing them with the memberships derived from the classified image.
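A minimal sketch of how a sub-pixel fuzzy change matrix of the kind described above could be accumulated from per-pixel membership vectors at two dates, assuming the common choice of the minimum operator for fuzzy intersection (the operator and the function names are illustrative assumptions, not the thesis's exact formulation):

```haskell
import Data.List (transpose)

-- Memberships of one pixel in k land-cover classes at one date, each in [0,1].
type Memberships = [Double]

-- Entry (i, j) of the per-pixel fuzzy change: membership in class i at
-- date 1 intersected with membership in class j at date 2 (min operator).
pixelChange :: Memberships -> Memberships -> [[Double]]
pixelChange date1 date2 = [ [ min mi mj | mj <- date2 ] | mi <- date1 ]

-- Accumulate the fuzzy change matrix over all pixels of the scene
-- (assumes a non-empty list of pixels, all with k classes).
fuzzyChangeMatrix :: [(Memberships, Memberships)] -> [[Double]]
fuzzyChangeMatrix =
  foldr1 (zipWith (zipWith (+))) . map (uncurry pixelChange)

-- Fuzzy loss of class i: off-diagonal mass in row i of the matrix;
-- fuzzy gain of class j: the same computed on the transposed matrix.
fuzzyLoss, fuzzyGain :: [[Double]] -> [Double]
fuzzyLoss m = [ sum [ x | (j, x) <- zip [0 :: Int ..] row, j /= i ]
              | (i, row) <- zip [0 ..] m ]
fuzzyGain = fuzzyLoss . transpose
```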
87

Large-N reduced models of SU(N) lattice gauge theories

Vairinhos, Hélvio January 2007 (has links)
No description available.
88

Type theory in a type theory with quotient inductive types

Kaposi, Ambrus January 2017 (has links)
Type theory (with dependent types) was introduced by Per Martin-Löf with the intention of providing a foundation for constructive mathematics. A part of constructive mathematics is type theory itself, hence we should be able to say what type theory is using the formal language of type theory. In addition, metatheoretic properties of type theory such as normalisation should be provable in type theory. The usual way of defining type theory formally is by starting with an inductive definition of precontexts, pretypes and preterms and, as a second step, defining a ternary typing relation over these three components. Well-typed terms are those preterms for which there exists a precontext and pretype such that the relation holds. However, if we use the rich metalanguage of type theory to talk about type theory, we can define well-typed terms directly as an inductive family indexed over contexts and types. We believe that this latter approach is closer to the spirit of type theory, where objects come intrinsically with their types. Internalising a type theory with dependent types is challenging because of the mutual definitions of types, terms, substitution of terms and the conversion relation. We use induction-induction to express this mutual dependency. Furthermore, to reduce the type-theoretic boilerplate needed for reasoning in the syntax, we encode the conversion relation as the equality type of the syntax. Because we use equality constructors, we define the syntax as a quotient inductive type (a special case of higher inductive types from homotopy type theory). We define the syntax of a basic type theory with dependent function space, a base type and a family over the base type as a quotient inductive-inductive type. The definition of the syntax comes with a notion of model and an eliminator: whenever one is able to define a model, the eliminator provides a function from the syntax to the model. We show that this method of representing type theory is practically feasible by defining a number of models: the standard model, the logical predicate interpretation for parametricity (as a syntactic translation) and the proof-relevant presheaf logical predicate interpretation. By extending the latter with a quote function back into the syntax, we prove normalisation for type theory. This can be seen as a proof of normalisation by evaluation. Internalising the syntax of type theory is not only of theoretical interest. It opens the possibility of type-theoretic metaprogramming in a type-safe way. This could be used for generic programming in type theory and to implement extensions of type theory which are justified by models, such as guarded type theory or homotopy type theory.
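The intrinsic style of definition, in which only well-typed terms exist at all, can be illustrated for a simply typed fragment with a Haskell GADT. This sketch is only an analogy: the thesis internalises a dependently typed syntax as a quotient inductive-inductive type, which Haskell cannot express.

```haskell
{-# LANGUAGE GADTs, DataKinds, KindSignatures #-}
import Data.Kind (Type)

-- Object-language types and contexts.
data Ty  = Base | Arr Ty Ty
data Ctx = Empty | Snoc Ctx Ty

-- De Bruijn variables, indexed by the context and the type they project.
data Var :: Ctx -> Ty -> Type where
  Here  :: Var ('Snoc g a) a
  There :: Var g a -> Var ('Snoc g b) a

-- Well-typed terms: ill-typed syntax cannot even be written down,
-- so no separate typing relation is needed.
data Tm :: Ctx -> Ty -> Type where
  V   :: Var g a -> Tm g a
  Lam :: Tm ('Snoc g a) b -> Tm g ('Arr a b)
  App :: Tm g ('Arr a b) -> Tm g a -> Tm g b

-- Example: the identity function at any object type a, in any context.
identity :: Tm g ('Arr a a)
identity = Lam (V Here)
```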
89

Practical reasoning in probabilistic description logic

Klinov, Pavel January 2011 (has links)
Description Logics (DLs) form a family of languages which correspond to decidable fragments of First-Order Logic (FOL). They have been overwhelmingly successful for constructing ontologies - conceptual structures describing domain knowledge. Ontologies proved to be valuable in a range of areas, most notably, bioinformatics, chemistry, Health Care and Life Sciences, and the Semantic Web. One limitation of DLs, as fragments of FOL, is their restricted ability to cope with various forms of uncertainty. For example, medical knowledge often includes statistical relationships, e.g., findings or results of clinical trials. Currently it is maintained separately, e.g., in Bayesian networks or statistical models. This often hinders knowledge integration and reuse, leads to duplication and, consequently, inconsistencies. One answer to this issue is probabilistic logics which allow for smooth integration of classical, i.e., expressible in standard FOL or its sub-languages, and uncertain knowledge. However, probabilistic logics have long been considered impractical because of discouraging computational properties. Those are mostly due to the lack of simplifying assumptions, e.g., independence assumptions which are central to Bayesian networks. In this thesis we demonstrate that deductive reasoning in a particular probabilistic DL, called P-SROIQ, can be computationally practical. We present a range of novel algorithms, in particular, the probabilistic satisfiability procedure (PSAT) which is, to our knowledge, the first scalable PSAT algorithm for a non-propositional probabilistic logic. We perform an extensive performance and scalability evaluation on different synthetic and natural data sets to justify practicality. In addition, we study theoretical properties of P-SROIQ by formally translating it into a fragment of first-order logic of probability. That allows us to gain a better insight into certain important limitations of P-SROIQ. Finally, we investigate its applicability from the practical perspective, for instance, use it to extract all inconsistencies from a real rule-based medical expert system. We believe the thesis will be of interest to developers of probabilistic reasoners. Some of the algorithms, e.g., PSAT, could also be valuable to the Operations Research community since they are heavily based on mathematical programming. Finally, the theoretical analysis could be helpful for designers of future probabilistic logics.
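For background, probabilistic satisfiability can be phrased as a linear feasibility problem over possible worlds, which is why mathematical-programming and column-generation techniques apply. The formulation below is the generic propositional version, not the P-SROIQ procedure developed in the thesis.

```latex
% Given statements "the probability of \phi_i lies in [l_i, u_i]",
% PSAT asks whether there is a distribution \mu over the worlds W:
\[
  \exists\, \mu : W \to [0,1] \quad\text{such that}\quad
  \sum_{w \in W} \mu(w) = 1
  \quad\text{and}\quad
  l_i \;\le\; \sum_{w \,\models\, \phi_i} \mu(w) \;\le\; u_i
  \quad\text{for every } i .
\]
% The number of worlds is exponential, so practical algorithms generate
% worlds (columns of the linear program) lazily rather than enumerating W.
```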
90

An axiom system for a spatial logic with convexity

Trybus, Adam January 2012 (has links)
A spatial logic is any formal language with a geometric interpretation. Region-based spatial logics, where variables are set to range over certain subsets of geometric space, have recently been investigated within the qualitative spatial reasoning paradigm in AI. We axiomatised the theory of (ROQ(R²), conv, ≤), where ROQ(R²) is the set of regular open rational polygons of the real plane, conv is the convexity property and ≤ is the inclusion relation. We proved soundness and completeness theorems. We also proved several expressiveness results. Additionally, we provide a historical and philosophical overview of the topic and present contemporary results relating to affine spatial logics.
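For readers unfamiliar with the signature, the two notions it combines have standard definitions (stated here as background, not as the thesis's axioms):

```latex
% A set X \subseteq \mathbb{R}^2 is regular open if it equals the
% interior of its closure,
\[
  X = \operatorname{int}(\operatorname{cl}(X)),
\]
% and X is convex if it contains the segment between any two of its points:
\[
  t\,x + (1 - t)\,y \in X
  \quad\text{for all } x, y \in X \text{ and } t \in [0,1] .
\]
```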
