  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Logics of Knowledge and Cryptography : Completeness and Expressiveness

Cohen, Mika January 2007 (has links)
An understanding of cryptographic protocols requires that we examine the knowledge of protocol participants and adversaries: when a participant receives a message, does she know who sent it? Does she know that the message is fresh, and not merely a replay of some old message? Does a network spy know who is talking to whom? This thesis studies logics of knowledge and cryptography. Specifically, it addresses the problem of how to make the concept of knowledge reflect feasible computability within a Kripke-style semantics. The main contributions are as follows.

1. A generalized Kripke semantics for first-order epistemic logic and cryptography, where the latter is modeled using private constants and arbitrary cryptographic operations, as in the Applied Pi-calculus.
2. An axiomatization of first-order epistemic logic which is sound and complete relative to an underlying theory of cryptographic terms and to an omega-rule for quantifiers. Besides standard axioms and rules from first-order epistemic logic, the axiomatization includes some novel axioms for the interaction between knowledge and cryptography.
3. Epistemic characterizations of static equivalence and Dolev-Yao message deduction.
4. A generalization of Kripke semantics for propositional epistemic logic and symmetric cryptography.
5. Decidability, soundness and completeness for propositional BAN-like logics with respect to message passing systems. Completeness and decidability are generalised to logics induced from an arbitrary base of protocol-specific assumptions.
6. An epistemic definition of message deduction. The definition lies between weaker and stronger versions of Dolev-Yao deduction, and coincides with the weaker Dolev-Yao deduction on all atomic messages. For composite messages, the definition withstands a well-known counterexample to Dolev-Yao deduction.
7. Protocol examples using mixes, a Crowds-style protocol, and electronic payments. / QC 20100524
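The Dolev-Yao message deduction mentioned in the abstract can be conveyed with a toy closure computation; the message encoding below is an assumption of this sketch, not the thesis's formalism. Starting from an adversary's observed messages, we repeatedly project pairs and decrypt ciphertexts whose keys are already known.

```python
# Toy Dolev-Yao deduction. Messages are tuples: ("pair", m1, m2) is a pair,
# ("enc", body, key) is symmetric encryption; anything else is atomic.
# This encoding is an illustrative choice, not the semantics of the thesis.

def dy_closure(knowledge):
    """Close a set of messages under pair projection and decryption."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for m in list(known):
            if isinstance(m, tuple) and m[0] == "pair":
                for part in m[1:]:
                    if part not in known:
                        known.add(part)
                        changed = True
            elif isinstance(m, tuple) and m[0] == "enc":
                _, body, key = m
                if key in known and body not in known:
                    known.add(body)
                    changed = True
    return known

# The key arrives paired with the ciphertext, so the secret is deducible.
obs = {("pair", "k", ("enc", "secret", "k"))}
print("secret" in dy_closure(obs))  # True
```

Without the key, the ciphertext stays opaque: `dy_closure({("enc", "s", "k")})` deduces nothing new, which is exactly the idealization of perfect cryptography that Dolev-Yao deduction embodies.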
52

Learning with Markov logic networks : transfer learning, structure learning, and an application to Web query disambiguation

Mihalkova, Lilyana Simeonova 18 March 2011 (has links)
Traditionally, machine learning algorithms assume that training data is provided as a set of independent instances, each of which can be described as a feature vector. In contrast, many domains of interest are inherently multi-relational, consisting of entities connected by a rich set of relations. For example, the participants in a social network are linked by friendships, collaborations, and shared interests; likewise, the users of a search engine are related by searches for similar items and clicks on shared sites. The ability to model and reason about such relations is essential, not only because better predictive accuracy is achieved by exploiting this additional information, but also because frequently the goal is to predict whether a set of entities are related in a particular way. This thesis falls within the area of Statistical Relational Learning (SRL), which combines ideas from two traditions within artificial intelligence, first-order logic and probabilistic graphical models, to address the challenge of learning from multi-relational data. We build on one particular SRL model, Markov logic networks (MLNs), which consist of a set of weighted first-order-logic formulae and provide a principled way of defining a probability distribution over possible worlds. We develop algorithms for learning MLN structure both from scratch and by transferring a previously learned model, as well as an application of MLNs to the problem of Web query disambiguation. The ideas we present are unified by two main themes: the need to deal with limited training data and the use of bottom-up learning techniques.

Structure learning, the task of automatically acquiring a set of dependencies among the relations in the domain, is a central problem in SRL. We introduce BUSL, an algorithm for learning MLN structure from scratch that proceeds in a more bottom-up fashion, breaking away from the tradition of top-down learning typical in SRL. Our approach first constructs a novel data structure called a Markov network template that is used to restrict the search space for clauses. Our experiments in three relational domains demonstrate that BUSL dramatically reduces the search space for clauses and attains significantly higher accuracy than a structure learner that follows a top-down approach.

Accurate and efficient structure learning can also be achieved by transferring a model obtained in a source domain related to the current target domain of interest. We view transfer as a revision task and present an algorithm that diagnoses a source MLN to determine which of its parts transfer directly to the target domain and which need to be updated. This analysis focuses the search for revisions on the incorrect portions of the source structure, thus speeding up learning. Transfer learning is particularly important when target-domain data is limited, such as when data on only a few individuals is available from domains with hundreds of entities connected by a variety of relations. We also address this challenging case and develop a general transfer learning approach that makes effective use of such limited target data in several social network domains.

Finally, we develop an application of MLNs to the problem of Web query disambiguation in a more privacy-aware setting, where the only information available about a user is that captured in a short search session of 5-6 previous queries on average. This setting contrasts with previous work, which typically assumes the availability of long user-specific search histories. To compensate for the scarcity of user-specific information, our approach exploits the relations between users, search terms, and URLs. We demonstrate the effectiveness of our approach in the presence of noise and show that it outperforms several natural baselines on a large data set collected from the MSN search engine. / text
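The probability distribution an MLN defines has the standard log-linear form P(X=x) = (1/Z) exp(Σᵢ wᵢ nᵢ(x)), where nᵢ(x) counts the true groundings of formula i in world x. A minimal sketch of that semantics (the toy formula, weight, and domain below are illustrative assumptions, not taken from the thesis):

```python
import math
from itertools import product

# Toy Markov logic network over a two-person domain. The single weighted
# formula "Smokes(x) => Cancer(x)" and its weight are illustrative only.
PEOPLE = ["anna", "bob"]

def n_smoking_causes_cancer(world):
    # Count true groundings of Smokes(x) => Cancer(x).
    return sum(1 for x in PEOPLE
               if (not world[("Smokes", x)]) or world[("Cancer", x)])

FORMULAS = [(1.5, n_smoking_causes_cancer)]  # (weight, grounding counter)

def unnormalized(world):
    return math.exp(sum(w * n(world) for w, n in FORMULAS))

def probability(world):
    """Normalize by summing over all possible worlds (feasible only for toys)."""
    atoms = sorted(world)
    z = sum(unnormalized(dict(zip(atoms, bits)))
            for bits in product([False, True], repeat=len(atoms)))
    return unnormalized(world) / z

w = {("Smokes", "anna"): True, ("Cancer", "anna"): True,
     ("Smokes", "bob"): False, ("Cancer", "bob"): False}
print(probability(w))  # worlds satisfying more groundings are likelier
```

A world in which Anna smokes without cancer violates one grounding and therefore gets a strictly lower probability, which is the soft-constraint behaviour that distinguishes MLNs from hard first-order axioms.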
53

Axiomatized Relationships between Ontologies

Chui, Carmen 21 November 2013 (has links)
This work focuses on the axiomatized relationships between different ontologies of varying levels of expressivity. Motivated by experiences in the decomposition of first-order logic ontologies, we partially decompose the Descriptive Ontology for Linguistic and Cognitive Engineering (DOLCE) into modules. By leveraging automated reasoning tools to semi-automatically verify the modules, we provide an account of the meta-theoretic relationships found between DOLCE and other existing ontologies. As well, we examine the composition process required to determine relationships between DOLCE modules and the Process Specification Language (PSL) ontology. Then, we propose an ontology based on the semantically-weak Computer Integrated Manufacturing Open System Architecture (CIMOSA) framework by augmenting its constructs with terminology found in PSL. Finally, we attempt to map two semantically-weak product ontologies together to analyze the applications of ontology mappings in e-commerce.
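The meta-theoretic relationships verified in this work (for instance, that one module's axioms entail a statement from another) are established with automated reasoners over first-order theories; a toy propositional analogue conveys the idea. The mini-"modules" below are invented for illustration and only loosely echo DOLCE-style vocabulary.

```python
from itertools import product

# Brute-force propositional entailment between two tiny "modules".
# Real ontology modules such as those of DOLCE or PSL are first-order
# and require a genuine automated reasoner; this is only an analogue.

def entails(axioms, goal, atoms):
    """True iff every assignment satisfying all axioms also satisfies goal."""
    for bits in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, bits))
        if all(ax(v) for ax in axioms) and not goal(v):
            return False
    return True

# Module A: "endurant implies entity" and "object implies endurant".
module_a = [lambda v: (not v["endurant"]) or v["entity"],
            lambda v: (not v["object"]) or v["endurant"]]
# Candidate theorem from another module: "object implies entity".
goal_b = lambda v: (not v["object"]) or v["entity"]

print(entails(module_a, goal_b, ["entity", "endurant", "object"]))  # True
```

The same check run on a non-theorem (say, the bare claim "entity") returns False, showing how a reasoner separates genuine consequences of a module from statements that merely look plausible.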
55

Interactive system for converting natural language sentences into first-order predicate logic, with automatic entry of sentences and generation of hints for the user

Περίκος, Ισίδωρος 07 April 2011 (has links)
Knowledge representation is an important field of artificial intelligence. While in everyday life we represent knowledge about the world in natural language, computer systems require a notation that provides a precise representation of knowledge, which natural language cannot offer because its sentences are ambiguous. One such representation language is First-Order (Predicate) Logic (FOL). FOL as a language for knowledge representation and automated reasoning has many aspects; the one addressed in this thesis is the conversion of natural language (NL) sentences into FOL. This is an ad-hoc process for which no specific algorithm exists. In this thesis, a system was developed that models the process of converting NL sentences into FOL and automates the entry of example sentences by the teacher-user. Through an interactive user interface, it also guides the student-user during the conversion, providing help and hints for each sentence. The teacher-user can enter example sentences with their FOL formulas into the system; each FOL formula is then automatically analysed into the steps of the process, and the appropriate elements are stored. A further aspect of the thesis is the semi-automatic generation of suitable hints at all steps of the process for each sentence, which required a categorization of the levels of the hints provided and a standardization of the wording of the corresponding messages. / Knowledge Representation is a fundamental topic of Artificial Intelligence. In everyday life people use natural language to communicate; however, natural language cannot be used for knowledge representation in computer systems, mainly because it does not have clear semantics. A basic KR language is First-Order Logic (FOL), the main representative of logic-based representation languages, which is part of almost any introductory AI course and textbook. Teaching FOL as a knowledge representation and reasoning language includes many aspects. One of them is the translation of natural language (NL) sentences into FOL formulas, often called logic formalization of NL sentences. It is an ad-hoc process; there is no specific algorithm that can be automated within a computer, mainly because NL lacks the clear semantics that FOL has. During this master's thesis, a web-based interactive system was developed whose main aim is to provide a structured process to students and guide them in translating an NL sentence into a FOL formula. An assistant system was also created to automate the insertion of new sentences: the teacher can enter a sentence in natural language together with its FOL formula, and the formula is automatically analyzed so that the information necessary for the translation is extracted and stored. Another piece of work done during this master's thesis is the implementation of a semi-automatic help generation system, whose aim is to recognize students' errors and provide help and guidelines during the stages of the conversion process.
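The flavour of the structured NL-to-FOL conversion the system teaches can be hinted at with a toy pattern-based translator; the sentence pattern and predicate naming below are invented for illustration, and the real system's step-by-step analysis is far richer.

```python
# Toy NL-to-FOL translation for the single pattern "Every <subject> <verb>s",
# rendered as the standard universally quantified implication. The predicate
# naming convention here is an assumption of this sketch.

def every_to_fol(subject, verb):
    """Translate 'Every <subject> <verb>s' into a FOL formula string."""
    return f"forall x ({subject.capitalize()}(x) -> {verb.capitalize()}(x))"

print(every_to_fol("student", "study"))
# forall x (Student(x) -> Study(x))
```

A hint generator in this style could check a student's answer against the generated formula step by step: first the quantifier, then the connective (a common error is using conjunction instead of implication with "every"), then the predicates.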
56

Validating reasoning heuristics using next generation theorem provers

Steyn, Paul Stephanes 31 January 2009 (has links)
The specification of enterprise information systems using formal specification languages enables the formal verification of these systems. Reasoning about the properties of a formal specification is a tedious task that can be greatly facilitated by an automated reasoner. However, set theory is a cornerstone of many formal specification languages and poses demanding challenges to automated reasoners. To this end, a number of heuristics have been developed to aid the Otter theorem prover in finding short proofs for set-theoretic problems. This dissertation investigates the applicability of these heuristics to next-generation theorem provers. / Computing / M.Sc. (Computer Science)
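The flavour of the set-theoretic facts such heuristics target can be shown by brute-force checking a distributivity law over a small finite universe; a theorem prover, by contrast, derives the law from the axioms of set theory for all sets at once. A minimal sketch:

```python
from itertools import chain, combinations

# Exhaustively check A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C) over every triple of
# subsets of a tiny universe. This finite enumeration only illustrates the
# statement; it is not how a prover like Otter establishes it.

def subsets(u):
    """All subsets of the iterable u."""
    items = list(u)
    return [set(s) for s in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

universe = {1, 2, 3}
ok = all(a & (b | c) == (a & b) | (a & c)
         for a in subsets(universe)
         for b in subsets(universe)
         for c in subsets(universe))
print(ok)  # True: the identity holds for every triple of subsets
```

The prover's difficulty is that the axiomatic search space for even such simple identities is enormous, which is exactly where domain-specific heuristics earn their keep.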
57

Mechanising Knot Theory

Prathamesh, Turga Venkata Hanumantha January 2014 (has links) (PDF)
Mechanisation of mathematics refers to the use of computers to generate or check proofs in mathematics. It involves translation of relevant mathematical theories from one system of logic to another, to render these theories implementable in a computer; this process is termed formalisation of mathematics. Two among the many ways of mechanising are:

1. Generating results using automated theorem provers.
2. Interactive theorem proving in a proof assistant, which involves a combination of user intervention and automation.

In the first part of this thesis, we reformulate the question of equivalence of two links in first-order logic using braid groups. This is achieved by developing a set of axioms whose canonical model is the braid group on infinitely many strands, B∞. This renders the problem of distinguishing knots and links amenable to implementation in automated theorem provers based on first-order logic. We further state and prove results pertaining to models of the braid axioms.

The second part of the thesis deals with formalising knot theory in higher-order logic using the interactive proof assistant Isabelle. We formulate equivalence of links in higher-order logic, obtain a construction of the Kauffman bracket in Isabelle, and further obtain a machine-checked proof of the invariance of the Kauffman bracket.
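The braid axioms behind the first part centre on the familiar relations σᵢσⱼ = σⱼσᵢ for |i − j| ≥ 2 and σᵢσᵢ₊₁σᵢ = σᵢ₊₁σᵢσᵢ₊₁. A quick sanity check in the symmetric-group quotient, where σᵢ maps to the adjacent transposition (i, i+1), can be sketched as follows (this quotient forgets the braiding, so it checks the relations but cannot distinguish a generator from its inverse):

```python
# Check the braid relations in the permutation quotient of the braid group,
# where generator sigma_i acts as the adjacent transposition (i, i+1).

def sigma(i, n):
    """Permutation of sigma_i on n strands, as a tuple mapping index -> image."""
    p = list(range(n))
    p[i], p[i + 1] = p[i + 1], p[i]
    return tuple(p)

def compose(p, q):
    """Composition p after q."""
    return tuple(p[q[k]] for k in range(len(p)))

def word(gens, n):
    """Permutation of a word in the generators, applied left to right."""
    result = tuple(range(n))
    for i in gens:
        result = compose(sigma(i, n), result)
    return result

n = 4
# sigma_0 sigma_1 sigma_0 == sigma_1 sigma_0 sigma_1   (braid relation)
print(word([0, 1, 0], n) == word([1, 0, 1], n))  # True
# sigma_0 sigma_2 == sigma_2 sigma_0                   (distant generators commute)
print(word([0, 2], n) == word([2, 0], n))        # True
```

In B∞ itself the generators have infinite order, which is what makes a first-order axiomatisation with this group as canonical model a genuinely stronger statement than the permutation check above.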
58

Tableau-based reasoning for decidable fragments of first-order logic

Reker, Hilverd Geert January 2012 (has links)
Automated deduction procedures for modal logics, and for related decidable fragments of first-order logic, are used in many real-world applications. A popular way of obtaining decision procedures for these logics is to base them on semantic tableau calculi. We focus on calculi that use unification, instead of the more widely employed approach of generating ground instantiations over the course of a derivation. The most common type of tableaux with unification are so-called free-variable tableaux, where variables are treated as global to the entire tableau. A long-standing open problem for procedures based on free-variable tableaux is how to ensure fairness, in the sense that "equivalent" applications of the closure rule are prevented from being done over and over again. Some solutions, such as depth-first iterative deepening, are known, but those are unnecessary in theory and not very efficient in practice. This is a main reason why there are hardly any decision procedures for modal logics based on free-variable tableaux.

In this thesis, we review existing work on incorporating unification into first-order and modal tableau procedures, show how the closure fairness problem arises, and discuss existing solutions to it. For the first-order case, we outline a calculus which addresses the closure fairness problem. As opposed to free-variable tableaux, closure fairness is much easier to achieve in disconnection tableaux and similar clausal calculi. We therefore focus on using clausal first-order tableau calculi for decidable classes, in particular the two-variable fragment. Using the so-called unrestricted blocking mechanism for enforcing termination, we present the first ground tableau decision procedure for this fragment. Even for such a ground calculus, guaranteeing that depth-first derivations terminate is highly non-trivial. We parametrise our procedure by a so-called lookahead amount, and prove that this parameter is crucial in determining whether depth-first derivations terminate. Extending these ideas to tableaux with unification, we specify a preliminary disconnection tableau procedure which uses a non-grounding version of the unrestricted blocking rule.
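Tableaux with unification rest on syntactic unification of first-order terms; a compact Robinson-style sketch is shown below. The term encoding (variables as strings starting with `?`, compound terms as tuples) is an assumption of this sketch, not the thesis's notation.

```python
# Syntactic unification of first-order terms, Robinson style, with occurs check.
# Variables are strings beginning with '?'; compound terms are tuples
# (functor, arg1, ...); any other value is a constant.

def substitute(term, subst):
    """Apply a substitution, following chains of variable bindings."""
    if isinstance(term, str) and term.startswith("?"):
        return substitute(subst[term], subst) if term in subst else term
    if isinstance(term, tuple):
        return (term[0],) + tuple(substitute(a, subst) for a in term[1:])
    return term

def occurs(var, term, subst):
    term = substitute(term, subst)
    if term == var:
        return True
    return isinstance(term, tuple) and any(occurs(var, a, subst) for a in term[1:])

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None if none exists."""
    subst = {} if subst is None else subst
    s, t = substitute(s, subst), substitute(t, subst)
    if s == t:
        return subst
    if isinstance(s, str) and s.startswith("?"):
        return None if occurs(s, t, subst) else {**subst, s: t}
    if isinstance(t, str) and t.startswith("?"):
        return unify(t, s, subst)
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# f(?x, g(a)) unifies with f(b, g(?y)) under the mgu {?x: b, ?y: a}.
print(unify(("f", "?x", ("g", "a")), ("f", "b", ("g", "?y"))))
```

The fairness problem the thesis addresses arises precisely because a closure rule driven by such unifiers can keep producing substitutions that are "equivalent" up to variable renaming, so an unconstrained strategy may revisit them forever.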
59

Generation of Formal Specifications for Avionic Software Systems

Gulati, Pranav 02 October 2020 (has links)
Development of software for electronic systems in the aviation industry is strongly regulated by pre-defined standards. The aviation industry spends a significant part of its development costs on ensuring flight safety and showing conformance to these standards. Some safety requirements can be satisfied by performing formal verification, which is seen as a way to reduce the cost of showing conformance of software with its requirements, expressed as formal specifications. The correctness of the formal specifications is therefore critical, yet writing them is at least as difficult as developing the software itself [36]. This work proposes an approach to generate formal specifications from example data that illustrates the natural-language requirements and represents the ground truth about the system. It eases the task of an engineer who has to write formal specifications by allowing the engineer to specify the example data instead. A relationship model and a marking syntax and semantics are proposed that make the creation of formal specifications goal-oriented. The evaluation of the approach shows that the proposed syntax and semantics capture more information than is strictly needed to generate formal specifications, and that the relationship model reduces the computational load and produces only formal specifications that are interesting for the engineer.
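The idea of generating specifications from ground-truth example data can be sketched with a toy synthesizer that filters candidate invariants against the examples. The candidate pool and the examples below are invented for illustration; they are not the dissertation's relationship model or marking semantics.

```python
# Toy specification generation: retain exactly those candidate invariants
# that hold on every ground-truth example. Candidates and examples are
# illustrative assumptions, not the method or data of this work.

CANDIDATES = {
    "output >= input":     lambda i, o: o >= i,
    "output == input + 1": lambda i, o: o == i + 1,
    "output is even":      lambda i, o: o % 2 == 0,
}

def generate_spec(examples):
    """Return the names of candidate invariants consistent with all examples."""
    return [name for name, holds in CANDIDATES.items()
            if all(holds(i, o) for i, o in examples)]

examples = [(1, 2), (4, 5), (7, 8)]   # ground truth: output = input + 1
print(generate_spec(examples))
# ['output >= input', 'output == input + 1']
```

Note the parallel with the evaluation finding quoted above: the surviving candidates can still say more than the engineer needs (here, the weaker `output >= input` is implied by the stronger invariant), so a further step is required to keep only the interesting specifications.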
60

Formal methods adoption in the commercial world

Nemathaga, Aifheli 10 1900 (has links)
There have been numerous studies on formal methods, but little utilisation of formal methods in the commercial world. This can be attributed to many factors, such as the fact that few specialists know how to use formal methods; moreover, the use of mathematical notation leads to the perception that formal methods are difficult. Formal methods can be described as system design methods by which complex computer systems are built using mathematical notation and logic. Formal methods have been used in the software development world since 1940, that is to say, from the earliest stage of computer development. To date, there has been slow adoption of formal methods, which are mostly used for mission-critical projects in, for example, the military and the aviation industry. Researchers worldwide are conducting studies on formal methods, but the research mostly deals with path planning and control rather than the runtime verification of autonomous systems. The main focus of this dissertation is the question of how to increase the pace at which formal methods are adopted in the business or commercial world. As part of this dissertation, a framework was developed to facilitate the use of formal methods in the commercial world; it focuses mainly on education, support tools, buy-in and remuneration, and was validated using a case study to illustrate its practicality. This dissertation also examines different types of formal methods and how they are used, as well as the link between formal methods and other software development techniques. An ERP system specification is presented in both natural language (informal) and formal notation, demonstrating how a formal specification can be derived from an informal one using the enhanced established strategy for constructing a Z specification as a guideline. Success stories of companies that apply formal methods in the commercial world are also presented. / School of Computing
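The informal-to-formal derivation described above ends in Z schemas. The classic textbook birthday-book schema (Spivey's standard example, not the dissertation's ERP specification) gives the flavour of the notation, here in zed-package LaTeX markup:

```latex
% Classic birthday-book state schema (Spivey), in zed-package markup.
% Shown only to illustrate Z notation; the ERP schemas in this work differ.
\begin{schema}{BirthdayBook}
  known : \power NAME \\
  birthday : NAME \pfun DATE
\where
  known = \dom birthday
\end{schema}
```

The schema declares a set of known names and a partial function from names to dates, with the state invariant that the known names are exactly the domain of that function; it is this mix of set-theoretic types and invariants that gives Z its precision, and its reputation for difficulty.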
