About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Axiomatizing Confident GCIs of Finite Interpretations

Borchmann, Daniel 10 September 2012
Constructing description logic ontologies is a difficult task that is normally conducted by experts. Recent results show that parts of ontologies can be constructed from description logic interpretations. However, these results assume the interpretations to be free of errors, which may not be the case for real-world data. To provide a mechanism for handling such errors, the notion of confidence from data mining is introduced into description logics, yielding confident general concept inclusions (confident GCIs) of finite interpretations. The main focus of this work is to prove the existence of finite bases of confident GCIs and to describe some of these bases explicitly.
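
The confidence of a GCI C ⊑ D in a finite interpretation I is, as in association rule mining, the fraction of the elements of C^I that also lie in D^I. A minimal sketch with extensions modeled as plain sets; the function name and the convention for an empty premise are ours, not necessarily Borchmann's:

```python
def confidence(c_ext: set, d_ext: set) -> float:
    """Confidence of the GCI C ⊑ D in a finite interpretation,
    given the extensions C^I and D^I as sets of domain elements.
    Convention (assumed here): confidence is 1 when C^I is empty."""
    if not c_ext:
        return 1.0
    return len(c_ext & d_ext) / len(c_ext)

# Example: 9 of the 10 elements satisfying C also satisfy D.
c_ext = set(range(10))
d_ext = set(range(9)) | {42}
print(confidence(c_ext, d_ext))  # 0.9
```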
2

On the Computation of Common Subsumers in Description Logics

Turhan, Anni-Yasmin 30 May 2008
Description logic (DL) knowledge bases are often built by users with expertise in the application domain but little expertise in logic. To support such users when building their knowledge bases, a number of extension methods have been proposed that provide the user with concept descriptions as a starting point for new concept definitions. The inference service central to several of these approaches is the computation of (least) common subsumers of concept descriptions. If disjunction of concepts can be expressed in the DL under consideration, the least common subsumer (lcs) is just the disjunction of the input concepts. Such a trivial lcs is of little use as a starting point for a new concept definition to be edited by the user. To address this problem we propose two approaches to obtaining "meaningful" common subsumers in the presence of disjunction, tailored to two different methods of extending DL knowledge bases. More precisely, we devise computation methods for the approximation-based approach and for the customization of DL knowledge bases, extend these methods to DLs with number restrictions, and discuss their efficient implementation.
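
To see why the lcs is informative in DLs without disjunction: in EL it corresponds to the product of the description trees of the input concepts. A minimal sketch under an invented tree encoding (a concept is a pair of a set of concept names and a role-to-subtrees map); this illustrates the general idea, not Turhan's actual algorithms:

```python
# Encoding (assumed here): concept = (set_of_concept_names, {role: [subtrees]}).
def lcs(c, d):
    names_c, roles_c = c
    names_d, roles_d = d
    names = names_c & names_d  # keep the concept names common to both
    roles = {}
    for r in roles_c.keys() & roles_d.keys():
        # Product construction: pair up every r-successor of c with every
        # r-successor of d and recurse.
        roles[r] = [lcs(s, t) for s in roles_c[r] for t in roles_d[r]]
    return names, roles

# lcs(Person ⊓ ∃child.Doctor, Person ⊓ ∃child.Lawyer) = Person ⊓ ∃child.⊤
c = ({"Person"}, {"child": [({"Doctor"}, {})]})
d = ({"Person"}, {"child": [({"Lawyer"}, {})]})
print(lcs(c, d))  # ({'Person'}, {'child': [(set(), {})]})
```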
3

Neuronale Netze als Modell Boolescher Funktionen

Kohut, Roman 05 August 2009
This thesis investigates how Boolean functions can be represented by neural networks and develops a new kind of Boolean neural network (BNN). The basic element of Boolean neural networks is a novel Boolean neuron (BN) which, in contrast to the classical neuron, operates directly on Boolean signals and uses exclusively Boolean operations to do so. For training the BNN, a sequential algorithm was developed that guarantees fast convergence and therefore requires only a short training time. This training algorithm forms the basis of a newly created method for the architecture synthesis of BNN. The developed training procedure moreover constitutes a special decomposition method for Boolean functions. Neural networks can be realized in software as well as in hardware. The very high cost of hardware realizations of conventional neural networks is reduced substantially by the use of BN and BNN: the number of CLBs (configurable logic blocks) required to realize a neuron drops by two orders of magnitude, since a Boolean neuron maps directly onto a single LUT (lookup table). The training algorithm of the BNN was adapted for this very compact mapping of BNN onto an FPGA structure. Specifying the BNN with UML models and applying MDA technology for hardware/software co-design significantly reduced the synthesis effort for hardware realizations of BNN.
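
A Boolean neuron behaves like an n-input LUT: its output is a stored truth-table entry rather than a thresholded weighted sum. A minimal sketch; the class and method names are ours, and the thesis's sequential training algorithm is not reproduced here:

```python
from itertools import product

class BooleanNeuron:
    """Toy Boolean neuron: an n-input lookup table, as on an FPGA LUT."""

    def __init__(self, n_inputs: int):
        # One truth-table entry per input combination.
        self.lut = {bits: False
                    for bits in product((False, True), repeat=n_inputs)}

    def __call__(self, *bits: bool) -> bool:
        return self.lut[bits]

    def memorize(self, samples):
        # Trivial stand-in for training: store the desired output
        # for each presented input combination.
        for bits, out in samples:
            self.lut[bits] = out

# Example: a 2-input neuron realizing XOR exactly.
n = BooleanNeuron(2)
n.memorize([((a, b), a != b) for a, b in product((False, True), repeat=2)])
print(n(True, False))  # True
```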
4

Automatic Construction of Implicative Theories for Mathematical Domains

Revenko, Artem 05 September 2016
Implication is a logical connective corresponding to the rule of causality "if ... then ...". Implications allow one to organize knowledge of some field of application in an intuitive and convenient manner. This thesis explores possibilities of automatically constructing all valid implications (the implicative theory) of a given field. As the main method for constructing implicative theories, a robust active-learning technique called Attribute Exploration was used. Attribute Exploration extracts knowledge from existing data and offers the possibility of refining this knowledge by providing counter-examples. Within the project, implicative theories were constructed automatically for two mathematical domains: algebraic identities and parametrically expressible functions. This goal was achieved thanks to both the pragmatic approach of Attribute Exploration and discoveries in the respective fields of application. The two diverse application fields favourably illustrate different possible usage patterns of Attribute Exploration for the automatic construction of implicative theories.
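
The core loop of Attribute Exploration asks the expert about each implication not refuted by the current examples, and either records it or extends the data with the returned counter-example. A highly simplified sketch, with the candidate enumeration (normally done in next-closure order) and the expert abstracted away as parameters:

```python
def explore(context, candidates, ask_expert):
    """context: set of examples, each a frozenset of attributes.
    candidates: iterable of (premise, conclusion) attribute sets.
    ask_expert: returns True to accept, or a counter-example's attributes."""
    accepted = []
    for premise, conclusion in candidates:
        if any(premise <= obj and not conclusion <= obj for obj in context):
            continue  # already refuted by the data, no need to ask
        answer = ask_expert(premise, conclusion)
        if answer is True:
            accepted.append((premise, conclusion))
        else:  # the expert supplies a counter-example
            context.add(frozenset(answer))
    return accepted, context

# Example: the single candidate {a} -> {b} is refuted by the object {a}.
ctx = {frozenset({"a"}), frozenset({"a", "b"})}
cands = [(frozenset({"a"}), frozenset({"b"}))]
print(explore(ctx, cands, lambda p, c: True))  # ([], ...)
```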
5

Conceptual Factors and Fuzzy Data

Glodeanu, Cynthia Vera 29 May 2013
With the growing number of large data sets, the necessity of complexity reduction applies today more than ever before. Moreover, some data may also be vague or uncertain. Thus, whenever we have an instrument for data analysis, the questions of how to apply complexity reduction methods and how to treat fuzzy data arise rather naturally. In this thesis, we discuss these issues for the very successful data analysis tool Formal Concept Analysis. In fact, we propose different methods for complexity reduction based on qualitative analyses, and we elaborate on various methods for handling fuzzy data. These two topics split the thesis into two parts. Data reduction is mainly dealt with in the first part of the thesis, whereas we focus on fuzzy data in the second part. Although each chapter may be read almost on its own, each one builds on and uses results from its predecessors. The main crosslink between the chapters is given by the reduction methods and fuzzy data. In particular, we will also discuss complexity reduction methods for fuzzy data, combining the two issues that motivate this thesis.
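
In the fuzzy setting, the FCA derivation operators are typically defined via the residuum of the underlying structure of truth degrees. A minimal sketch of the up-derivation over the Gödel structure on [0, 1], with an invented toy incidence table:

```python
def godel_impl(a: float, b: float) -> float:
    """Residuum of the Gödel structure: a -> b = 1 if a <= b, else b."""
    return 1.0 if a <= b else b

def up(objects_fuzzy, incidence, attributes):
    """A^↑(m) = inf over objects g of ( A(g) -> I(g, m) )."""
    return {m: min(godel_impl(a, incidence[(g, m)])
                   for g, a in objects_fuzzy.items())
            for m in attributes}

# Toy fuzzy context with two objects and two attributes.
incidence = {("g1", "m1"): 1.0, ("g1", "m2"): 0.5,
             ("g2", "m1"): 0.8, ("g2", "m2"): 1.0}
print(up({"g1": 1.0, "g2": 0.6}, incidence, ["m1", "m2"]))
# {'m1': 1.0, 'm2': 0.5}
```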
6

Tupel von TVL als Datenstruktur für Boolesche Funktionen

Kempe, Galina 17 December 2009
This thesis presents a data structure for representing Boolean functions, the "TVL tuple", which results from combining the known data structures decision diagram and ternary vector list (TVL). First, it is investigated how well local phase lists are suited as elements of the tuple. Furthermore, a new kind of decomposition ("tuple decomposition") of a Boolean function into three or four subfunctions is presented. The distinguishing property of the decomposition's subfunctions is their mutual orthogonality. For functions with a large number of conjunctions, the decomposition's advantage is a lower memory requirement. In addition, algorithms were developed for the operations needed to handle the decomposed functions. A detailed comparison of the computation times of these operations shows that a reduction in the time required can be expected as a consequence of the decomposition. The decomposition also offers a starting point for designing algorithms that permit parallel processing on the basis of distributed computing. The findings on the tuple decomposition, including the use of distributed processing, can be applied, for example, to the search for the variable sets of the OR-bi-decomposition.
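
A ternary vector list represents a Boolean function as a set of conjunctions, one per vector, with '-' marking don't-care positions; two vectors are orthogonal when they share no minterm. A minimal sketch of this orthogonality test (the string encoding is illustrative, not the thesis's data structure):

```python
def orthogonal(u: str, v: str) -> bool:
    """Two ternary vectors are orthogonal iff they cover no common
    minterm, i.e. they carry opposite fixed values in some position."""
    return any({a, b} == {"0", "1"} for a, b in zip(u, v))

def tvl_orthogonal(tvl) -> bool:
    """Check pairwise orthogonality of a whole ternary vector list."""
    return all(orthogonal(u, v)
               for i, u in enumerate(tvl) for v in tvl[i + 1:])

# f(a, b, c) = a·b ∨ ¬a·c as an orthogonal TVL over (a, b, c):
print(tvl_orthogonal(["11-", "0-1"]))  # True
print(tvl_orthogonal(["11-", "1-1"]))  # False (both cover 111)
```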
7

Contributions to the 11th International Conference on Formal Concept Analysis

28 May 2013
Formal concept analysis (FCA) is a mathematical formalism based on order and lattice theory for data analysis. It has found applications in a broad range of neighboring fields including the Semantic Web, data mining, knowledge representation, data visualization and software engineering. ICFCA is a series of annual international conferences that started in 2003 in Darmstadt and has been held on several continents: Europe, Australia, America and Africa. ICFCA has evolved into the main forum for researchers working on theoretical or applied aspects of formal concept analysis worldwide. In 2013 the conference returned to Dresden, where it was previously held in 2006. This year the selection of contributions was especially competitive. This volume is one of two volumes containing the papers presented at ICFCA 2013. The other volume is published by Springer Verlag as LNAI 7880 in its LNCS series. In addition to the regular contributions, we have included an extended abstract: Jean-Paul Doignon reviews recent results connecting formal concept analysis and knowledge space theory in his contribution "Identifiability in Knowledge Space Theory: a Survey of Recent Results". The high quality of the conference program was ensured by the much-appreciated work of the authors, the Program Committee members, and the Editorial Board members. Finally, we wish to thank the local organization team, who provided support to make ICFCA 2013 proceed smoothly in a pleasant atmosphere.
8

Deep Inference and Symmetry in Classical Proofs

Brünnler, Kai 25 August 2003
This thesis presents deductive systems for classical propositional and predicate logic which use deep inference, i.e. inference rules apply arbitrarily deep inside formulas, and a certain symmetry, which provides an involution on derivations. Like sequent systems, they have a cut rule which is admissible. Unlike sequent systems, they enjoy various new interesting properties. Not only the identity axiom, but also cut, weakening and even contraction are reducible to atomic form. This leads to inference rules that are local, meaning that the effort of applying them is bounded, and finitary, meaning that, given a conclusion, there is only a finite number of premises to choose from. The systems also enjoy new normal forms for derivations and, in the propositional case, a cut elimination procedure that is drastically simpler than the ones for sequent systems.
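
As an illustration of deep inference, here are the atomic contraction rule and its dual from Brünnler's system SKS, rendered approximately (the notation here is a sketch, not a verbatim reproduction): S{ } is an arbitrary formula context, so the rules apply arbitrarily deep inside a formula, and the up-rule is the mirror image of the down-rule under the involution:

```latex
% ac-down: atomic contraction; ac-up: its dual (cocontraction).
% S{ } is any formula context, a is an atom.
\[
  \mathsf{ac}\!\downarrow\;\frac{S\{a \vee a\}}{S\{a\}}
  \qquad\qquad
  \mathsf{ac}\!\uparrow\;\frac{S\{a\}}{S\{a \wedge a\}}
\]
```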
9

Relational Exploration / Combining Description Logics and Formal Concept Analysis for Knowledge Specification

Rudolph, Sebastian 28 February 2007
In the face of the growing amount of information in today's society, the task of specifying human knowledge in a way that can be processed unambiguously by computers becomes more and more important. Two acknowledged fields in this evolving scientific area of Knowledge Representation are Description Logics (DL) and Formal Concept Analysis (FCA). While DL concentrates on characterizing domains via logical statements and inferring knowledge from these characterizations, FCA builds conceptual hierarchies on the basis of present data. This work introduces Relational Exploration, a method for acquiring complete relational knowledge about a domain of interest by successively consulting a domain expert without ever asking redundant questions. This is achieved by combining DL and FCA: DL formalisms are used for defining FCA attributes, while FCA exploration techniques are deployed to obtain or refine DL knowledge specifications.
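
The bridge between the two formalisms is a formal context whose attributes are DL concept descriptions and whose objects are domain elements of an interpretation. A minimal sketch, with the membership test supplied from outside; the names and encoding are ours, not Rudolph's:

```python
def induced_context(domain, concept_descriptions, member):
    """Cross table: which domain element falls under which description.
    member(x, C) decides x ∈ C^I for the interpretation at hand."""
    return {x: {c for c in concept_descriptions if member(x, c)}
            for x in domain}

# Toy interpretation: membership is looked up in a hand-written table.
table = {("anna", "Person"), ("anna", "∃child.⊤"), ("ben", "Person")}
ctx = induced_context(["anna", "ben"], ["Person", "∃child.⊤"],
                      lambda x, c: (x, c) in table)
print(ctx)  # {'anna': {'Person', '∃child.⊤'}, 'ben': {'Person'}} (order may vary)
```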
10

Methodology for Conflict Detection and Resolution in Semantic Revision Control Systems

Hensel, Stephan, Graube, Markus, Urbas, Leon 11 November 2016
Revision control mechanisms are a crucial part of information systems for keeping track of changes. They are a key requirement for the industrial application of technologies like Linked Data, which makes it possible to integrate data from different systems and domains in a semantic information space. A corresponding semantic revision control system must offer the same functionality as established systems (e.g. Git or Subversion). There is also a need for branching to enable parallel work on the same data or concurrent access to it, which directly introduces the requirement of supporting merges. This paper presents an approach that makes it possible to merge branches and to detect inconsistencies before creating the merged revision. We use a structural analysis of triple differences as the smallest comparison unit between the branches. The detected differences can be accumulated into high-level changes, which is an essential step towards semantic merging. We implemented our approach as a prototypical extension of the revision control system R43ples as a proof of concept.
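
Modeling a revision as a set of triples makes the branch comparison a plain set difference, and a simple structural conflict rule falls out of it. A minimal sketch of this idea; the conflict rule and function names are simplified placeholders, not R43ples' actual algorithm:

```python
def diff(base: set, branch: set):
    """Triple-level difference: (added, removed) relative to the base."""
    return branch - base, base - branch

def merge(base, left, right):
    add_l, del_l = diff(base, left)
    add_r, del_r = diff(base, right)
    # Simplified rule: one branch deletes a triple the other adds.
    conflicts = (add_l & del_r) | (add_r & del_l)
    merged = ((base - del_l - del_r) | add_l | add_r) - conflicts
    return merged, conflicts

base = {("ex:valve1", "ex:state", "open")}
left = {("ex:valve1", "ex:state", "closed")}            # changed the state
right = {("ex:valve1", "ex:state", "open"),
         ("ex:valve1", "ex:label", "V-1")}              # added a label
merged, conflicts = merge(base, left, right)
print(merged)     # the closed state plus the new label
print(conflicts)  # empty for this pair of branches
```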
