About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
21

Relational Exploration: Combining Description Logics and Formal Concept Analysis for Knowledge Specification

Rudolph, Sebastian 01 December 2006 (has links)
Facing the growing amount of information in today's society, the task of specifying human knowledge in a way that can be unambiguously processed by computers becomes more and more important. Two acknowledged fields in this evolving scientific area of Knowledge Representation are Description Logics (DL) and Formal Concept Analysis (FCA). While DL concentrates on characterizing domains via logical statements and inferring knowledge from these characterizations, FCA builds conceptual hierarchies on the basis of present data. This work introduces Relational Exploration, a method for acquiring complete relational knowledge about a domain of interest by successively consulting a domain expert without ever asking redundant questions. This is achieved by combining DL and FCA: DL formalisms are used for defining FCA attributes while FCA exploration techniques are deployed to obtain or refine DL knowledge specifications.
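As background for readers unfamiliar with the FCA side of this combination (this sketch is generic illustration, not code from the thesis, and the toy context below is made up), the two derivation operators that underlie exploration techniques can be written as:

```python
# Minimal sketch of FCA's two derivation operators. A formal context is a set
# of (object, attribute) incidence pairs; all example data is hypothetical.

def intent(objs, context, attributes):
    """Attributes shared by every object in objs."""
    return {a for a in attributes if all((g, a) in context for g in objs)}

def extent(attrs, context, objects):
    """Objects that have every attribute in attrs."""
    return {g for g in objects if all((g, a) in context for a in attrs)}

# Toy context: animals (objects) x features (attributes).
objects = {"duck", "dog", "trout"}
attributes = {"can_fly", "has_legs", "lives_in_water"}
context = {("duck", "can_fly"), ("duck", "has_legs"),
           ("duck", "lives_in_water"), ("dog", "has_legs"),
           ("trout", "lives_in_water")}

# A formal concept is a pair (A, B) with intent(A) == B and extent(B) == A.
A = extent({"has_legs"}, context, objects)   # -> {"duck", "dog"}
B = intent(A, context, attributes)           # -> {"has_legs"}
```

In Relational Exploration, the attributes of such a context are defined via DL formalisms rather than plain features, but the closure behaviour of the two operators is the same.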
22

A Connection Between Clone Theory and FCA Provided by Duality Theory

Kerkhoff, Sebastian 02 August 2012 (has links)
The aim of this paper is to show how Formal Concept Analysis can be used for the benefit of clone theory. More precisely, we show how a recently developed duality theory for clones can be used to dualize clones over bounded lattices into the framework of Formal Concept Analysis, where they can be investigated with techniques very different from those that universal algebraists are usually armed with. We also illustrate this approach with some small examples.
23

Conceptual Factors and Fuzzy Data

Glodeanu, Cynthia Vera 29 May 2013 (has links) (PDF)
With the growing number of large data sets, the necessity of complexity reduction applies today more than ever before. Moreover, some data may also be vague or uncertain. Thus, whenever we have an instrument for data analysis, the questions of how to apply complexity reduction methods and how to treat fuzzy data arise rather naturally. In this thesis, we discuss these issues for the very successful data analysis tool Formal Concept Analysis. In fact, we propose different methods for complexity reduction based on qualitative analyses, and we elaborate on various methods for handling fuzzy data. These two topics split the thesis into two parts. Data reduction is mainly dealt with in the first part of the thesis, whereas we focus on fuzzy data in the second part. Although each chapter may be read almost on its own, each one builds on and uses results from its predecessors. The main crosslink between the chapters is given by the reduction methods and fuzzy data. In particular, we will also discuss complexity reduction methods for fuzzy data, combining the two issues that motivate this thesis.
24

Identifiability in Knowledge Space Theory: a survey of recent results

Doignon, Jean-Paul 28 May 2013 (has links) (PDF)
Knowledge Space Theory (KST) links in several ways to Formal Concept Analysis (FCA). Recently, the probabilistic and statistical aspects of KST have been further developed by several authors. We review part of the recent results, and describe some of the open problems. The question of whether the outcomes can be useful in FCA remains to be investigated.
25

FCART: A New FCA-based System for Data Analysis and Knowledge Discovery

Neznanov, Alexey A., Ilvovsky, Dmitry A., Kuznetsov, Sergei O. 28 May 2013 (has links) (PDF)
We introduce a new software system called Formal Concept Analysis Research Toolbox (FCART). Our goal is to create a universal integrated environment for knowledge and data engineers. FCART is constructed upon an iterative data analysis methodology and provides a built-in set of research tools based on Formal Concept Analysis techniques for working with object-attribute data representations. The provided toolset allows for the fast integration of extensions on several levels: from internal scripts to plugins. FCART was successfully applied in several data mining and knowledge discovery tasks. Examples of applying the system in medicine and criminal investigations are considered.
26

Learning Terminological Knowledge with High Confidence from Erroneous Data

Borchmann, Daniel 09 September 2014 (has links)
Description logic knowledge bases are a popular approach to representing terminological and assertional knowledge in a form suitable for computers to work with. However, the practicality of description logics is impaired by the difficulties one has to overcome to construct such knowledge bases. Previous work has addressed this issue by providing methods to learn valid terminological knowledge from data, making use of ideas from formal concept analysis. A basic assumption there is that the data is free of errors, an assumption that in general cannot be made for practical applications. This thesis extends those results to handle errors in the data. To this end, knowledge that is "almost valid" in the data is retrieved, where "almost valid" is formalized using the notion of confidence from data mining. The thesis presents two algorithms that achieve this retrieval: the first simply extracts all almost valid knowledge from the data, while the second utilizes expert interaction to distinguish errors from rare but valid counterexamples.
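The notion of confidence the abstract refers to can be illustrated in miniature (a generic data-mining sketch, not the thesis's formal setting; all names and data are hypothetical). For an attribute implication A → B over object-attribute data, confidence is the fraction of objects satisfying A that also satisfy B:

```python
# Sketch of the confidence of an attribute implication A -> B, as used in
# data mining. Each row is the attribute set of one object; data is made up.

def confidence(premise, conclusion, rows):
    """Fraction of rows containing the premise that also contain the
    conclusion. Returns 1.0 by convention if no row contains the premise."""
    support = [r for r in rows if premise <= r]
    if not support:
        return 1.0
    return sum(1 for r in support if conclusion <= r) / len(support)

# One erroneous row breaks the otherwise valid implication
# {"bird"} -> {"can_fly"}, yet its confidence stays high.
rows = [{"bird", "can_fly"}] * 9 + [{"bird"}]
c = confidence({"bird"}, {"can_fly"}, rows)   # -> 0.9
```

An implication with confidence above a chosen threshold counts as "almost valid"; deciding whether the counterexample row is an error or a rare valid case is exactly what the thesis's second, expert-interaction algorithm addresses.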
27

Conceptual Factors and Fuzzy Data

Glodeanu, Cynthia Vera 20 December 2012 (has links)
With the growing number of large data sets, the necessity of complexity reduction applies today more than ever before. Moreover, some data may also be vague or uncertain. Thus, whenever we have an instrument for data analysis, the questions of how to apply complexity reduction methods and how to treat fuzzy data arise rather naturally. In this thesis, we discuss these issues for the very successful data analysis tool Formal Concept Analysis. In fact, we propose different methods for complexity reduction based on qualitative analyses, and we elaborate on various methods for handling fuzzy data. These two topics split the thesis into two parts. Data reduction is mainly dealt with in the first part of the thesis, whereas we focus on fuzzy data in the second part. Although each chapter may be read almost on its own, each one builds on and uses results from its predecessors. The main crosslink between the chapters is given by the reduction methods and fuzzy data. In particular, we will also discuss complexity reduction methods for fuzzy data, combining the two issues that motivate this thesis.
28

The Detection of Outlying Fire Service’s Reports: FCA Driven Analytics

Krasuski, Adam, Wasilewski, Piotr 28 May 2013 (has links)
We present a methodology for improving the detection of outlying Fire Service reports, based on domain knowledge and dialogue with Fire & Rescue domain experts. An outlying report is one that differs significantly from the remaining data. Outliers are defined and searched for on the basis of domain knowledge and dialogue with experts. We face the problem of reducing high data dimensionality without losing the specificity and real complexity of the reported incidents, and we solve it by introducing a knowledge-based generalization level that mediates between the analysed data and the experts' domain knowledge. Within the methodology, we use Formal Concept Analysis methods both for generating appropriate categories from the data and as tools supporting communication with the domain experts. We conducted two experiments, finding two types of outliers, in which outlier detection was supported by domain experts.
30

Completing Description Logic Knowledge Bases using Formal Concept Analysis

Baader, Franz, Ganter, Bernhard, Sattler, Ulrike, Sertkaya, Barış 16 June 2022 (has links)
We propose an approach for extending both the terminological and the assertional part of a Description Logic knowledge base by using information provided by the assertional part and by a domain expert. The use of techniques from Formal Concept Analysis ensures that, on the one hand, the interaction with the expert is kept to a minimum, and, on the other hand, we can show that the extended knowledge base is complete in a certain sense.
