About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

A structural study of lattices, d-lattices and some applications in data analysis / Une étude structurelle des treillis, d-treillis, et quelques applications en analyse de données

Kahn, Giacomo 12 December 2018 (has links)
We are interested in formal concept analysis, a theoretical framework for data analysis. This formalism makes it possible to express central notions of data mining such as implications or closed itemsets, and it is centered around the notion of a lattice, which describes the structure of these objects and the relations between them. For multidimensional data, a generalisation of formal concept analysis exists: polyadic concept analysis. In this thesis, we study some combinatorial and algorithmic problems that arise in polyadic concept analysis. We also move towards a more applied data analysis setting by proposing approaches for conceptual navigation and classification.
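The core construction behind these notions — deriving closed sets from a binary object-attribute context — can be illustrated with a short sketch (an illustration only, not code from the thesis; the context and names are hypothetical):

```python
# Minimal formal concept analysis sketch (illustrative, hypothetical data).
# A formal context is a binary relation between objects and attributes;
# a formal concept is a pair (extent, intent) closed under derivation.
from itertools import combinations

context = {  # object -> set of attributes (toy data)
    "duck":  {"swims", "flies", "feathers"},
    "swan":  {"swims", "flies", "feathers"},
    "eagle": {"flies", "feathers"},
    "carp":  {"swims"},
}
objects = set(context)
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by all objects in objs (derivation operator)."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects having all attributes in attrs (derivation operator)."""
    return {o for o in objects if attrs <= context[o]}

# Enumerate all formal concepts by closing every attribute subset.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        e = extent(set(attrs))
        concepts.add((frozenset(e), frozenset(intent(e))))

for e, i in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(e), sorted(i))
```

Ordering these pairs by inclusion of their extents yields the concept lattice that the abstract refers to.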
12

Linkability of communication contents : Keeping track of disclosed data using Formal Concept Analysis

Berthold, Stefan January 2006 (has links)
A person who is communicated about (the data subject) has to keep track of all of his revealed data in order to protect his right of informational self-determination. This is important when data is going to be processed in an automatic manner and, in particular, in the case of automatic inquiries. A data subject should therefore be enabled to recognize useful decisions with respect to data disclosure, using only data which is available to him. For the scope of this thesis, we assume that a data subject is able to protect his communication contents and the corresponding communication context against a third party by using end-to-end encryption and Mix cascades. The objective is to develop a model for analyzing the linkability of communication contents using Formal Concept Analysis. In contrast to previous work, only the knowledge of the data subject is used for this analysis, instead of a global view on the entire communication contents and context. As a first step, the relation between disclosed data is explored. It is shown how data can be grouped by types and how data implications can be represented. As a second step, the behavior, i.e. the actions and reactions, of the data subject and his communication partners is included in the analysis in order to find critical data sets which can be used to identify the data subject. Typical examples are used to verify this analysis, followed by a conclusion about the pros and cons of this method for anonymity and linkability measurement. The results can later be used to develop a similarity measure for human-computer interfaces.
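In FCA terms, one natural reading of a "critical data set" is a set of disclosed attributes whose derivation yields exactly one object — enough to single out the data subject. A hedged sketch of that idea (my own illustration, not the thesis's model; all names and data are hypothetical):

```python
# Hedged sketch: flag minimal attribute sets that uniquely identify one
# person in a disclosed-data context (illustration only).
from itertools import combinations

disclosed = {  # person -> attributes revealed so far (hypothetical)
    "alice": {"zip:12345", "year:1980", "lang:de"},
    "bob":   {"zip:12345", "year:1975", "lang:de"},
    "carol": {"zip:54321", "year:1980", "lang:en"},
}
attributes = set().union(*disclosed.values())

def extent(attrs):
    """People whose disclosures contain all attributes in attrs."""
    return {p for p, a in disclosed.items() if attrs <= a}

# A set is critical if it pins down exactly one person and no proper
# subset already does (i.e. it is a minimal identifying set).
critical = []
for r in range(1, len(attributes) + 1):
    for attrs in map(set, combinations(sorted(attributes), r)):
        if len(extent(attrs)) == 1 and not any(c <= attrs for c in critical):
            critical.append(attrs)

for c in critical:
    print(sorted(c), "->", extent(c))
```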
13

The Detection of Outlying Fire Service’s Reports

Krasuski, Adam, Wasilewski, Piotr 28 May 2013 (has links) (PDF)
We present a methodology for improving the detection of outlying Fire Service reports, based on domain knowledge and dialogue with Fire & Rescue domain experts. An outlying report is understood as an element which differs significantly from the remaining data. Outliers are defined and searched for on the basis of domain knowledge and dialogue with experts. We face the problem of reducing the high dimensionality of the data without losing the specificity and real complexity of the reported incidents. We solve this problem by introducing a knowledge-based generalization level mediating between the analysed data and the experts' domain knowledge. In the methodology we use Formal Concept Analysis methods both for generating appropriate categories from the data and as tools supporting communication with the domain experts. We conducted two experiments on finding two types of outliers, in which outlier detection was supported by domain experts.
14

Axiomatizing Confident GCIs of Finite Interpretations

Borchmann, Daniel 10 September 2012 (has links) (PDF)
Constructing description logic ontologies is a difficult task that is normally conducted by experts. Recent results show that parts of ontologies can be constructed from description logic interpretations. However, these results assume the interpretations to be free of errors, which may not be the case for real-world data. To provide a mechanism to handle such errors, the notion of confidence from data mining is introduced into description logics, yielding confident general concept inclusions (confident GCIs) of finite interpretations. The main focus of this work is to prove the existence of finite bases of confident GCIs and to describe some of these bases explicitly.
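The underlying notion of confidence, carried over from association rule mining, can be stated as a formula (my paraphrase of the standard definition, assuming a finite interpretation I in which the extension of C is non-empty):

```latex
\operatorname{conf}_{\mathcal{I}}(C \sqsubseteq D)
  = \frac{\lvert (C \sqcap D)^{\mathcal{I}} \rvert}{\lvert C^{\mathcal{I}} \rvert}
```

That is, the fraction of individuals belonging to C that also belong to D; a GCI counts as confident when this value reaches a chosen threshold c in (0, 1].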
15

Learning Terminological Knowledge with High Confidence from Erroneous Data

Borchmann, Daniel 17 September 2014 (has links) (PDF)
Description logic knowledge bases are a popular approach to representing terminological and assertional knowledge in a form suitable for computers to work with. Despite this, the practicality of description logics is impaired by the difficulties one has to overcome to construct such knowledge bases. Previous work has addressed this issue by providing methods to learn valid terminological knowledge from data, making use of ideas from formal concept analysis. A basic assumption there is that the data is free of errors, an assumption that can in general not be made for practical applications. This thesis presents extensions of these results that can handle errors in the data. For this, knowledge that is "almost valid" in the data is retrieved, where the notion of "almost valid" is formalized using the notion of confidence from data mining. This thesis presents two algorithms which achieve this retrieval. The first algorithm simply extracts all almost valid knowledge from the data, while the second algorithm uses expert interaction to distinguish errors from rare but valid counterexamples.
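In the simpler setting of formal contexts, confidence-based retrieval of "almost valid" implications can be sketched as follows (an illustrative reduction, not the thesis's algorithm; the data and threshold are hypothetical):

```python
# Hedged sketch: keep implications A -> B whose confidence in a formal
# context meets a threshold (illustration only).

context = {  # object -> attributes, with one erroneous row (hypothetical)
    "o1": {"a", "b"},
    "o2": {"a", "b"},
    "o3": {"a", "b"},
    "o4": {"a"},       # error: "b" is missing here
}

def extent(attrs):
    return {o for o, a in context.items() if attrs <= a}

def confidence(premise, conclusion):
    """Fraction of objects with the premise that also have the conclusion."""
    having_premise = extent(premise)
    if not having_premise:
        return 1.0  # vacuously valid
    return len(extent(premise | conclusion)) / len(having_premise)

threshold = 0.7
for premise, conclusion in [({"a"}, {"b"}), ({"b"}, {"a"})]:
    c = confidence(premise, conclusion)
    verdict = "keep" if c >= threshold else "drop"
    print(f"{sorted(premise)} -> {sorted(conclusion)}: conf={c:.2f} ({verdict})")
```

Here the implication a -> b survives with confidence 0.75 despite the erroneous row; distinguishing such errors from rare but valid counterexamples is what the second algorithm's expert interaction addresses.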
16

Real-time Distributed Computation of Formal Concepts and Analytics

De Alburquerque Melo, Cassio 19 July 2013 (has links) (PDF)
The advances in technology for the creation, storage and dissemination of data have dramatically increased the need for tools that effectively provide users with means of identifying and understanding relevant information. Despite the great computing opportunities that distributed frameworks such as Hadoop provide, they have only increased this need. Formal Concept Analysis (FCA) may play an important role in this context by bringing more intelligent means into the analysis process. FCA provides an intuitive understanding of generalization and specialization relationships among objects and their attributes in a structure known as a concept lattice. The present thesis addresses the problem of mining and visualising concepts over a data stream. The proposed approach comprises several distributed components that carry out the computation of concepts from basic transactions, filter and transform data, and store and provide analytic features to visually explore data. The novelty of our work consists of: (i) a distributed processing and analysis architecture for mining concepts in real time; (ii) the combination of FCA with visual analytics visualisation and exploration techniques, including association rules analytics; (iii) new algorithms for condensing and filtering conceptual data; and (iv) a system that implements all the proposed techniques, called Cubix, and its use cases in Biology, Complex System Design and Space Applications.
17

Expected Numbers of Proper Premises and Concept Intents

Distel, Felix, Borchmann, Daniel 17 October 2011 (has links) (PDF)
We compute the expected numbers of both formal concepts and proper premises in a formal context that is chosen uniformly at random among all formal contexts of given dimensions.
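For intuition, the quantity in question can also be approximated empirically. A small hedged Monte Carlo sketch estimating the expected number of formal concepts in a uniformly random context (illustration only; the dimensions and sample count are arbitrary choices, not from the paper):

```python
# Hedged sketch: Monte Carlo estimate of the expected number of formal
# concepts in a uniformly random n x m context (illustration only).
import random
from itertools import combinations

def count_concepts(rows, n_attrs):
    """Count formal concepts by closing every attribute subset."""
    attrs = list(range(n_attrs))
    intents = set()
    for r in range(n_attrs + 1):
        for b in combinations(attrs, r):
            ext = [row for row in rows if set(b) <= row]
            # Intent = attributes common to all objects in the extent;
            # the empty extent has the full attribute set as its intent.
            closure = frozenset(set.intersection(*ext)) if ext else frozenset(attrs)
            intents.add(closure)
    return len(intents)  # concepts correspond one-to-one to closed intents

def estimate(n_objs=5, n_attrs=5, samples=2000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(samples):
        # Each object-attribute incidence is an independent fair coin flip,
        # i.e. a context chosen uniformly among all n x m contexts.
        rows = [{a for a in range(n_attrs) if rng.random() < 0.5}
                for _ in range(n_objs)]
        total += count_concepts(rows, n_attrs)
    return total / samples

print(estimate())  # empirical mean number of concepts for random 5x5 contexts
```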
18

Query Expansion Research and Application in Search Engine Based on Concepts Lattice

Cui, Jun January 2009 (has links)
Formal concept analysis is increasingly applied to query expansion and data mining problems. In this paper I analyze and compare current concept lattice construction algorithms and choose the iPred and Border algorithms to adapt for query expansion. After adapting these two algorithms, I apply all four algorithms in a query expansion prototype system. The calculation times of the four algorithms are recorded and analyzed, and the adapted algorithms perform well. Moreover, I find that the efficiency of concept lattice construction is not consistent with the complexity analysis; instead, it depends heavily on the structure of the data set from which the concept lattice is built.
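The basic idea of lattice-based query expansion can be sketched independently of the iPred and Border construction algorithms (an illustrative reduction using the raw derivation operators; the document collection is hypothetical, not from the thesis):

```python
# Hedged sketch of concept-lattice-style query expansion (illustration
# only; not the iPred/Border algorithms from the thesis).

docs = {  # document -> index terms (hypothetical collection)
    "d1": {"lattice", "fca", "closure"},
    "d2": {"lattice", "fca", "galois"},
    "d3": {"lattice", "order"},
    "d4": {"search", "ranking"},
}

def extent(terms):
    """Documents containing all query terms."""
    return {d for d, t in docs.items() if terms <= t}

def intent(doc_set):
    """Terms shared by all documents in doc_set."""
    return set.intersection(*(docs[d] for d in doc_set)) if doc_set else set()

def expand(query):
    """Close the query: terms shared by every document matching it."""
    matching = extent(query)
    return intent(matching) - query  # candidate expansion terms

print(expand({"fca"}))  # suggests {"lattice"} for the toy data
```

The closure of the query corresponds to the intent of the concept generated by the query terms; neighbouring concepts in the lattice yield further, weaker expansion candidates.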
20

Etude et Extraction de règles graduelles floues : définition d'algorithmes efficaces. / Survey and Extraction of Fuzzy gradual rules : Definition of Efficient algorithms

Ayouni, Sarra 09 May 2012 (has links)
Knowledge discovery in databases is a process aiming at extracting a reduced set of valuable knowledge from a huge amount of data. Data mining, one step of this process, includes a number of tasks, such as clustering, classification, association rule mining, etc. The problem of mining association rules requires a step of frequent pattern extraction. We distinguish several categories of frequent patterns: classical patterns, fuzzy patterns, gradual patterns, sequential patterns, etc. These patterns differ in the type of data from which the extraction is done and in the type of correlation that they represent. This thesis is set in the context of extracting closed fuzzy and gradual patterns. Indeed, we define new closure systems of the Galois connection for fuzzy and for gradual patterns, respectively, and propose algorithms for extracting a reduced set of fuzzy patterns and of gradual patterns. We also propose two approaches for extracting fuzzy gradual patterns, based on the automatic generation of membership functions for the attributes. Based on closed fuzzy and closed gradual patterns, we define generic bases of all fuzzy and gradual association rules, and we propose a sound and complete inference system that derives all the rules from these bases.
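As an illustration of what a gradual pattern expresses, one common support definition counts object pairs that are ordered consistently by all attributes in the pattern (a hedged sketch of that standard notion; the thesis's exact definitions may differ, and the data is hypothetical):

```python
# Hedged sketch: support of a gradual pattern such as
# "the greater the age, the greater the salary" (illustration only;
# one standard pair-based definition, not necessarily the thesis's).
from itertools import combinations

rows = [  # hypothetical records
    {"age": 25, "salary": 30},
    {"age": 30, "salary": 38},
    {"age": 40, "salary": 35},
    {"age": 50, "salary": 60},
]

def gradual_support(rows, attrs):
    """Fraction of row pairs strictly increasing on every attribute."""
    pairs = list(combinations(rows, 2))
    def ordered(a, b):
        return all(a[k] < b[k] for k in attrs)
    concordant = sum(1 for a, b in pairs if ordered(a, b) or ordered(b, a))
    return concordant / len(pairs)

print(gradual_support(rows, ["age", "salary"]))  # 5/6 for the toy data
```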
