About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Visualizing Data With Formal Concept Analysis

Diner, Casri 01 September 2003
In this thesis, we stress the geometric view of data. Such a view is applicable in almost every branch of science where data are of great importance, as well as in industry, economics, medicine, and beyond. Since the hard-disk capacity available for storing data and the amount of data reachable through the internet grow day by day, there is a need to turn this information into knowledge; this is one of the motivations for studying formal concept analysis. We point out how this application relates to algebra and logic. The first chapter begins by emphasizing the relation between closure systems, Galois connections, lattice theory as a mathematical structure, and concept analysis. It then describes the basic step in the formalization: an elementary form of the representation of data is defined mathematically. The second chapter explains the logic of formal concept analysis. It also shows how implications between attributes, which can be regarded as special formulas on a set, can be represented by fewer implications, a so-called generating set of implications. These mathematical tools are then applied in the last chapter to describe complex 'concept' lattices by means of decomposition methods, illustrated with examples.
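To make the formalization concrete: a formal context is a set of objects, a set of attributes, and an incidence relation between them, and the two derivation operators form the Galois connection the abstract mentions. A minimal Python sketch with an invented toy context (all names illustrative, not from the thesis):

```python
# A minimal sketch of the basic FCA formalization described above:
# a formal context (objects, attributes, incidence relation) and the two
# derivation operators whose composition forms a Galois connection.
# The toy data below is illustrative, not taken from the thesis.

objects = {"dove", "hen", "owl"}
attributes = {"flies", "bird", "nocturnal"}
incidence = {
    ("dove", "flies"), ("dove", "bird"),
    ("hen", "bird"),
    ("owl", "flies"), ("owl", "bird"), ("owl", "nocturnal"),
}

def common_attributes(objs):
    """A' : attributes shared by every object in objs."""
    return {m for m in attributes if all((g, m) in incidence for g in objs)}

def common_objects(attrs):
    """B' : objects having every attribute in attrs."""
    return {g for g in objects if all((g, m) in incidence for m in attrs)}

# A formal concept is a pair (extent, intent) closed under both operators:
extent = common_objects({"flies"})       # {"dove", "owl"}
intent = common_attributes(extent)       # {"flies", "bird"}
assert common_objects(intent) == extent  # (extent, intent) is a concept
print(extent, intent)
```

The set of all such closed pairs, ordered by extent inclusion, is the concept lattice whose decomposition the last chapter studies.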
2

Linking information resources with automatic semantic extraction

Joseph, Daniel January 2016
Knowledge is a critical dimension in the problem-solving processes of human intelligence. Consequently, enabling intelligent systems to provide advanced services requires that their artificial intelligence routines have access to knowledge of relevant domains. Ontologies are often utilised as the formal conceptualisation of domains, in that they identify and model the concepts and relationships of the targeted domain. However, complexities inherent in ontology development and maintenance have limited their availability. Separate from the conceptualisation component, domain knowledge also encompasses the concept membership of object instances within the domain. The need to capture both the domain model and the current state of instances within the domain has motivated the adoption of Formal Concept Analysis in intelligent systems research. Formal Concept Analysis, which provides a simplified model of a domain, has the advantage that it not only defines concepts in terms of their attribute descriptions but also simultaneously ascribes object instances to their appropriate concepts. Nonetheless, a significant drawback of Formal Concept Analysis is that when applied to a large dataset, the lattice with which it models a domain is often composed of a copious number of concepts, many of which are arguably unnecessary or invalid. In this research a novel measure is introduced which assigns a relevance value to concepts in the lattice. This measure is termed the Collapse Index and is based on the minimum number of object instances that must be removed from a domain for a concept to be expunged from the lattice. The mathematics underpinning its origin and behaviour is detailed in the thesis, showing that, if the relevance of a concept is defined by the Collapse Index, a concept will eventually lose relevance if one of its immediate subconcepts increasingly acquires object instance support, and a concept has its highest relevance when its immediate subconcepts have equal or near-equal object instance support. In addition, an experimental evaluation is provided in which the Collapse Index demonstrated performance comparable to or better than the current prominent alternatives in consistency across samples, the ability to recall concepts in noisy lattices, and efficiency of calculation. It is also demonstrated that the Collapse Index affords concepts with low object instance support the opportunity to have a higher relevance than those of high support. The second contribution to knowledge is an approach to semantic extraction from a dataset in which the Collapse Index is used to select concepts for inclusion in a final concept hierarchy. The utility of the approach is demonstrated by reviewing its inclusion in the implementation of a recommender system. This recommender system serves as the final contribution, featuring a unique design in which lattices represent user profiles and concepts in these profiles are pruned using the Collapse Index. Results showed that the pruning of profile lattices enabled by the Collapse Index improved the success of movie recommendations when appropriate thresholds were set.
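The abstract does not reproduce the thesis's formal definition of the Collapse Index, but the behaviour it describes is consistent with a simple reading: a concept is expunged once enough objects are removed that its extent coincides with the extent of one immediate subconcept, so the cheapest collapse removes the objects outside the best-supported subconcept. A hypothetical sketch of that reading (the formula is an illustrative assumption, not the author's):

```python
# Hypothetical illustration of a Collapse-Index-style relevance measure.
# Assumption (not taken from the thesis): a concept collapses once its extent
# shrinks to that of one immediate subconcept, so the cheapest collapse
# removes the objects outside the best-supported subconcept.

def collapse_index(extent_size, subconcept_extent_sizes):
    """Minimum number of object removals that would expunge the concept."""
    if not subconcept_extent_sizes:
        return extent_size  # no subconcepts: remove every supporting object
    return extent_size - max(subconcept_extent_sizes)

# Behaviour matches the two claims in the abstract:
print(collapse_index(100, [50, 50]))  # balanced subconcepts -> high relevance (50)
print(collapse_index(100, [95, 5]))   # one dominant subconcept -> low relevance (5)
```

Under this reading, a subconcept that acquires ever more support drives the index toward zero, while equal support among subconcepts maximizes it.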
3

A Preparatory Study Towards a Body of Knowledge in the Field of Formal Methods for the Railway Domain

Kumar, Apurva 11 1900
Bodies or Books of Knowledge (BoKs) have only been transcribed in mature fields where practices and rules have been well established (settled) and are gathered for any prospective or current practitioner to refer to. As a precursor to creating a BoK, it is first important to ask whether the domain contains settled knowledge and how this knowledge can be isolated. One approach, described in this work, is to use Formal Concept Analysis (FCA) to structure the knowledge (or parts of it) and construct a pruned concept lattice that highlights patterns of use and filters out the common, established practices best suited to solving a problem within the domain. In the railway domain, formal methods have been applied for a number of years to solve various modelling and verification problems. Their common use and straightforward application (with some refinement) make them easy to identify and therefore a prime candidate when testing for settled knowledge within the railway domain. They also provide other assurances of settled knowledge along the way. / Thesis / Master of Applied Science (MASc)
4

Formal Concept Analysis for Search and Traversal in Multiple Databases with Effective Revision

Sinha, Aditya January 2009
No description available.
5

A combinatorial approach to scientific exploration of gene expression data: An integrative method using Formal Concept Analysis for the comparative analysis of microarray data

Potter, Dustin Paul 14 October 2005
Functional genetics is the study of the genes present in the genome of an organism, with the complex interplay of all genes and their environment as the primary focus of study. The motivation for such studies is the premise that gene expression patterns in a cell are characteristic of its current state. The availability of the entire genome for many organisms now allows scientists unparalleled opportunities to characterize, classify, and manipulate genes or gene networks involved in metabolism, cellular differentiation, development, and disease. System-wide studies of biological systems have been made possible by the advent of high-throughput, large-scale tools such as microarrays, which are capable of measuring the mRNA levels of all genes in a genome. Tools and methods for the integration, visualization, and modeling of the large-scale data obtained in typical systems biology experiments are indispensable. Our work focuses on a method that integrates gene expression values obtained from microarray experiments with biological functional information related to the genes measured in order to make global comparisons of multiple experiments. In our method, the integrated data are represented as a lattice and, using appropriate measures, a reference experiment can be compared to samples from a database of similar experiments, with a ranking of similarity returned. In this work, support for the validity of our method is demonstrated both theoretically and empirically: a mathematical description of the lattice structure with respect to the integrated information is developed, and the method is applied to data sets of both simulated and reported microarray experiments. A fast algorithm for constructing the lattice representation is also developed. / Ph. D.
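The integration step described here can be pictured as building, per experiment, a formal context whose objects are genes and whose attributes mix discretized expression levels with functional annotations. A hedged sketch with invented gene names, thresholds, and annotations; the Jaccard score at the end merely stands in for the thesis's lattice-based comparison measures:

```python
# Sketch of the integration step described above: each experiment becomes a
# formal context whose objects are genes and whose attributes combine
# discretized expression values with functional annotations.
# All names, values, and thresholds are made up.

annotations = {            # hypothetical functional information per gene
    "geneA": {"metabolism"},
    "geneB": {"metabolism", "stress"},
    "geneC": {"development"},
}

def build_context(expression, up=1.5, down=0.67):
    """Map each gene to expression-level attributes plus its annotations."""
    context = {}
    for gene, value in expression.items():
        attrs = set(annotations.get(gene, ()))
        if value >= up:
            attrs.add("up-regulated")
        elif value <= down:
            attrs.add("down-regulated")
        context[gene] = attrs
    return context

ref = build_context({"geneA": 2.0, "geneB": 0.5, "geneC": 1.0})
sample = build_context({"geneA": 1.8, "geneB": 0.4, "geneC": 2.1})

# A crude per-gene similarity between two experiments (mean Jaccard index),
# standing in for the lattice-based comparison measures of the thesis:
scores = [len(ref[g] & sample[g]) / len(ref[g] | sample[g]) for g in ref]
print(sum(scores) / len(scores))
```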
6

Contributions to the 11th International Conference on Formal Concept Analysis

28 May 2013
Formal concept analysis (FCA) is a mathematical formalism based on order and lattice theory for data analysis. It has found applications in a broad range of neighboring fields including the Semantic Web, data mining, knowledge representation, data visualization and software engineering. ICFCA is a series of annual international conferences that started in 2003 in Darmstadt and has been held on several continents: Europe, Australia, America and Africa. ICFCA has evolved to be the main forum for researchers working on theoretical or applied aspects of formal concept analysis worldwide. In 2013 the conference returned to Dresden, where it was previously held in 2006. This year the selection of contributions was especially competitive. This volume is one of two volumes containing the papers presented at ICFCA 2013. The other volume is published by Springer Verlag as LNAI 7880 in its LNCS series. In addition to the regular contributions, we have included an extended abstract: Jean-Paul Doignon reviews recent results connecting formal concept analysis and knowledge space theory in his contribution “Identifiability in Knowledge Space Theory: a Survey of Recent Results”. The high quality of the conference program was ensured by the much-appreciated work of the authors, the Program Committee members, and the Editorial Board members. Finally, we wish to thank the local organization team. They provided support to make ICFCA 2013 proceed smoothly in a pleasant atmosphere.
7

Contributions to the 11th International Conference on Formal Concept Analysis: Dresden, Germany, May 21–24, 2013

Cellier, Peggy, Distel, Felix, Ganter, Bernhard 28 May 2013
Formal concept analysis (FCA) is a mathematical formalism based on order and lattice theory for data analysis. It has found applications in a broad range of neighboring fields including the Semantic Web, data mining, knowledge representation, data visualization and software engineering. ICFCA is a series of annual international conferences that started in 2003 in Darmstadt and has been held on several continents: Europe, Australia, America and Africa. ICFCA has evolved to be the main forum for researchers working on theoretical or applied aspects of formal concept analysis worldwide. In 2013 the conference returned to Dresden, where it was previously held in 2006. This year the selection of contributions was especially competitive. This volume is one of two volumes containing the papers presented at ICFCA 2013. The other volume is published by Springer Verlag as LNAI 7880 in its LNCS series. In addition to the regular contributions, we have included an extended abstract: Jean-Paul Doignon reviews recent results connecting formal concept analysis and knowledge space theory in his contribution “Identifiability in Knowledge Space Theory: a Survey of Recent Results”. The high quality of the conference program was ensured by the much-appreciated work of the authors, the Program Committee members, and the Editorial Board members. Finally, we wish to thank the local organization team. They provided support to make ICFCA 2013 proceed smoothly in a pleasant atmosphere.

Contents:

EXTENDED ABSTRACT
Jean-Paul Doignon: Identifiability in Knowledge Space Theory: a survey of recent results (S. 1)

REGULAR CONTRIBUTIONS
Ľubomír Antoni, Stanislav Krajči, Ondrej Krídlo and Lenka Pisková: Heterogeneous environment on examples (S. 5)
Robert Jäschke and Sebastian Rudolph: Attribute Exploration on the Web (S. 19)
Adam Krasuski and Piotr Wasilewski: The Detection of Outlying Fire Service’s Reports. The FCA Driven Analytics (S. 35)
Xenia Naidenova and Vladimir Parkhomenko: An Approach to Incremental Learning Based on Good Classification Tests (S. 51)
Alexey A. Neznanov, Dmitry A. Ilvovsky and Sergei O. Kuznetsov: FCART: A New FCA-based System for Data Analysis and Knowledge Discovery (S. 65)
8

Dolování textu na úrovni diskursu / Mining texts at the discourse level

Van de Moosdijk, Sara Francisca January 2014
Linguistic discourse refers to the meaning of larger text segments and can be very useful for guiding text mining tasks such as document selection or summarization. The aim of this project is to apply discourse information to Knowledge Discovery in Databases. As far as we know, this is the first attempt at combining these two very different fields, so the goal is to create a basis for this type of knowledge extraction. We approach the problem by extracting discourse relations using unsupervised methods and then model the data using pattern structures in Formal Concept Analysis. Our method is applied to a corpus of medical articles compiled from PubMed. This medical data can be further enhanced with concepts from the UMLS MetaThesaurus, which are combined with the UMLS Semantic Network to serve as an ontology in the pattern structures. The results show that, despite a large amount of noise, the method is promising and could be applied to domains beyond medicine.
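Pattern structures generalize FCA by replacing binary attributes with structured descriptions ordered by a similarity (meet) operation; with an ontology, the meet of two terms can be taken as their most specific common ancestor. A minimal sketch with an invented term hierarchy standing in for the UMLS Semantic Network:

```python
# Minimal sketch of the pattern-structure idea used above: object descriptions
# are ontology terms, and the similarity (meet) of two descriptions is their
# most specific common ancestor. The tiny hierarchy below is invented and
# merely stands in for the UMLS Semantic Network.

parent = {                       # child -> parent edges of a toy ontology
    "aspirin": "analgesic", "ibuprofen": "analgesic",
    "analgesic": "drug", "antibiotic": "drug", "drug": "entity",
}

def ancestors(term):
    """The term itself followed by its ancestors up to the root."""
    path = [term]
    while term in parent:
        term = parent[term]
        path.append(term)
    return path

def meet(t1, t2):
    """Most specific ancestor shared by both terms."""
    common = set(ancestors(t1))
    for t in ancestors(t2):
        if t in common:
            return t
    return None

print(meet("aspirin", "ibuprofen"))   # analgesic
print(meet("aspirin", "antibiotic"))  # drug
```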
9

A Formal Concept Analysis Approach to Association Rule Mining: The QuICL Algorithms

Smith, David T. 01 January 2009
Association rule mining (ARM) is the task of identifying meaningful implication rules exhibited in a data set. Most research has focused on extracting frequent item (FI) sets and has thus fallen short of the overall ARM objective. FI miners fail to identify the upper covers needed to generate a set of association rules of a size an end user can exploit. An alternative to FI mining can be found in formal concept analysis (FCA), a branch of applied mathematics. FCA derives a concept lattice whose concepts identify closed FI sets and whose connections identify the upper covers. However, most FCA algorithms construct a complete lattice and therefore include item sets that are not frequent. An iceberg lattice, on the other hand, is a concept lattice whose concepts contain only FI sets. Only three algorithms to construct an iceberg lattice were found in the literature. Given that an iceberg concept lattice provides an analysis tool to succinctly identify association rules, this study investigated additional algorithms to construct an iceberg concept lattice. This report presents the development and analysis of the Quick Iceberg Concept Lattice (QuICL) algorithms. These algorithms provide incremental construction of an iceberg lattice. QuICL uses recursion instead of iteration to navigate the lattice and establish connections, thereby eliminating costly processing incurred by past algorithms. The QuICL algorithms were evaluated against leading FI miners and FCA construction algorithms using benchmarks cited in the literature. Results demonstrate that QuICL provides performance on the order of FI miners yet additionally derives the upper covers. QuICL, when combined with known algorithms to extract a basis of association rules from a lattice, offers a "best known" ARM solution. Beyond this, the QuICL algorithms have proved to be very efficient, providing an order-of-magnitude gain over other incremental lattice construction algorithms. For example, on the Mushroom data set, QuICL completes in less than 3 seconds while past algorithms exceed 200 seconds; on T10I4D100k, QuICL completes in less than 120 seconds while past algorithms approach 10,000 seconds. QuICL is shown to be the "best known" all-around incremental lattice construction algorithm. Runtime complexity is shown to be O(l·d·i), where l is the cardinality of the lattice, d is the average degree of the lattice, and i is a mean function on the frequent item extents.
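For orientation: the concepts of an iceberg lattice correspond exactly to the closed frequent itemsets. The brute-force sketch below illustrates that notion on an invented toy dataset; QuICL and the FI miners discussed above are incremental and vastly more efficient, and none of their algorithms is reproduced here:

```python
# Naive sketch of the iceberg-lattice notion: its concepts are exactly the
# *closed frequent* itemsets. Brute-force closure computation, for
# illustration only; transactions are an invented toy dataset.

from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}]
min_support = 2
items = set().union(*transactions)

def closure(itemset):
    """Intersect all transactions containing the itemset."""
    covering = [t for t in transactions if itemset <= t]
    return set.intersection(*covering) if covering else items

closed_frequent = set()
for k in range(1, len(items) + 1):
    for combo in combinations(sorted(items), k):
        itemset = set(combo)
        support = sum(itemset <= t for t in transactions)
        if support >= min_support and closure(itemset) == itemset:
            closed_frequent.add(frozenset(itemset))

for s in closed_frequent:
    print(sorted(s), sum(s <= t for t in transactions))
```

At min_support = 2 this prints the four closed frequent itemsets {a}, {a, b}, {a, c}, and {a, b, c} with their supports; non-closed frequent sets such as {b} are absorbed by their closures.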
10

Génération de Transformations de Modèles : une approche basée sur les treillis de Galois / Model Transformation Generation : a Galois Lattices approach

Dolques, Xavier 18 November 2010
La transformation de modèles est une opération fondamentale dans l'ingénierie dirigée par les modèles. Elle peut être manuelle ou automatisée, mais dans ce dernier cas elle nécessite de la part du développeur qui la conçoit la maîtrise des méta-modèles impliqués dans la transformation. La génération de transformations de modèles à partir d'exemples permet la création d'une transformation de modèle en se basant sur des exemples de modèles sources et cibles. Le fait de travailler au niveau modèle permet d'utiliser les syntaxes concrètes définies pour les méta-modèles et ne nécessite plus une maîtrise parfaite de ces derniers. Nous proposons une méthode de génération de transformations de modèles à partir d'exemples basée sur l'Analyse Relationnelle de Concepts (ARC) permettant d'obtenir un ensemble de règles de transformations ordonnées sous forme de treillis. L'ARC est une méthode de classification qui se base sur des liens de correspondances entre les modèles pour faire émerger des règles. Ces liens étant un problème commun à toute les méthodes de génération de transformation de modèles à partir d'exemples, nous proposons une méthode basée sur des méthodes d'alignement d'ontologie permettant de les générer. / Model transformation is a fundamental operation in Model Driven Engineering. It can be performed manually or automatically, but in the latter case the developer must master all the meta-models involved. Model transformation generation from examples allows a model transformation to be created from examples of source and target models. Working at the model level allows the use of the concrete syntaxes defined for the meta-models, so the developer no longer needs to know them perfectly. We propose a method to generate model transformations from examples using Relational Concept Analysis (RCA), which provides a set of transformation rules ordered under the structure of a lattice. RCA is a classification method based on matching links between models to extract rules. Since such matching links are a requirement common to all methods that generate model transformations from examples, we also propose a method based on ontology-matching approaches to generate them.
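The core intuition can be sketched without the RCA machinery: matching links pair source-model elements with target-model elements, and source elements sharing the same properties are grouped into one candidate transformation rule. A toy sketch with invented model data (the thesis's lattice construction and ontology-based matching are not reproduced):

```python
# Toy sketch of the intuition behind example-based transformation generation:
# matching links pair source-model elements with target-model elements, and
# elements sharing the same source properties are grouped into one candidate
# rule. All model data below is invented; the RCA lattice machinery of the
# thesis is not reproduced here.

source_elements = {                 # element -> properties in the source model
    "Person":  {"class", "has_attributes"},
    "Company": {"class", "has_attributes"},
    "works":   {"association"},
}
matching_links = {                  # source element -> matched target element
    "Person": "PersonTable", "Company": "CompanyTable", "works": "FKColumn",
}
target_kinds = {"PersonTable": "table", "CompanyTable": "table", "FKColumn": "column"}

rules = {}                          # source property pattern -> target kinds
for src, tgt in matching_links.items():
    pattern = frozenset(source_elements[src])
    rules.setdefault(pattern, set()).add(target_kinds[tgt])

for pattern, kinds in rules.items():
    print(f"IF source matches {sorted(pattern)} THEN create {sorted(kinds)}")
```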
