  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
171

Input handling in agent-based micro-level simulators

Fayyaz, Muhammad Saleem January 2010 (has links)
In this thesis we present a new approach to handling missing values in multi-agent-based simulation (MABS) at the micro level, using truth tables and logical relations. Micro-level simulation is a broad field, but because it considers values at the level of individual agents, logical relations over truth tables are a natural fit for recovering missing values. We store truth tables as database tables and extract missing values from them. Building on our literature review, we define logical relations appropriate to each scenario and query the truth tables through them to find suitable missing values; accurate results are obtained as long as the database is kept up to date. Missing values can also be handled in other ways, such as artificial neural networks, k-nearest neighbours, statistical methods, and data mining, but as the literature shows, these methods do not always find appropriate replacements. The method we have created finds missing values and produces good results. We ran it on a specific scenario to evaluate the efficiency of the input handling, which motivated us to organize the database so that missing values can be handled properly.
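As a minimal sketch of the idea (the attribute names and table contents below are invented for illustration, not taken from the thesis), a stored truth table can be filtered against an individual's known values, and the matching rows supply the missing one:

```python
# Sketch of truth-table-based missing-value lookup for a simulated individual.
# Attribute names and table contents are illustrative, not from the thesis.

def fill_missing(record, truth_table):
    """Return the truth-table rows that agree with the record on all
    fields whose value is known (non-None)."""
    known = {k: v for k, v in record.items() if v is not None}
    return [
        row for row in truth_table
        if all(row.get(k) == v for k, v in known.items())
    ]

# A toy truth table relating an agent's state variables.
table = [
    {"employed": True,  "income": "high", "car_owner": True},
    {"employed": True,  "income": "low",  "car_owner": False},
    {"employed": False, "income": "low",  "car_owner": False},
]

# "car_owner" is missing for this simulated individual.
result = fill_missing({"employed": True, "income": "high", "car_owner": None}, table)
print(result)  # a single consistent row fills the gap
```

When several rows match, the scenario's logical relations would be used to narrow the candidates further.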
172

Caractérisation logique de données : application aux données biologiques / Logical Characterization of Data : application to Biological Data

Chambon, Arthur 13 December 2017 (has links)
Analysis of groups of binary data is now a challenge given the amount of data collected. It can be achieved by logic-based approaches, which identify subsets of relevant Boolean attributes that characterize the observations of a group and may help the user better understand the properties of that group.
This thesis presents an approach for characterizing groups of binary data by identifying a minimal subset of attributes that makes it possible to distinguish the data of different groups. We have precisely defined the multiple characterization problem and proposed new algorithms that can be used to solve its different variants. Our data characterization approach can be extended to the search for patterns in the framework of logical analysis of data. A pattern can be considered a partial explanation of the positive observations that can be used by practitioners, for instance for diagnostic purposes. Many patterns may exist, and several preference criteria can be added in order to focus on more restricted sets of patterns (prime patterns, strong patterns, ...). We propose a comparison between these two methodologies as well as algorithms for generating patterns. A further goal is to study the properties of the computed solutions with regard to the topological properties of the instances. Experiments are conducted on real biological datasets.
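The notion of a pattern in logical analysis of data can be sketched in a few lines (toy observations, not the thesis's data or algorithms): a conjunction of literals is a pattern if it covers at least one positive observation and no negative one.

```python
# Sketch: check whether a conjunction of (attribute, value) literals is a
# pattern, i.e. covers >= 1 positive observation and 0 negative ones.
# The binary observations below are a toy dataset.

def covers(literals, observation):
    return all(observation[a] == v for a, v in literals)

def is_pattern(literals, positives, negatives):
    return (any(covers(literals, o) for o in positives)
            and not any(covers(literals, o) for o in negatives))

positives = [{"a": 1, "b": 0, "c": 1}, {"a": 1, "b": 1, "c": 0}]
negatives = [{"a": 0, "b": 0, "c": 1}, {"a": 0, "b": 1, "c": 0}]

print(is_pattern([("a", 1)], positives, negatives))  # True: a=1 separates the groups
print(is_pattern([("c", 1)], positives, negatives))  # False: c=1 also covers a negative
```

Preference criteria such as primality (no literal can be dropped) would then be layered on top of this basic check.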
173

Dynamics of argumentation frameworks / Dynamique des systèmes d'argumentation

Mailly, Jean-Guy 30 September 2015 (has links)
This thesis tackles the problem of integrating a new piece of information into an abstract argumentation framework. Such a framework is a directed graph whose nodes represent arguments and whose edges represent attacks between arguments. There are different ways to decide which arguments are accepted by the agent who uses such a framework to represent her beliefs. An agent may be confronted with a piece of information such as "this argument should be accepted" that contradicts her current beliefs, represented by her argumentation framework. In this thesis, we study several approaches to incorporating such a piece of information into an argumentation framework.
Our first contribution is an adaptation of the AGM framework for belief revision, originally developed to characterize the incorporation of new information when the agent's beliefs are represented in a logical setting. We adapt the rationality postulates of the AGM framework to characterize revision operators suited to argumentation frameworks, and we identify several ways to generate the argumentation frameworks resulting from a revision.
We then show how to use AGM revision as a tool for revising argumentation frameworks. This approach uses a logical encoding of the argumentation framework to take advantage of classical revision operators in deriving the expected result.
Finally, we study the problem of enforcing a set of arguments (how to change an argumentation framework so that a given set of arguments becomes an extension). We develop a new family of operators which, contrary to existing approaches, guarantee the success of the enforcement process, and we show that translating our approaches into Boolean satisfaction and optimization problems makes it possible to develop efficient tools for computing the result of the enforcement.
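For readers unfamiliar with abstract argumentation, the basic acceptability notions can be sketched on a toy framework (this is standard Dung-style semantics, not one of the thesis's revision or enforcement operators):

```python
# Sketch: a tiny abstract argumentation framework as (arguments, attacks);
# check whether a set of arguments is conflict-free and admissible.
# The three-argument framework below is a toy example.

def conflict_free(S, attacks):
    return not any((a, b) in attacks for a in S for b in S)

def defends(S, x, arguments, attacks):
    # every attacker of x is counter-attacked by some member of S
    return all(any((s, y) in attacks for s in S)
               for y in arguments if (y, x) in attacks)

def admissible(S, arguments, attacks):
    return conflict_free(S, attacks) and all(
        defends(S, x, arguments, attacks) for x in S)

args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "c")}   # a attacks b, b attacks c

print(admissible({"a", "c"}, args, attacks))  # True: a defends c against b
print(admissible({"b"}, args, attacks))       # False: b is undefended against a
```

Enforcement, in these terms, means editing `attacks` (or `args`) so that a desired set passes such a test under the chosen semantics.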
174

A Model for Managing Data Integrity

Mallur, Vikram January 2011 (has links)
Consistent, accurate and timely data are essential to the functioning of a modern organization. Managing the integrity of an organization's data assets in a systematic manner is a challenging task in the face of continuous update, transformation and processing to support business operations. Classic approaches to constraint-based integrity focus on logical consistency within a database and reject any transaction that violates consistency, but leave unresolved how to fix or manage violations. More ad hoc approaches focus on the accuracy of the data and attempt to clean data assets after the fact, using queries to flag records with potential violations and manual effort to repair them. Neither approach satisfactorily addresses the problem from an organizational point of view. In this thesis, we provide a conceptual model of constraint-based integrity management (CBIM) that flexibly combines both approaches in a systematic manner to provide improved integrity management. We perform a gap analysis that examines the criteria that are desirable for efficient management of data integrity. Our approach involves creating a Data Integrity Zone and an On Deck Zone in the database for separating the clean data from data that violates integrity constraints. We provide tool support for specifying constraints in a tabular form and generating triggers that flag violations of dependencies. We validate this by performing case studies on two systems used to manage healthcare data: PAL-IS and iMED-Learn. Our case studies show that using views to implement the zones does not cause any significant increase in the running time of a process.
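The two-zone idea can be sketched as a partition step (the constraint and field names are hypothetical; the thesis itself realizes the zones with database views and triggers rather than application code):

```python
# Sketch: partition incoming records into a "Data Integrity Zone" (clean)
# and an "On Deck Zone" (constraint violations), instead of rejecting them.
# Constraints and records below are invented for illustration.

def partition(records, constraints):
    clean, on_deck = [], []
    for rec in records:
        violations = [name for name, check in constraints if not check(rec)]
        (clean if not violations else on_deck).append((rec, violations))
    return clean, on_deck

constraints = [
    ("age_nonnegative", lambda r: r.get("age", 0) >= 0),
    ("name_present",    lambda r: bool(r.get("name"))),
]

records = [{"name": "Ada", "age": 36}, {"name": "", "age": -1}]
clean, on_deck = partition(records, constraints)
print(len(clean), len(on_deck))  # 1 1
```

Keeping the violation list alongside each on-deck record is what lets repair happen later without blocking the original transaction.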
175

The Logic of Hereditary Harrop Formulas as a Specification Logic for Hybrid

Battell, Chelsea January 2016 (has links)
Hybrid is a two-level logical framework that supports higher-order abstract syntax (HOAS), where a specification logic (SL) extends the class of object logics (OLs) we can reason about. We develop a new Hybrid SL and formalize its metatheory, proving weakening, contraction, exchange, and cut admissibility; results that greatly simplify reasoning about OLs in systems providing HOAS. The SL is a sequent calculus defined as an inductive type in Coq and we prove properties by structural induction over SL sequents. We also present a generalized SL and metatheory statement, allowing us to prove many cases of such theorems in a general way and understand how to identify and prove the difficult cases. We make a concrete and measurable improvement to Hybrid with the new SL formalization and provide a technique for abstracting such proofs, leading to a condensed presentation, greater understanding, and a generalization that may be instantiated to other logics.
176

Analytical logical effort formulation for local sizing / Formulação analítica baseada em logical effort para dimensionamento local

Alegretti, Caio Graco Prates January 2013 (has links)
The microelectronics industry has been relying more and more on cell-based design methodology to cope with the growing complexity of digital integrated circuit design, since cell-based integrated circuits are designed faster and more cheaply than full-custom circuits. Nevertheless, in spite of advances in Electronic Design Automation, cell-based digital integrated circuits show inferior performance compared with full-custom circuits. It is therefore desirable to bring the performance of cell-based circuits closer to that of full-custom circuits without significantly increasing design costs. With this goal in mind, this thesis presents contributions towards an automatic local optimization flow for cell-based digital circuits. By local optimization we mean circuit optimization within small context windows, in which optimizations are performed taking the global context into account. Local optimization may thus include detecting and isolating critical regions of the circuit and generating logic and transistor networks of different topologies, which are sized according to the existing design constraints. Since local optimizations act in a reduced context, several solutions may be obtained under the local constraints, out of which the fittest is chosen to replace the original subcircuit (critical region).
The specific contribution of this thesis is a subcircuit sizing method capable of obtaining minimum-active-area solutions while respecting the maximum input capacitance, the output load to be driven, and the imposed delay constraint. The method is based on the logical effort formulation, and its main novelty is to differentiate the area analytically to obtain minimum area, instead of differentiating the delay to obtain minimum delay as in the traditional logical effort formulation. Electrical simulations show that the proposed method is very precise for a first-order approach, with average errors of 1.48% in power dissipation, 2.28% in propagation delay, and 6.5% in transistor sizes.
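The classical logical effort model that the formulation builds on expresses each stage's delay as d = g·h + p (logical effort times electrical effort, plus parasitic delay). A minimal sketch with standard textbook gate parameters (illustrative values, not the thesis's area-minimization formulation):

```python
# Sketch of the classical logical effort delay model: d = g*h + p per stage.
# Gate parameters are standard textbook values (inverter g=1, p=1;
# 2-input NAND g=4/3, p=2), not figures from the thesis.

def stage_delay(g, h, p):
    """Normalized stage delay: logical effort g, electrical effort h
    (load capacitance / input capacitance), parasitic delay p."""
    return g * h + p

def path_delay(stages):
    """stages: list of (g, h, p) tuples along the path."""
    return sum(stage_delay(g, h, p) for g, h, p in stages)

# An inverter driving a load 4x its input capacitance, followed by a
# 2-input NAND with electrical effort 2.
path = [(1.0, 4.0, 1.0), (4.0 / 3.0, 2.0, 2.0)]
print(path_delay(path))  # 5 + 14/3, i.e. about 9.67 delay units
```

The traditional formulation differentiates this delay with respect to the stage efforts to find minimum delay; the thesis instead differentiates the active-area expression under a fixed delay constraint.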
177

A teleologia especulativa de Hegel : vida logica e vida do espirito / Hegel's speculative teleology : logical life and life of the mind

Silva, Marcia Zebina Araujo da 12 November 2006 (has links)
Advisor: Marcos Lutz Muller. Doctoral thesis, Universidade Estadual de Campinas, Instituto de Filosofia e Ciencias Humanas, 2006.
Abstract: This thesis comprises an analysis of the concept of teleology in Hegel's Science of Logic, intended as a clarification of the differentiation between external and internal teleology in its correlation with the spirit, that is, a clarification of the teleological path from the Subjective Logic all the way to the Absolute Idea. The concept of life, of internal purposiveness, is present in the Science of Logic as the logical idea of life, which is the immediate idea. The terminus of logical life is the emergence of the life of the mind, as the logical idea of the mind, which is presented in the Idea of Knowing, the intermediate moment of the idea.
Thus, the conducting thread of the analysis is the double teleological movement, external and internal, which culminates in the Absolute Idea, the thought of thought, whose structural matrix is the logical idea of life.
178

Formally certified satisfiability solving

Oe, Duck Ki 01 July 2012 (has links)
Satisfiability (SAT) and satisfiability modulo theories (SMT) solvers are high-performance automated propositional and first-order theorem provers, used as underlying tools in many formal verification and artificial intelligence systems. Theoretical and engineering advances in solver technology have improved the performance of modern solvers; however, the increased complexity of those solvers calls for formal verification of the tools themselves. This thesis discusses two methods to formally certify SAT/SMT solvers. The first method is to generate proofs from solvers and certify those proofs. Because new theories are constantly added to SMT solvers, a flexible framework for safely adding new inference rules is necessary. The proposal is to use a meta-language called LFSC, which is based on the Edinburgh Logical Framework. SAT/SMT logics have been encoded in LFSC, and the encoding can be easily and modularly extended to new logics. It is shown that an optimized LFSC checker can certify SMT proofs efficiently. The second method is to use a verified programming language to implement a SAT solver and verify the code statically. Guru is a pure functional programming language with support for dependent types and theorem proving; Guru also allows for efficient code generation by means of resource typing. A modern SAT solver, called versat, has been implemented and verified correct in Guru. The performance of versat is shown to be comparable with that of current proof-checking technology.
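As background, the core search procedure that such solvers implement can be sketched as a naive DPLL loop (illustrative only; versat itself is written in Guru and uses modern CDCL techniques such as watched literals and clause learning):

```python
# Naive DPLL SAT sketch over CNF given as a list of lists of int literals
# (positive int = variable, negative int = its negation). Illustrative
# only; real solvers are far more sophisticated.

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    # Simplify the formula under the current partial assignment.
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause already satisfied
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None  # clause falsified: backtrack
        simplified.append(rest)
    if not simplified:
        return assignment  # all clauses satisfied: a model
    # Branch on the first unassigned variable.
    var = abs(simplified[0][0])
    for value in (True, False):
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2) is satisfiable (x2 = True suffices).
print(dpll([[1, 2], [-1, 2]]))
# (x1) and (not x1) is unsatisfiable.
print(dpll([[1], [-1]]))  # None
```

A proof-producing solver would additionally emit, for the unsatisfiable case, a derivation that an independent checker (such as an LFSC checker) can verify.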
179

Využití nástrojů projektového managementu v praxi / The Use of Methods of the Project Management in Company

Kubala, Matej January 2017 (has links)
This master's thesis focuses on the use of project management methods in the implementation of the Microsoft SharePoint 2013 product and the creation of an intranet portal at Promokov a.s. At the beginning, the thesis maps and describes the current situation using a SLEPT analysis, a SWOT analysis, and an analytical workshop conducted directly at the customer's site. The last part is dedicated to the realization of the recommended change.
180

Využití nástrojů projektového managementu v praxi / The Use of Methods of the Project Management in Practice

Straka, Nikolas January 2021 (has links)
The diploma thesis deals with the use of project management methods in practice, specifically a project design for the deployment of a suitable device for a mobile solution at a selected company. The introductory part is devoted to the theoretical foundations, which are then used in the design part. The analytical part focuses on analyzing the project methodology of the selected company. The design part contains the actual design of the project phase.
