51

Enhancing SAT-based Formal Verification Methods using Global Learning

Arora, Rajat 25 May 2004 (has links)
With the advances in VLSI and System-On-Chip (SOC) technology, the complexity of hardware systems has increased manifold. Today, 70% of the design cost is spent on verifying these intricate systems. The two most widely used formal methods for design verification are Equivalence Checking and Model Checking. Equivalence Checking requires that the implementation circuit be exactly equivalent to the specification circuit (golden model): for each possible input pattern, the implementation circuit must yield the same outputs as the specification circuit. Model Checking, on the other hand, checks whether the design satisfies certain properties, which in turn are indispensable for the proper functionality of the design. The complexity of both Equivalence Checking and Model Checking grows exponentially with circuit size. In this thesis, we first propose a novel technique to improve SAT-based Combinational Equivalence Checking (CEC) and Bounded Model Checking (BMC). The idea is to perform a low-cost preprocessing step that statically induces global signal relationships into the original CNF formula of the circuit under verification, thereby reducing the complexity of the SAT instance. This efficient and effective preprocessing quickly builds up the implication graph for the circuit under verification, yielding a large set of logic implications composed of direct, indirect and extended backward implications. These two-node implications (spanning time-frame boundaries) are converted into two-literal clauses and added to the original CNF database. The added clauses constrain the search space of the SAT-solver engine and provide correlation among the different variables, which enhances Boolean Constraint Propagation (BCP). Experimental results on large and difficult ISCAS'85, ISCAS'89 (full scan) and ITC'99 (full scan) CEC instances and ISCAS'89 BMC instances show that our approach is independent of the state-of-the-art SAT solver used, and that the added clauses help achieve more than an order of magnitude speedup over the conventional approach. A comparison with Hyper-Resolution [Bacchus 03] also suggests that our technique is much more powerful, yielding non-trivial clauses that significantly reduce the complexity of the SAT instance. Second, we propose a novel global learning technique that identifies highly non-trivial relationships among signals in the circuit netlist, thereby boosting the power of the existing implication engine. We call this new class of implications 'extended forward implications' and show their effectiveness through the additional untestable faults they help to identify. Third, we propose a suite of lemmas and theorems to formalize global learning. We show through implementation that these theorems help to significantly simplify a generic CNF formula (from Formal Verification, Artificial Intelligence, etc.) by identifying necessary assignments, equivalent signals, complementary signals and other non-trivial implication relationships among its variables. We further illustrate through experimental results that the CNF simplification obtained using our tool surpasses that obtained using other preprocessors. / Master of Science
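As a rough illustration of the preprocessing described above (not code from the thesis), the sketch below converts two-node implications learned from a circuit into two-literal clauses and appends them to a CNF clause list in DIMACS-style integer notation; the two-gate toy circuit, the variable numbering and the function names are assumptions made for this example.

```python
# Minimal sketch (not from the thesis): turning learned two-node implications
# into two-literal clauses and adding them to an existing CNF clause list.
# Literals follow DIMACS conventions: variable v is the integer v, its negation -v.

def implication_to_clause(a, b):
    """Encode the implication a -> b as the two-literal clause (-a OR b)."""
    return [-a, b]

def add_learned_clauses(cnf, implications):
    """Append one binary clause per learned implication, skipping clauses already present."""
    seen = {tuple(sorted(c)) for c in cnf}
    for a, b in implications:
        clause = implication_to_clause(a, b)
        key = tuple(sorted(clause))
        if key not in seen:
            cnf.append(clause)
            seen.add(key)
    return cnf

# Toy circuit (assumed for this example): g = a AND b, h = a OR b,
# with variables a=1, b=2, g=3, h=4, each gate in its usual Tseitin encoding.
cnf = [
    [-1, -2, 3], [1, -3], [2, -3],   # g = a AND b
    [-1, 4], [-2, 4], [1, 2, -4],    # h = a OR b
]

# An indirect implication a global-learning pass could discover: g -> h.
learned = [(3, 4)]

print(add_learned_clauses(cnf, learned))
# prints the original clauses followed by the new binary clause [-3, 4]
```

In practice the augmented clause list would simply be handed to any off-the-shelf SAT solver, which is consistent with the abstract's claim that the technique is independent of the solver used.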
52

Proof systems for propositional modal logic

Van der Vyver, Thelma 11 1900 (has links)
In classical propositional logic (CPL), logical reasoning is formalised as logical entailment and can be computed by means of tableau and resolution proof procedures. Unfortunately, CPL is not expressive enough, and using first-order logic (FOL) does not solve the problem either, since proof procedures for FOL are not decidable. Modal propositional logics (MPL), on the other hand, are both decidable and more expressive than CPL. It therefore seems reasonable to apply tableau and resolution proof systems to MPL in order to compute logical entailment in MPL. Although some of the principles of CPL carry over to MPL, there are complexities in MPL that are not present in CPL. Tableau and resolution proof systems that address these and other issues are surveyed here. In particular, the work of Abadi & Manna (1986), Chan (1987), del Cerro & Herzig (1988), Fitting (1983, 1990) and Gore (1995) is reviewed. / Computing / M. Sc. (Computer Science)
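For orientation (these rules are standard textbook material, not taken from the dissertation), the prefixed tableau rules below show the kind of modal machinery such proof systems add on top of the classical rules; the prefix notation σ, σ.n for naming accessible worlds is an assumption of this illustration.

```latex
% Standard prefixed tableau rules for the basic modal logic K (illustrative only).
% A prefix \sigma names a possible world; \sigma.n names a world accessible from \sigma.
\[
\frac{\sigma : \Box A}{\sigma.n : A}\;(\nu)\quad
\text{for any prefix } \sigma.n \text{ already on the branch,}
\qquad
\frac{\sigma : \Diamond A}{\sigma.n : A}\;(\pi)\quad
\text{for a prefix } \sigma.n \text{ new to the branch.}
\]
```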
54

On describing

Schoubye, Anders Johan January 2011 (has links)
The overarching topic of this dissertation is the semantics and pragmatics of definite descriptions. It focuses on the question whether sentences such as ‘the king of France is bald’ literally assert the existence of a unique king (and therefore are false) or simply presuppose the existence of such a king (and thus fail to express propositions). One immediate obstacle to resolving this question is that immediate truth value judgments about such sentences (sentences with non-denoting descriptions) are particularly unstable; some elicit a clear intuition of falsity whereas others simply seem awkward or strange. Because of these variations, truth value judgments are generally considered unreliable. In the first chapter of the dissertation, an explanation of this phenomenon is developed. It is observed that when these types of sentences are considered in the context of a discourse, a systematic pattern in judgments emerges. This pattern, it is argued, should be explained in terms of certain pragmatic factors, e.g. whether a speaker’s utterance is interpreted as cooperative. A detailed and general explanation of the phenomenon is then presented which draws importantly on recent research in the semantics and pragmatics of questions and focus. It is shown that the behavior of these judgments can be systematically explained, that truth value judgments are not as unreliable as standardly assumed, and that the proposed explanation best supports the conclusion that definite descriptions presuppose rather than assert existence. In the second chapter, the following problem is investigated. If definite descriptions are assumed to literally assert existence, a sentence such as ‘Hans wants the ghost in his attic to be quiet’ is incorrectly predicted to be true only if Hans wants there to be a (unique) ghost in his attic. This prediction is often considered evidence against Russell’s quantificational analysis and evidence in favor of the referential analysis of Frege and Strawson. Against this claim, it is demonstrated that this problem is a general problem about the existence commitments of natural language determiners, i.e. not an argument in favor of a referential analysis. It is shown that in order to avoid these undesirable predictions, quite radical changes to the semantic framework are required. For example, it must be assumed that a sentence of the form ‘The F is G’ has the open sentence ‘x is G’ as its asserted content. A uniform quantificational and presuppositional analysis of definites and indefinites is outlined which by exploiting certain features of so-called dynamic semantics unproblematically assumes that the asserted contents indeed are open sentences. In view of the proposed quantificational/presuppositional analysis, the dissertation is concluded by a rejection of the argument put forward by Reimer (1998) and Devitt (2004) that definite descriptions are ambiguous between attributive (quantificational) and referential (indexical) uses. Reimer and Devitt’s argument is (in contrast to Donnellan, 1966) based primarily on the assumption that definite descriptions are conventionally used to communicate singular thoughts and that the conventional meaning of a definite description therefore must be fundamentally indexical/directly referential. I argue that this argument relies crucially on tacit assumptions about semantic processing for which no empirical evidence is provided. 
I also argue that the argument is too general; if sound, it would be an argument for an indexical treatment of most, if not all, other determiners. I then conclude by demonstrating that the view does not explain any new data and thus has no clear motivation. In short, this dissertation provides a detailed pragmatic explanation of a long-standing puzzle about truth value judgments and then outlines a novel dynamic semantic analysis of definites and indefinites. This analysis solves a significant problem about existence commitments, a problem that neither Russell's analysis nor the Frege/Strawson analysis is equipped to handle. The analysis is then defended against the claim that definite descriptions are ambiguous.
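For readers unfamiliar with the background, the formulas below give a standard textbook rendering (not taken from the dissertation) of the two analyses at issue: Russell's quantificational analysis, on which 'The F is G' asserts existence and uniqueness, and a Frege/Strawson-style reading, on which they are presupposed.

```latex
% Russell's quantificational analysis: existence and uniqueness are asserted.
\[
\text{The } F \text{ is } G \;\Longleftrightarrow\;
\exists x\,\bigl(F(x) \wedge \forall y\,(F(y) \rightarrow y = x) \wedge G(x)\bigr)
\]
% Frege/Strawson-style reading: existence and uniqueness are presupposed;
% if no unique F exists, the sentence fails to express a true or false proposition.
\[
\text{The } F \text{ is } G :\quad
\underbrace{\exists! x\, F(x)}_{\text{presupposed}}, \qquad
\underbrace{G(\iota x\, F(x))}_{\text{asserted}}
\]
```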
55

論唯物消除論 / A Critique of Eliminative Materialism

邱盛揚, Chiu, Sen Yang Unknown Date (has links)
The main purpose of this thesis is to analyze the shortcomings and inadequacies of Paul Churchland's version of eliminative materialism by critically examining the premises and implications of its supporting arguments. To this end, the thesis is organized into two parts. The first part, consisting of Chapters 1 and 2, discusses the eliminativist position Churchland (1981) advanced and the structure of its argument; besides surveying the criticisms and rebuttals that many philosophers have raised against that argument, I also present my own account of three main difficulties facing the earlier eliminativism. The second part, consisting of Chapters 3 and 4, turns to Churchland's (2007) more recent formulation of eliminative materialism. In addition to analyzing the internal consistency of his eliminativist position, I examine the new argument he offers in its support, which I call the 'animal cognition argument'. I argue that, because this argument rests on an important hidden presupposition, the new line of argument does not necessarily yield his original eliminativist conclusion.
56

Analyticity, Necessity and Belief : Aspects of two-dimensional semantics

Johannesson, Eric January 2017 (has links)
A glass couldn't contain water unless it contained H2O-molecules. Likewise, a man couldn't be a bachelor unless he was unmarried. Now, the latter is what we would call a conceptual or analytical truth. It's also what we would call a priori. But it's hardly a conceptual or analytical truth that if a glass contains water, then it contains H2O-molecules. Neither is it a priori. The fact that water is composed of H2O-molecules was an empirical discovery made in the eighteenth century. The fact that all bachelors are unmarried was not. But neither is a logical truth, so how do we explain the difference? Two-dimensional semantics is a framework that promises to shed light on these issues. The main purpose of this thesis is to understand and evaluate this framework in relation to various alternatives, to see whether some version of it can be defended. I argue that it fares better than the alternatives. However, much criticism of two-dimensionalism has focused on its alleged inability to provide a proper semantics for certain epistemic operators, in particular the belief operator and the a priori operator. In response to this criticism, a two-dimensional semantics for belief ascriptions is developed using structured propositions. In connection with this, a number of other issues in the semantics of belief ascriptions are addressed, concerning indexicals, beliefs de se, beliefs de re, and the problem of logical omniscience.
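As a rough sketch of the framework under discussion (standard two-dimensionalist notation, not taken from the thesis), a sentence S is evaluated relative to a pair of possible worlds, one considered as actual and one as counterfactual; the symbols V_S, W and @ below are assumptions of this illustration.

```latex
% Two-dimensional evaluation of a sentence S (illustrative, standard notation).
% w ranges over worlds considered as actual, v over worlds of evaluation.
\[
V_S : W \times W \to \{0,1\}, \qquad
V_S(w, v) = 1 \ \text{iff}\ S \ \text{is true at } v \text{ when } w \text{ is considered as actual}.
\]
\[
\text{primary (diagonal) intension: } \lambda w.\, V_S(w, w), \qquad
\text{secondary intension at the actual world } @:\ \lambda v.\, V_S(@, v).
\]
% 'Water is H2O' has a necessary secondary intension but a contingent primary
% intension, one standard way of modelling the necessary a posteriori.
```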
57

Computational Aspects of Learning, Reasoning, and Deciding

Zanuttini, Bruno 27 June 2011 (has links) (PDF)
We present results and research projects on the computational aspects of classical problems in Artificial Intelligence. We are interested in the setting of agents that describe their environment through a possibly huge number of Boolean descriptors and act upon this environment. Typical applications of this kind of study are the design of autonomous robots (for exploring unknown zones, for instance) or of software assistants (for scheduling, for instance). The ultimate goal of research in this domain is the design of agents able to learn autonomously by interacting with their environment (including human users), to reason so as to produce new pieces of knowledge and explain observed phenomena, and finally to decide, in a rational fashion, which action to take at any moment. Ideally, such agents will be fast and efficient as soon as they start to interact with their environment, will improve their behavior over time, and will be able to communicate naturally with humans. Among the numerous research questions raised by these objectives, we are especially interested in concept and preference learning, reinforcement learning, planning, and some underlying problems in complexity theory. Particular attention is paid to interaction with humans and to the huge numbers of environment descriptors that real-world applications require.
58

Semântica proposicional categórica / Categorical propositional semantics

Ferreira, Rodrigo Costa 01 December 2010 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / The basic concepts of what later came to be called category theory were introduced in 1945 by Samuel Eilenberg and Saunders Mac Lane in the article General Theory of Natural Equivalences. In the 1940s the main applications were in algebraic topology and abstract algebra. During the 1950s and 1960s the theory became an important conceptual framework in many other areas of mathematical research, especially homological algebra and algebraic geometry, as the pioneering works of Daniel M. Kan (1958) and Alexander Grothendieck (1957) attest. Later, mathematical-logical questions about category theory emerged, in particular with the publication of Francis William Lawvere's Functorial Semantics of Algebraic Theories (1963). Since then, further work has been done in categorical logic, such as Makkai (1977), Borceux (1994) and Goldblatt (2006), among others. As an introduction to the application of category theory to logic, this dissertation presents an introductory study of categorical propositional logic. The first part familiarizes the reader with the concepts needed for the subject: (a) the basic components of category theory: axiomatics, constructions, applications, authors, etc.; (b) certain structures of abstract algebra: monoids, groups, Boolean algebras, etc.; (c) some concepts of mathematical logic: pre-orders, partial orders, equivalence relations, the Lindenbaum algebra, etc. The second part investigates the properties, structures and relations proper to categorical propositional logic. There, the logical connectives of negation, conjunction, disjunction and implication, as well as the Boolean connectives of complement, intersection and union, are reinterpreted in categorical terms. These reinterpretations finally allow a categorical Boolean propositional semantics to be defined by means of a categorical Boolean algebra.
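As a pointer to the kind of categorical reinterpretation the abstract describes (a standard presentation, not taken from the dissertation), in a posetal Boolean (or, more generally, Heyting) category the propositional connectives correspond to universal constructions:

```latex
% Standard categorical reading of the propositional connectives (illustrative).
% Objects interpret propositions; 1 is the terminal object, 0 the initial object.
\[
A \wedge B \;\cong\; A \times B \quad\text{(product)}, \qquad
A \vee B \;\cong\; A + B \quad\text{(coproduct)},
\]
\[
A \rightarrow B \;\cong\; B^{A} \quad\text{(exponential)}, \qquad
\neg A \;\cong\; 0^{A} = (A \rightarrow 0) \quad\text{(complement as implication into falsum)}.
\]
```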
59

CONFIABILISMO EM ALVIN GOLDMAN / RELIABILISM IN ALVIN GOLDMAN

Rodrigues, Emanuele Abreu 26 August 2009 (has links)
We assume that the central focus of epistemology is propositional knowledge (S knows that P). However, since some true beliefs are true by accident, the central question raised by epistemologists is: what turns a mere true belief into knowledge? There are several answers to this question, many of them conflicting with each other. Among them we find two perspectives that compete over the necessary and sufficient conditions for knowledge, namely the internalist and externalist perspectives. For epistemological externalism, external factors matter in the formation of belief. This research discusses some issues that connect externalism to an adequate way of thinking about truth and about what we do when we take a belief to be true. As a theoretical framework we use the externalist perspective of Alvin Goldman, who seeks a normative theory of justification on the assumption that a justified belief is one produced by a reliable process. Goldman, for example, holds that an account of justified belief is necessary for an account of knowledge and is closely related to it. He asserts that the term "justified" is an evaluative term, and that any correct definition or synonym of "justified" would also be an evaluative term. Goldman therefore seeks a normative theory of justification, and to that end he tries to specify the substantive conditions for justified belief. However, these conditions must be non-epistemic, that is, necessary and sufficient conditions that do not involve any epistemic notions. Goldman objects that it is usually assumed that someone has a justified belief because that person knows that the belief is justified and knows what the justification is. This amounts to saying that a justification is an argument or a set of reasons that can be given in favor of a belief; but that merely tells us that the nature of justified belief concerns what a person could say if asked to defend or justify the belief. By contrast, Goldman thinks that a belief is justified only in virtue of some process or property that makes it justified.
60

Sources and application of professional knowledge amongst teacher educators

Lefoka, Pulane Julia 10 October 2011 (has links)
In Lesotho there are no formal opportunities for the professional training of teacher educators. Consequently, the majority of teacher educators have not received training that could equip them with the professional knowledge base that is foundational to any profession. The question of what the sources and application of professional knowledge amongst teacher educators are therefore appeared justified. Arguably, teacher educators' professional knowledge is intricately linked to education practice. Teacher educators have to address the discrepancy between education policy and practice through the training of student teachers who, in turn, have to contribute to the quality of the Lesotho education system. An interpretivist approach was followed in undertaking this study. Data was collected through narratives, observations of teacher educators, and analysis of curriculum and assessment documents. The unit of analysis was eight teacher educators based at the National University of Lesotho's Faculty of Education. The extent to which the topic was researchable was verified through a pilot study with six teacher educators based in the Department of Educational Foundations in the same faculty. The analysis of the data revealed that immersion in the teacher educators' professional landscape provides them with ample opportunities to learn from an array of experiences. They accumulate experience-based professional knowledge relevant to their world of work as they learn to teach, and they construct, apply and model it in the context that is uniquely teacher education. They have learned to teach teachers mainly from existing education practices, which perpetuates what already exists. They face numerous challenges: their teaching is biased towards conventional teaching techniques of a transmissive nature and, to a lesser extent, interactive techniques; and the construction of professional knowledge remains a complex and challenging undertaking. Opportunities to construct their own research-based teaching knowledge and to supervise student research are limited. In practice, teacher educators have to rethink their pedagogy. Engaging in research that adopts a "self-study" approach is unavoidable; such research will enhance their professional development and the quality of their student teachers. / Thesis (PhD)--University of Pretoria, 2011. / Humanities Education / unrestricted
