1 |
A re-examination of Hegel's Science of Logic
Johnson, P. O. January 1987 (has links)
No description available.
|
2 |
Simplifying transformations for type-alpha certificates
Arkoudas, Konstantine 13 November 2001 (has links)
This paper presents an algorithm for simplifying NDL deductions. An array of simplifying transformations is rigorously defined. They are shown to be terminating and to respect the formal semantics of the language. We also show that the transformations never increase the size or complexity of a deduction; in the worst case, they produce deductions of the same size and complexity as the original. We present several examples of proofs containing various types of "detours" and explain how our procedure eliminates them, resulting in smaller and cleaner deductions. All of the given transformations are fully implemented in SML-NJ. The complete code listing is presented, along with explanatory comments. Finally, although the transformations given here are defined for NDL, we point out that they can be applied to any type-alpha DPL that satisfies a few simple conditions.
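As a rough illustration of the kind of "detour" such transformations remove (not the paper's SML-NJ code, and not NDL itself), the following Python sketch represents a toy natural-deduction proof as a tree and collapses one classic detour, a conjunction introduction immediately followed by a conjunction elimination. The datatype and rule names are hypothetical.

```python
# Minimal sketch of one detour-eliminating transformation on a toy
# natural-deduction proof tree. Illustrative only: this is not the NDL
# language or the SML-NJ code from the paper; all names are hypothetical.

from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class Proof:
    rule: str                            # e.g. "assume", "and-intro", "and-elim-left"
    conclusion: str                      # formula proved, as a plain string
    premises: Tuple["Proof", ...] = ()

def simplify(p: Proof) -> Proof:
    """Recursively remove and-intro/and-elim detours."""
    premises = tuple(simplify(q) for q in p.premises)
    # Detour: a conjunction elimination applied directly to a conjunction introduction.
    if p.rule in ("and-elim-left", "and-elim-right") and premises:
        sub = premises[0]
        if sub.rule == "and-intro":
            return sub.premises[0] if p.rule == "and-elim-left" else sub.premises[1]
    return Proof(p.rule, p.conclusion, premises)

# Example: assume A, pair it with B, then immediately project A back out.
a = Proof("assume", "A")
b = Proof("assume", "B")
detour = Proof("and-elim-left", "A", (Proof("and-intro", "A & B", (a, b)),))

assert simplify(detour) == a                              # the detour disappears
assert len(str(simplify(detour))) <= len(str(detour))     # never larger than the original
```

The same shape of rewrite, applied recursively and for other connectives, never yields a larger proof, which mirrors the size guarantee stated in the abstract.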
|
3 |
The closure of knowledge in context
Barke, Antonia C. J. January 1999 (has links)
No description available.
|
4 |
Type-omega DPLs
Arkoudas, Konstantine 16 October 2001 (has links)
Type-omega DPLs (Denotational Proof Languages) are languages for proof presentation and search that offer strong soundness guarantees. LCF-type systems such as HOL offer similar guarantees, but their soundness relies heavily on static type systems. By contrast, DPLs ensure soundness dynamically, through their evaluation semantics; no type system is necessary. This is possible owing to a novel two-tier syntax that separates deductions from computations, and to the abstraction of assumption bases, which is factored into the semantics of the language and allows for sound evaluation. Every type-omega DPL properly contains a type-alpha DPL, which can be used to present proofs in a lucid and detailed form, exclusively in terms of primitive inference rules. Derived inference rules are expressed as user-defined methods, which are "proof recipes" that take arguments and dynamically perform appropriate deductions. Methods arise naturally via parametric abstraction over type-alpha proofs. In that light, the evaluation of a method call can be viewed as a computation that carries out a type-alpha deduction. The type-alpha proof "unwound" by such a method call is called the "certificate" of the call. Certificates can be checked by exceptionally simple type-alpha interpreters, and thus they are useful whenever we wish to minimize our trusted base. Methods are statically closed over lexical environments, but dynamically scoped over assumption bases. They can take other methods as arguments, they can iterate, and they can branch conditionally. These capabilities, in tandem with the bifurcated syntax of type-omega DPLs and their dynamic assumption-base semantics, allow the user to define methods in a style that is disciplined enough to ensure soundness yet fluid enough to permit succinct and perspicuous expression of arbitrarily sophisticated derived inference rules. We demonstrate every major feature of type-omega DPLs by defining and studying NDL-omega, a higher-order, lexically scoped, call-by-value type-omega DPL for classical zero-order natural deduction---a simple choice that allows us to focus on type-omega syntax and semantics rather than on the subtleties of the underlying logic. We start by illustrating how type-alpha DPLs naturally lead to type-omega DPLs by way of abstraction; present the formal syntax and semantics of NDL-omega; prove several results about it, including soundness; give numerous examples of methods; point out connections to the lambda-phi calculus, a very general framework for type-omega DPLs; introduce a notion of computational and deductive cost; define several instrumented interpreters for computing such costs and for generating certificates; explore the use of type-omega DPLs as general programming languages; show that DPLs do not have to be type-less by formulating a static Hindley-Milner polymorphic type system for NDL-omega; discuss some idiosyncrasies of type-omega DPLs such as the potential divergence of proof checking; and compare type-omega DPLs to other approaches to proof presentation and discovery. Finally, a complete implementation of NDL-omega in SML-NJ is given for users who want to run the examples and experiment with the language.
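To convey the flavor of a "method" as a derived rule that dynamically performs primitive inferences against an assumption base, here is a loose Python sketch. It is not NDL-omega syntax and does not reproduce the thesis's semantics; the assumption base is modeled as a plain set of formula strings, and the primitive rule and the derived method shown are hypothetical.

```python
# Loose illustration of the "method" idea: a derived rule is a procedure that,
# when called, performs primitive inferences against the current assumption
# base. This is NOT NDL-omega; it only mimics the flavor of dynamic,
# assumption-base-checked deduction. All names are hypothetical.

assumption_base = set()   # formulas currently established, as strings

def assume(p):
    assumption_base.add(p)
    return p

def modus_ponens(imp, p):
    """Primitive rule: from 'p -> q' and 'p', conclude 'q' (checked at run time)."""
    ant, _, cons = imp.partition(" -> ")
    if imp not in assumption_base or p not in assumption_base or ant != p:
        raise ValueError("unsound step rejected at run time")
    assumption_base.add(cons)
    return cons

def chain(imp1, imp2, p):
    """A 'method': a derived rule built by composing primitive rules.
    From 'p -> q', 'q -> r' and 'p', derive 'r'; the sequence of primitive
    steps it unwinds into plays the role of its certificate."""
    q = modus_ponens(imp1, p)
    return modus_ponens(imp2, q)

assume("A -> B"); assume("B -> C"); assume("A")
print(chain("A -> B", "B -> C", "A"))   # prints "C"
```

The point of the sketch is only that soundness is enforced during evaluation (the primitive rule refuses to fire unless its premises are in the assumption base) and that a method call unwinds into primitive steps, which is the role certificates play in the abstract above.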
|
5 |
Herr Kant, der Alleszermalmer - Kant the "All-Crushing" Destroyer of Metaphysics: Metaphilosophy of the Critique of Pure Reason
De Backer, Jake 18 May 2015 (has links)
The Critique of Pure Reason inaugurated Kant's Critical Philosophy. Commentators commonly distinguish Kant's Positive Project (PP), that is, his epistemology as laid out in the Transcendental Aesthetic and Transcendental Analytic, from his Negative Project (NP), expressed in terms of the destructive implications his epistemology has for speculative metaphysics and rational theology. Against this tradition I will argue that the whole of the Critique is largely a negative-destructive enterprise. I will focus on what is commonly taken as the centerpiece of the PP, namely the Transcendental Deduction, and demonstrate that even here the NP is given normative priority. Though, to be sure, certain passages tend to encourage an interpretation of the PP as primary, I contend that this view is myopic and fails to pay sufficient attention to Kant's global concerns in the Critique. I will demonstrate that a clear exposition of Kant's metaphilosophical aims, commitments, and convictions is in fact corrosive to any such reading. The objective of this thesis, then, is two-fold: 1) to provide an account of Kant's metaphilosophy in the Critique, and 2) to argue for what I will here and elsewhere refer to as the Primacy of the Negative Thesis, that is, that Kant prioritized boundary-setting over principle-generating.
|
6 |
Using Normal Deduction Graphs in Common Sense Reasoning
Munoz, Ricardo A. (Ricardo Alberto) 05 1900 (has links)
This investigation proposes a formalization of common sense knowledge based on function-free normal deduction graphs (NDGs), which form a powerful tool for deriving Horn and non-Horn clauses without functions. Such a formalization supports common sense reasoning because it can handle not only negative but also incomplete information.
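The abstract does not spell out how normal deduction graphs are constructed, so no attempt is made to reproduce them here; the Python sketch below only illustrates the sort of function-free Horn-clause derivation such a formalization is meant to support, using naive forward chaining over hypothetical facts and rules (non-Horn clauses and incomplete information are not handled).

```python
# Naive forward chaining over function-free Horn clauses. This does not
# implement normal deduction graphs (the thesis's construction); it only
# shows the kind of function-free derivation such a formalization targets.
# Facts and rules are hypothetical.

facts = {"bird(tweety)"}
rules = [                                   # (body, head): body facts imply head
    ({"bird(tweety)"}, "has_wings(tweety)"),
    ({"has_wings(tweety)"}, "can_fly(tweety)"),
]

changed = True
while changed:
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print(sorted(facts))   # ['bird(tweety)', 'can_fly(tweety)', 'has_wings(tweety)']
```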
|
7 |
The Use of Equalization Filters to Achieve High Common Mode Rejection Ratios in Biopotential Amplifier Arrays
Xia, Hongfang 12 May 2005 (has links)
Recently, it became possible to detect single motor units (MUs) noninvasively via the use of spatial filtering electrode arrays. With these arrays, weighted combinations of monopolar electrode signals recorded from the skin surface provide spatial selectivity of the underlying electrical activity. Common spatial filters include the bipolar electrode, the longitudinal double differentiating (LDD) filter and the normal double differentiating (NDD) filter. In general, the spatial filtering is implemented in hardware, and the performance of the spatial filtering apparatus is measured by its common mode rejection ratio (CMRR). High-precision hardware differential amplifiers are used to perform the channel weighting in order to achieve high CMRR. However, this hardware is expensive and all channel weightings must be predetermined; hence, only a few spatially filtered channels are typically derived. In this project, a distinct software equalization filter was cascaded with each of the hardware monopolar signal conditioning circuits to achieve accurate weighting and high CMRR. The simplest technique we explored was to design an equalization filter by dividing the frequency response of a "reference" (or "ideal") channel by the measured frequency response of the channel being equalized, producing the desired equalization filter in the frequency domain (the conventional technique). Simulation and experimental results showed that the conventional technique is very sensitive to broadband background noise, producing poor CMRR. Thus, a technique for signal denoising based on signal mixing was pursued and evaluated both in simulation and in laboratory experiments. The purpose of the mixing technique is to eliminate as much of the noise as possible prior to equalization filter design. The simulation results show that without software equalization, CMRR is only around 30 dB; with the conventional technique, CMRR is around 50 to 60 dB. By using the mixing technique, CMRR can reach around 70 to 80 dB.
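The "conventional" design described above (dividing the frequency response of a reference channel by that of the channel being equalized) can be sketched in a few lines of numpy. The sketch below is illustrative only: the channel responses are simulated rather than measured, the CMRR figure of merit is a simple RMS ratio, and the mixing-based denoising step is not reproduced.

```python
# Sketch of the "conventional" equalization-filter design: divide the
# frequency response of a reference channel by that of the channel being
# equalized, then apply the result before forming the differential signal.
# Channel responses are simulated, not measured; illustrative only.

import numpy as np

fs, n = 1000, 4096
t = np.arange(n) / fs
common = np.sin(2 * np.pi * 50 * t)             # common-mode input seen by both channels

# Two monopolar channels with slightly mismatched gain and delay.
ref  = 1.00 * common                            # "reference" channel
meas = 1.05 * np.roll(common, 3)                # mismatched channel to be equalized

# Frequency-domain equalizer: H_eq = FFT(ref) / FFT(meas), lightly regularized.
Ref, Meas = np.fft.rfft(ref), np.fft.rfft(meas)
H_eq = Ref / (Meas + 1e-12)
meas_eq = np.fft.irfft(H_eq * Meas, n)

def cmrr_db(a, b, common_rms):
    """Common-mode rejection proxy: common-mode RMS over residual differential RMS."""
    residual = a - b
    return 20 * np.log10(common_rms / (np.sqrt(np.mean(residual**2)) + 1e-15))

rms = np.sqrt(np.mean(common**2))
print("CMRR without equalization: %.1f dB" % cmrr_db(ref, meas, rms))
print("CMRR with equalization:    %.1f dB" % cmrr_db(ref, meas_eq, rms))
```

With simulated, noise-free responses the division recovers the mismatch almost exactly; the thesis's point is that measured responses contain broadband noise, which is why the mixing-based denoising step matters in practice.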
|
8 |
O tribunal da razão: um estudo histórico e sistemático sobre as metáforas jurídicas na crítica da razão pura / The tribunal of reason: a historical and systematic study of the legal metaphors in the critique of pure reason
Trevisan, Diego Kosbiau 09 November 2015 (has links)
This thesis provides a historical, genetic and systematic study of the legal metaphors in the Critique of Pure Reason, which are, implicitly or explicitly, contained in the image of the Tribunal of the Critique as the Tribunal of Reason. The main purpose of this work is to examine how legal terminology and jurisprudence influenced the methodological framework of the Critique of Pure Reason. Furthermore, this study seeks to address how these elements played a role in the emergence of the Critique in the course of the philosophical tradition of modernity and, more specifically, in the development of Kant's thought. In order to investigate how the legal metaphors in the Critique of Pure Reason indicate a legal methodological origin of criticism, this work is divided into three parts. Each of these parts addresses one of the three interpretative aspects mentioned above, namely the historical, the genetic and the systematic. The first part deals with a history of the sources and of the concepts which underlie the legal understanding of criticism, namely the idea of a tribunal and legislation of reason as well as the legal concepts of deduction and antinomy. The second part reconstructs the emergence of the critical philosophy. In this part, the goal is to highlight the genesis of some critical motives which have a bearing on the legal constitution of the Critique and express its methodological concern. More specifically, it addresses both the representation of a judge who reaches an impartial verdict on the pretensions of knowledge and the idea of a nomothetic of pure reason. The third and final part of the work undertakes a systematic analysis of the legal metaphors in the Critique, relying on the historical and genetic roots described in the previous parts. It provides an interpretation of the Discipline of Pure Reason as the methodological core of the Critique, of the metaphysical and transcendental deduction as a legal procedure and of the resolution of the antinomy as a precondition for the validity of the legislation of reason. By doing so, this part shows how the systematic investigation of the legal metaphors of the Critique of Pure Reason allows for an understanding of the critical philosophy as the exposition of the negative and positive legislation of reason.
|
9 |
Roots and Role of the Imagination in Kant: Imagination at the Core
Thompson, Michael 09 March 2009 (has links)
Kant's critical philosophy promises to overturn both Empiricism and Rationalism by arguing for the necessity of a passive faculty, sensibility, and an active faculty, understanding, in order for cognition to obtain. Kant argues in favor of sense impression found in standard empirical philosophies while advocating conceptual necessities like those found in rational philosophies. It is only in the synthesis of these two elements that cognition and knowledge claims are possible. However, by affirming such a dualism, Kant has created yet another problem familiar to the history of philosophy, one of faculty interaction. By affirming two separate and exclusive capacities necessary for cognition, Kant has bridged the gap between the two philosophical traditions, but created a gap that must be overcome in order to affirm his positive programmatic. Kant himself realizes the difficulty his new philosophy faces when he claims the two sources of knowledge must have a "common, but unknown root." To complete Kant's program one must ask: "What bridges the gap between sensible intuition and conceptual understanding?"
In my dissertation, I turn to Kant's philosophy and find the answer to this question in the productive imagination. In order to evaluate the viability of this answer, I problematize the imagination as it has been found in the history of Western philosophy. By tracing the historical use of the imagination in archetypal figures from both empiricist and rationalist traditions, one finds a development of imagination that culminates in the fundamental formulation found in Kant's Critique of Pure Reason. In his critical philosophy, Kant synthesizes the imagination (Einbildungskraft) and the use of imagination found in both traditions, thus demonstrating its role in both sensation and understanding. By employing the imagination at both sensorial and conceptual levels, Kant has found, I argue, the liaison that overcomes the dualism established by his requirements for knowledge, as well as the common root for both.
|
10 |
Use of yeast species as the biocomponent for priority environmental contaminants biosensor devices
Gurazada, Saroja January 2008 (has links)
Along with an increasing understanding of the harmful effects of a wide range of pollutants on the environment has come the need for more sensitive, faster and less expensive methods of identification and quantitation. Many environmental pollutants occur at low levels and often in complex matrices, so analysis can be difficult, time-consuming and costly. Because microorganisms are readily available, easy to cultivate and potentially highly specific, there is considerable interest in the use of living microorganisms as the analytical component (the biocomponent) of sensors for pollutants. While a number of biosensors using bacteria have been developed, yeast has been comparatively rarely used as the biocomponent. Yeasts are attractive because they are easy to culture and because they are eukaryotes, which means their biochemistry is in many respects closer to that of higher organisms. This thesis describes the development of whole-cell bioassays that use yeast cells as the sensing element and redox mediators to probe intracellular redox reactions, thereby monitoring the catabolic activity of the yeast in response to an external substrate; steady-state voltammetry is used as the electrochemical detection technique. The isogenic differential enzyme analysis (IDEA) concept of the NERF-funded research consortium led by Lincoln Ventures Limited uses, as the biocomponent in sensors, bacteria that have been cultured with specific organic pollutants as the carbon source. The wild-type yeast Arxula adeninivorans, which can use a very wide variety of substrates as sources of carbon and nitrogen, was used as an alternative to bacteria to validate the IDEA concept. Naphthalene and di-butyl phthalate were chosen as model target contaminant molecules. The performance, detection limits and usefulness of yeast-based biosensor applications for environmental analysis are discussed. This thesis also describes the development and optimisation of a simple, cost-effective in vivo bioassay for the detection of estrogens using either a genetically modified or a wild-type yeast Saccharomyces cerevisiae. In this study, catabolite repression by glucose was exploited to achieve specificity to estrogens in complex environmental samples, eliminating the requirement for conventional sample preparation. This is the first time that the use of wild-type yeast to quantify estrogens has been reported. The attractive features of the bioassay are its use of a non-GMO organism, its speed, and its high specificity and sensitivity, with a detection limit of 10⁻¹⁵ M. The similarity of its binding affinities for major estrogens to those of human estrogen receptors makes this in vivo estrogen bioassay very useful for analytical and screening procedures. The electrochemical detection method also makes it easy to interface with a variety of electronic devices.
|