  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

A Logical Basis for Reasoning with Default Rules

Cassano, Valentin (has links)
This thesis is an investigation into the foundations of reasoning with default rules as presented by Reiter in his seminal 1980 article, 'A Logic for Default Reasoning'. As such, it opens with a critical appraisal of the logical underpinnings of Reiter's presentation of the main elements of reasoning with default rules. More precisely, following Reiter's presentation, it discusses the concept of a default rule in comparison with that of a rule of inference, the concept of an extension in comparison with that of a theory, and the concept of 'being a consequence of' for reasoning with default rules. Contrary to the commonly perceived view, the argument put forth is that such a context does not provide a sensible logical foundation for reasoning with default rules. As a result, this thesis argues for an alternative interpretation of what is captured by default rules, what is captured by extensions, and what 'being a consequence of' for reasoning with default rules amounts to. In particular, it proposes to treat default rules as premiss-like objects standing for assertions made tentatively, to treat extensions as interpretation structures of a syntactical kind, and to bring the concept of 'being a consequence of' for reasoning with default rules into the foreground by formulating a suitable notion of an entailment relation and its ensuing logical system. Because in any logical system it is important to have at hand mechanisms for formulating proofs and for structuring large theories, this thesis also presents a tableaux-based proof calculus for reasoning with default rules and explores some mapping notions related to the structuring of default presentations, i.e., presentations in the context of reasoning with default rules.

Thesis / Doctor of Philosophy (PhD)

This thesis is an investigation into the foundations of reasoning with default rules as presented by Reiter in his seminal 1980 article, 'A Logic for Default Reasoning'. A first, very general problem definition for this PhD thesis is raised by the following question: can reasoning with default rules, as presented in Reiter's article, be understood as a logic for non-monotonic reasoning? This thesis presents a rationale for the formulation of such a question and a possible answer to it. On the more technical side, it presents a proof calculus for a particular formulation of a logic for reasoning with default rules, as well as some mapping concepts for structuring presentations defined on this logic.
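As a purely illustrative aside (not taken from the thesis), the flavour of a Reiter-style default rule such as 'bird : flies / flies' can be sketched in a few lines of Python. The sketch handles only normal defaults over atomic literals and uses a naive forward-chaining fixpoint, which is far weaker than Reiter's actual fixed-point definition of an extension; the class names and the bird/penguin example are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Default:
    """A normal default 'prerequisite : conclusion / conclusion' over atomic literals."""
    prerequisite: str
    conclusion: str

def consistent(literal, facts):
    """For atomic literals, consistency simply means the negation is not already present."""
    negation = literal[1:] if literal.startswith("~") else "~" + literal
    return negation not in facts

def naive_extension(facts, defaults):
    """Forward-chain: fire every applicable default until nothing changes.
    (Reiter's extensions are defined by a fixed-point condition; this loop is only a toy.)"""
    ext = set(facts)
    changed = True
    while changed:
        changed = False
        for d in defaults:
            if d.prerequisite in ext and consistent(d.conclusion, ext) and d.conclusion not in ext:
                ext.add(d.conclusion)
                changed = True
    return ext

# Classic example: birds normally fly, but this bird is a penguin known not to fly,
# so the known fact '~flies' blocks the default and 'flies' is not concluded.
facts = {"bird", "penguin", "~flies"}
defaults = [Default("bird", "flies")]
print(naive_extension(facts, defaults))
```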
112

Designing Six Variable Combination Logic Circuits with the TI-59

Ashford, Brian M. 01 July 1981 (has links) (PDF)
A program has been written for the Texas Instruments TI-59 hand-held calculator implementing the Quine-McCluskey minimization method for logic circuit design. The program is contained on multiple magnetic cards and provides the user with combinational logic minimization for circuit design problems containing up to six variables.
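For readers unfamiliar with the method the abstract refers to, the sketch below is a minimal Python rendering of Quine-McCluskey minimization followed by a greedy prime-implicant cover. It is not the TI-59 program described above; the function names and the example minterms are invented for illustration, and the greedy cover is not guaranteed to be minimal.

```python
from itertools import combinations

def combine(a, b):
    """Merge two implicants (strings over '0', '1', '-') that differ in exactly one literal."""
    diff = [i for i in range(len(a)) if a[i] != b[i]]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        return a[:diff[0]] + '-' + a[diff[0] + 1:]
    return None

def prime_implicants(minterms, n_vars):
    """Repeatedly combine implicants; the ones that never merge are the prime implicants."""
    current = {format(m, f'0{n_vars}b') for m in minterms}
    primes = set()
    while current:
        merged, used = set(), set()
        for a, b in combinations(sorted(current), 2):
            c = combine(a, b)
            if c is not None:
                merged.add(c)
                used.update((a, b))
        primes |= current - used
        current = merged
    return primes

def covers(implicant, minterm, n_vars):
    """True if the implicant pattern covers the given minterm."""
    bits = format(minterm, f'0{n_vars}b')
    return all(p == '-' or p == b for p, b in zip(implicant, bits))

def minimize(minterms, n_vars):
    """Greedy cover of the minterms by prime implicants (small, but not guaranteed minimal)."""
    primes = prime_implicants(minterms, n_vars)
    uncovered, chosen = set(minterms), []
    while uncovered:
        best = max(primes, key=lambda p: sum(covers(p, m, n_vars) for m in uncovered))
        chosen.append(best)
        uncovered -= {m for m in uncovered if covers(best, m, n_vars)}
    return chosen

# Example: a 4-variable function given by its minterms; prints a list of covering implicants.
print(minimize([0, 1, 2, 5, 6, 7], 4))
```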
113

A Logic Simulator Interface

Lofgren, John D. 01 January 1985 (has links) (PDF)
A software interface between a firmware documentation system and a logic simulator named TEGAS-5 is described. The interface accepts PALASM inputs for PAL files. The output is an ASCII file which defines the firmware parts in TEGAS-5 format. Modules are written in FORTRAN and command routines are written in DCL on VAX 11/780 machines. No system calls are required, so portability is maintained. Limitations include the inability to load two different programs in identical firmware parts on the same design, but this can be overcome. (TEGAS-5 is a GE/Calma Corporation trademark; PALASM is an MMI Corporation trademark.)
114

Dialectic as a method of inquiry

St. Clair, Donald D. January 1961 (has links)
Call number: LD2668 .T4 1961 S25
115

Decision theory and counterfactual evaluation

Orri Stefansson, Hlynur January 2014 (has links)
The value of actual outcomes or states of affairs often depends on what could have been. Such dependencies create well-known “paradoxes” for decision theory, the best-known perhaps being the so-called Allais Paradox. The primary aim of this PhD thesis is to enrich decision theory so that it includes counterfactual prospects in the domains of desirability (or utility) functions, and to show that, as a result, the paradoxes in question disappear. Before discussing the way in which counterfactual propositions influence the desirability of actual outcomes, I discuss the way in which the truth of one factual proposition influences the desirability of another. This examination leads me to reject the Invariance assumption, which states that the desirability of a proposition is independent of whether it is true. The assumption plays an important role in David Lewis’ famous arguments against the so-called Desire-as-Belief thesis (DAB). The unsoundness of Lewis’ argument does not, of course, make DAB true; in fact, I provide novel arguments against different versions of DAB, without assuming Invariance. To justify the assumptions I make when extending decision theory to counterfactual prospects, I discuss several issues concerning the logic, metaphysics and epistemology of counterfactuals. For instance, I defend a version of the so-called Ramsey test, and show that Richard Bradley’s recent Multidimensional Possible World Semantics for Conditionals is both more plausible and more permissive than Bradley’s original formulation suggested. I use the multidimensional semantics to extend Richard Jeffrey’s decision theory to counterfactuals, and show that his desirability measure, extended to counterfactuals, can represent the various ways in which counterfactuals influence the desirability of factual propositions. I also explain why the most common alternatives to Jeffrey’s theory cannot be similarly extended. I conclude the thesis by using Jeffrey’s extended decision theory to construct an ethical theory I call Modal Consequentialism, and argue that it better satisfies certain entrenched moral intuitions than Non-Modal Consequentialism (such as classical utilitarianism and welfare economics).
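As background only, the standard (non-counterfactual) statement of Jeffrey's desirability measure that the abstract builds on can be written over possible worlds as follows; the thesis's extension to counterfactual prospects is not reproduced here.

```latex
% Jeffrey's desirability of a proposition A with positive probability:
% the probability-weighted average of the desirabilities of the worlds in which A holds.
\[
  \operatorname{Des}(A) \;=\; \sum_{w \in A} \operatorname{Des}(w)\,\frac{P(w)}{P(A)},
  \qquad P(A) > 0 .
\]
```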
116

Advances on a methodology of design and engineering in economics and political science

Morett, Fernando January 2014 (has links)
This thesis consists of five chapters: 1. The Mechanical View, 2. Social Machines, 3. The FCC Auction Machine, 4. Self-Interested Knaves, and 5. Self-Interested but Sympathetic. In the first three chapters, I advance a methodological account of current design and engineering in economics and political science, which I call methodological mechanicism. It is not ontological or literal; it relies on a technological metaphor by describing market and state institutions as machines, and the human mind as consisting of a number of mechanisms. I introduce the Mechanical view of scientific theories as distinct from the Syntactic and the Semantic views. The electromagnetic theories of the nineteenth century are used to illustrate this view, as well as the use of minimal and maximal analogies in model-building in normal and revolutionary science. The Mechanical view is extended to the social sciences, particularly to mechanism design theory and institutional design, using the International Monetary Fund, the NHS internal markets and the FCC auction as examples. Their blueprints are evaluated using criteria such as shielding and power for calculating joint effects, libertarian, dirigiste, egalitarian and inegalitarian properties, and the holistic or piecemeal engineering they adopt. Experimental parameter variation is introduced as a method complementing design. Any design assumes a particular moral psychology, so in chapters four and five I argue that the moral psychology of universal self-interest from Bernard Mandeville, and the related ideas on design and engineering, should be chosen over the moral psychology of self-interest, sympathy and sentiments of humanity from David Hume. Hume finds no solution for knavery in politics and civil society. He accepts egalitarianism as useful and consistent with utilitarian principles, yet rejects it because of difficulties with its implementation. I show how those difficulties may be overcome, and I explain why his objections are unbalanced and not sufficiently justified.
117

The analysis of design methods by a comparative study of award-winning industrial architecture (1970-1990)

Holness, Nelson Anthony January 2000 (has links)
This thesis advances the proposition that designers of exemplary industrial buildings adopt design heuristics based on evolving conceptual prototypes. The detail design priorities and formal expressions may vary, but the essential structure of their design solution search patterns should show a high degree of commonality. This commonality is based on the balance between rational and intuitive strategies for design decision making. The central objective of this thesis is therefore to demonstrate that a pattern between these two cognitive approaches to design thinking occurs across the cases investigated. The research was organised as a comparative study of the design process used by a selection of designers to design industrial buildings which subsequently received awards for the quality of their design. Through a series of six case studies using semi-structured interviews and follow-up repertory grid analysis, data were obtained which allowed the various design methods to be recorded, analysed and compared. A control sample of designers of non-award-winning industrial buildings was also studied as a comparison, to validate classifying designers by design quality on the basis of design awards. The thesis proposes that each designer resorted to a constantly evolving generic 'prototype' for the design of industrial buildings which helps reduce the search area for a design solution. Relatively little time is therefore required to analyse the problem at the beginning of the process before a suitable solution is formulated. The prototype allows the designers to quickly identify aspects of the design that might prove problematic and hence require specific attention. The strategy is then one of rational analysis to solve specific problems rather than an extensive rationally based development of the whole design; hence the supporting role of rationally based techniques during the design process. Finally, the familiarity that comes from using a design method based on a constantly evolving prototype provides the designer with greater 'outcome certainty', in that they have an increased likelihood of achieving a successful design, as the potential of the prototype is known to the designer at the commencement of the project.
118

The implementation and applications of multiple-valued logic

Clarke, Christopher T. January 1993 (has links)
No description available.
119

Ecological management of the Sussex South Downs : applications of GIS and landscape ecology

Burnside, Niall George January 2000 (has links)
The South Downs Area of Outstanding Natural Beauty is a nationally important conservation area, which contains a significant proportion (28%) of the South East calcareous grassland resource. The traditional calcareous grassland habitats characteristic of the Downland landscape have suffered significant losses since the Second World War, and the remaining sites are small, fragmented and confined to the more marginal areas, often the steeper slopes. The recreation and regeneration of these species-rich grasslands has become an important aim of regional conservation organisations, but the methods and mechanisms by which restoration sites could be identified have not been clarified. The work reported here aims, by the integration of landscape ecology and Geographical Information Systems, to develop a sound methodological approach for the targeting of sites for restoration and regeneration of calcareous grassland on the South Downs. The study examines temporal land use dynamics of the Downland and the predominant land conversion sequences are identified. Land management changes between 1971 and 1991 are assessed, and sites of unimproved grassland and those sites considered more marginal to modern intensive farming approaches are identified. The structure of the Downs landscape is investigated at the landscape, habitat and community level using fine-scale spatial data. Particular emphasis is placed upon the remaining calcareous resource and the extent of habitat loss and fragmentation is quantified. Analysis at the community level shows a clear relationship between community richness and habitat area. The analysis identifies key attributes of calcareous grassland sites and provides a baseline from which to formulate restoration targets and objectives. Finally, using fuzzy logic, a GIS-based Habitat Suitability Model is developed for use as a tool to support strategic landscape evaluation and to provide a method of identifying areas of search and site selection for targeted restoration. The approach models the relationships between specific grassland communities and landscape position, and is applied to the South Downs landscape in order to predict the nature of grassland communities likely to result from restoration efforts at specific sites.
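The abstract mentions a fuzzy-logic, GIS-based Habitat Suitability Model but does not reproduce it, so the sketch below only illustrates the general pattern of fuzzy suitability scoring. The choice of variables, the trapezoidal membership functions and every parameter value are assumptions made purely for illustration, not the thesis's actual model.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def suitability(slope_deg, soil_ph):
    """Combine fuzzy memberships with the minimum operator (a common conservative choice)."""
    # Illustrative only: steeper, more marginal slopes and alkaline soils score higher here.
    slope_ok = trapezoid(slope_deg, 5, 10, 25, 35)
    ph_ok = trapezoid(soil_ph, 6.5, 7.0, 8.0, 8.5)
    return min(slope_ok, ph_ok)

# A hypothetical grid cell with an 18-degree slope and pH 7.6 scores 1.0; in a GIS this
# function would be applied cell by cell to produce a continuous suitability surface.
print(suitability(18, 7.6))
```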
120

An improved method for register file verification

Quan, Tong, August 2009 (has links)
Register file logic verification historically involves comparing two human-generated logic sources, such as a VHDL code file and a circuit schematic, for logic equivalence. This method is valid for most cases; however, it does not account for instances where both logic sources are equivalent but incorrect. This report proposes a method to eliminate this problem by testing the logic coherency of various sources against a golden logic source. This golden logic source will be generated by a register file simulation program which has been developed to simulate accurate regfile I/O port data. Implementation of this simulation program for logic verification will eliminate the accuracy problem stated above; in addition, the logic simulation time for the new method has been reduced by 36% compared to the former method.
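The report's core idea, checking multiple logic sources for coherence against a single golden model, can be illustrated with a small generic sketch. This is not the report's simulation program; the register-file behaviour (one write port, one read port, write-before-read in the same cycle) and all names and parameters are assumptions chosen for illustration.

```python
import random

class RegFileGolden:
    """Reference ('golden') model: one write port, one read port, write-before-read."""
    def __init__(self, n_regs=32):
        self.regs = [0] * n_regs
    def step(self, we, waddr, wdata, raddr):
        if we:
            self.regs[waddr] = wdata
        return self.regs[raddr]

def check_against_golden(dut_step, n_regs=32, width=32, n_cycles=10_000, seed=0):
    """Drive identical random port activity into both models and report the first mismatch."""
    rng = random.Random(seed)
    golden = RegFileGolden(n_regs)
    for cycle in range(n_cycles):
        we = rng.random() < 0.5
        waddr, raddr = rng.randrange(n_regs), rng.randrange(n_regs)
        wdata = rng.getrandbits(width)
        expected = golden.step(we, waddr, wdata, raddr)
        got = dut_step(we, waddr, wdata, raddr)
        if got != expected:
            return f"mismatch at cycle {cycle}: read r{raddr} = {got}, expected {expected}"
    return "all cycles match"

# A trivially correct 'device under test' just wraps a second instance of the golden model.
dut = RegFileGolden()
print(check_against_golden(dut.step))
```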
