391 |
Aristotle's modal ontology / Dickson, Mark William (January 1989)
Modal logic is concerned with the logic of necessity and possibility. The central problem of modal ontology is summed up in the following question: "What are the ontological commitments of the user of modal terminology?" This thesis is primarily about the ontological commitments that Aristotle made when he employed modal terms. Aristotle's modal ontology is here analysed in conjunction with four modal problems. My primary objective is to clarify some of the discussions of Aristotle's modal ontology that have been advanced by certain twentieth-century philosophers.
The first problem to be considered is the famous 'sea battle' argument of De Interpretatione 9. Here is a summary of the problem: if it is currently true that there will be a sea battle tomorrow, then in some sense it is inevitable that there will in fact be a sea battle; if predictions are true, is not a form of determinism being supported? One analysis in particular is studied at length, namely that of Jaakko Hintikka. Hintikka holds that the sea battle argument is best interpreted if the metaphysical principle of plenitude is attributed to Aristotle. The principle of plenitude effectively merges modality with temporality: what is necessarily the case is always true, and vice versa.
Hintikka also interprets Aristotle's stand on the 'Master Argument' of Diodorus in light of the attribution of the principle of plenitude to Aristotle. Diodorus' argument is the second of the four problems that this essay considers. Unlike Aristotle, Diodorus appears to have favored a strong version of determinism. According to Hintikka, Diodorus actually strove to prove the principle of plenitude (as opposed to assuming it, as Aristotle presumably did).
I am very sceptical regarding Hintikka's interpretations of these two problems. The sea battle argument is not adequately answered by the solution which Hintikka sees Aristotle adopting; alternative answers are relatively easy to come by. The evidence cited by Hintikka for ascribing the principle of plenitude is, it is shown, somewhat inconclusive. As for the Master Argument, the textual evidence is scant, a point Hintikka himself virtually concedes. (Thus, whereas I feel it incumbent to offer an alternative interpretation of the sea battle argument, I do not share this attitude towards the Master Argument.)
The third and fourth problems play a key role in
twentieth century analytic philosophy. Both were first formulated
by W.V. Quine in the forties. These problems
are somewhat subtle and will not be explained further.
Suffice it to say that an analysis of Aristotle's works
by Alan Code reveals that the Stagirite had an answer to
Quine's criticisms of modal logic. / Arts, Faculty of / Philosophy, Department of / Graduate
|
392 |
Semantic studies of intuitionistic logic / Criscuolo, Giovanni (January 1972)
This thesis is a study of intuitionistic semantics as presented by Beth [2] and Kripke [12], using the usual methods of investigation
of classical informal logic.
Beth models and Kripke models are presented in a manner which does not depend upon a prior definition of the notion of degree of a formula.
It is shown that both classes of intuitionistic models are a generalization of the concept of classical models, i.e. they contain classical models as a particular case, and that "branching" is a necessary condition in order that intuitionistic logic be complete with respect to them.
Intuitionistic sentential calculus is complete with respect to the strong Beth model, the intersection of Beth models and Kripke models.
But if, by analogy with the classical case, we extend them to first order logic, we find that they are not adequate because, for example, the sentence ∀x(B(x) ∨ C) → (∀xB(x) ∨ C), where C does not contain x free, is valid in these models but not intuitionistically provable.
This observation helps to explain the formal differences between the two classes of models. The simplified Kripke models and the simplified Beth models are then introduced and their equivalence with the Kripke models and the Beth models, respectively, is proved.
The first ones allow a better notation and a better understanding of the relation R occurring in the definition of Kripke models. The second ones have the important property that, if the domain is finite, any classically valid sentence is valid in them. Finally a semantic proof of most of the reduction theorems from classical to intuitionistic logic is given. / Science, Faculty of / Computer Science, Department of / Graduate
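The sentence above is the standard witness here: in ordinary Kripke models with expanding domains it has a countermodel, which is why any first-order semantics that validates it cannot match intuitionistic provability. A minimal sketch of the standard two-world countermodel (world names, the predicate B, and the atom C are invented for this illustration, not taken from the thesis):

```python
# Hypothetical two-world intuitionistic Kripke model with an expanding domain,
# falsifying  ∀x(B(x) ∨ C) → (∀xB(x) ∨ C)  at the root world.

succ = {"w0": ["w0", "w1"], "w1": ["w1"]}    # reflexive order w0 <= w1
domain = {"w0": {"a"}, "w1": {"a", "b"}}     # the domain grows along the order
B_true = {("w0", "a"), ("w1", "a")}          # B(b) never becomes true
C_true = {"w1"}                              # the atom C holds only at w1

def forces_antecedent(w):
    # w forces ∀x(B(x) ∨ C): at every later world, every individual is B or C
    return all((v, d) in B_true or v in C_true
               for v in succ[w] for d in domain[v])

def forces_consequent(w):
    # w forces ∀xB(x) ∨ C
    forall_B = all((v, d) in B_true for v in succ[w] for d in domain[v])
    return forall_B or w in C_true

print(forces_antecedent("w0"))   # True:  the antecedent is forced at the root
print(forces_consequent("w0"))   # False: the consequent is not, so the implication fails
```

The countermodel turns on the new individual b appearing only at w1, where the disjunction is rescued by C rather than by B.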
|
393 |
Khovanov Homology of Knots / Söderberg, Christoffer (January 2017)
No description available.
|
394 |
Approaches to procedural adequacy in logic programming using connection graphs / Moens, Theodore Warren Bernelot (January 1987)
Kowalski's connection graph method provides a representation for logic programs which allows for the incorporation of better procedural control techniques than standard logic programming languages. A proposed search strategy for visual recognition which combines top-down and bottom-up techniques has been incorporated in a connection graph implementation.
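The core of the connection graph idea can be sketched in a toy propositional form (the clause encoding and names below are invented for illustration and are not Kowalski's notation): links join complementary literals across clauses, and proof search reduces to selecting links rather than rescanning the clause set.

```python
# Hypothetical propositional sketch of a connection graph: each clause is a
# list of (atom, polarity) literals; a link records a resolvable pair.

clauses = [
    [("p", True)],                    # fact: p
    [("p", False), ("q", True)],      # rule: p -> q  (i.e. ¬p ∨ q)
    [("q", False)],                   # goal clause: ¬q
]

# Link every pair of complementary occurrences of the same atom.
links = []
for i, ci in enumerate(clauses):
    for j in range(i + 1, len(clauses)):
        for atom, pol in ci:
            for atom2, pol2 in clauses[j]:
                if atom == atom2 and pol != pol2:
                    links.append((i, j, atom))

print(links)  # [(0, 1, 'p'), (1, 2, 'q')] -- each link is a candidate resolution step
```

Because the links are computed once and updated incrementally as clauses are resolved, control strategies (top-down, bottom-up, or mixed, as in the recognition application described above) become choices about which link to activate next.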
The connection graph representation also allows for the natural incorporation of constraint satisfaction techniques in logic programming. Kowalski's approach to incorporating constraint satisfaction techniques in connection graphs is examined in detail. It is shown that his approach is not efficient enough to be used as a general preprocessing algorithm but that a modified version may be of use.
Increased control of search and the incorporation of consistency techniques increase the procedural adequacy of logic programs for representing knowledge without compromising the descriptive capacity of the form. / Science, Faculty of / Computer Science, Department of / Graduate
|
395 |
A default logic approach to the derivation of natural language presuppositions / Mercer, Robert Ernest (January 1987)
A hearer's interpretation of the meaning of an utterance consists of more than what is conveyed
by just the sentence itself. Other parts of the meaning are produced as inferences from three knowledge sources: the sentence itself, knowledge about the world, and knowledge about language use. One inference of this type is the natural language presupposition. This category of inference is distinguished by a number of features: the inferences are generated only, but not necessarily, if certain lexical or syntactic environments are present in the uttered sentence; normal interpretations of these presuppositional environments in the scope of a negation in a simple sentence produce the same inferences as the unnegated environment; and the inference can be cancelled by information in the conversational context.
We propose a method for deriving presuppositions of natural language sentences that has its foundations in an inference-based concept of meaning. Whereas standard (monotonic) forms of reasoning are able to capture portions of a sentence's meaning, such as its entailments, non-monotonic forms of reasoning are required to derive its presuppositions. Gazdar's idea of presuppositions being consistent with the context, and the usual connection of presuppositions with lexical and syntactic environments motivates the use of Default Logic as the formal nonmonotonic
reasoning system. Not only does the default logic approach provide a natural means to represent presuppositions, but also a single (slightly restricted) default proof procedure is all that is required to generate the presuppositions. The naturalness and simplicity of this method contrasts with the traditional projection methods. Also available to the logical approach is the proper treatment of 'or' and 'if ... then ...' which is not available to any of the projection methods.
The default logic approach is compared with four others, three projection methods and one non-projection method. As well as serving the function of demonstrating empirical and methodological difficulties with the other methods, the detailed investigation also provides the motivation for the topics discussed in connection with default logic approach. Some of the difficulties have been solved using the default logic method, while possible solutions for others have only been sketched.
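The cancellation behaviour described above is exactly what a normal default rule provides: the rule fires only when its conclusion is consistent with the context. A toy sketch (the predicate strings and the single-rule setting are invented here; Reiter-style default logic and the thesis's restricted proof procedure are far richer):

```python
# Hypothetical sketch of a normal default  trigger : presup / presup
# applied to a presuppositional environment, with consistency checked
# as the absence of the presupposition's negation from the context.

def apply_default(context, trigger, presup, negation):
    if trigger in context and negation not in context:
        return context | {presup}   # default fires: presupposition generated
    return context                  # blocked: trigger absent or cancelled

# "John stopped smoking" presupposes "John used to smoke".
ctx = {"stopped_smoking(john)"}
ctx = apply_default(ctx, "stopped_smoking(john)",
                    "used_to_smoke(john)", "not used_to_smoke(john)")
print("used_to_smoke(john)" in ctx)    # True: presupposition inferred

# Cancellation: the conversational context already denies the presupposition.
ctx2 = {"stopped_smoking(john)", "not used_to_smoke(john)"}
ctx2 = apply_default(ctx2, "stopped_smoking(john)",
                     "used_to_smoke(john)", "not used_to_smoke(john)")
print("used_to_smoke(john)" in ctx2)   # False: the default is blocked
```

The same consistency check is what lets negated environments yield the same inference as unnegated ones while still allowing contextual cancellation.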
A brief discussion of a new method for providing corrective answers to questions is presented.
The novelty of this method is that the corrective answers are viewed as correcting presuppositions of the answer rather than of the question. / Science, Faculty of / Computer Science, Department of / Graduate
|
396 |
Counterfactual thinking in the wake of trauma / Davis, Christopher G. (11 1900)
Counterfactuals generated by people who have experienced
traumatic life events were examined to elucidate their
significance for the coping process. In Study 1, 93 respondents
were interviewed 4-7 years after the loss of their spouse or
child in a motor vehicle accident. In Study 2, 124 respondents
were interviewed 3 weeks and 18 months following the death of
their child to Sudden Infant Death Syndrome. Across these two
studies it was found that (a) counterfactuals that undid the
traumatic event were commonly reported; (b) the focus of
counterfactuals was typically on one's own (in)actions, rather
than on the behavior of others; (c) the more frequently
respondents were undoing the event, the more distress they
reported; and (d) this relation held even after controlling for
more general ruminations. In Study 3, 106 respondents were
interviewed one week following their spinal cord injury. In this
study, self-implicating counterfactuals were shown to predict
ascriptions of self-blame, controlling for causal attributions
and foreseeability estimates. Taken together, these field data
suggest that counterfactuals play an important role in how people
cope with traumatic life events. Possible roles that these
counterfactual thoughts might play are discussed. / Arts, Faculty of / Psychology, Department of / Graduate
|
397 |
The effectiveness of logical reasoning on the solution of value problems / Schactman, Chuck Seymour (January 1976)
Certain values education programs have been recently developed which emphasize teaching students to gain ability in critical, deductive reasoning. The major contention of this paper was that this type of reasoning is not entirely adequate for the solution of certain value-loaded problems. In order to empirically test this hypothesis, a group of university students trained in formal logic was selected. Then three tests of logic were devised — one symbolic, one verbal and neutral, and the third verbal and value-loaded. On three different sessions these tests were administered so that each subject attempted each test. Every item across the three tests was exactly the same in terms of logical content. The results were then tabulated and the analyses performed. The results showed support for the major hypothesis, that subjects perform significantly differently on tests incorporating the same logic, but whose content differs. These results were then viewed in relation to values education programs stressing deductive reasoning and to the educational implications that may arise. Finally, it was concluded that if transfer of learning to real life situations is a goal of education, then the programs mentioned are insufficient for the realization of these goals, and that the inclusion of educational procedures in the affective and perceptual, as well as the cognitive domains, is necessary for the successful transfer of learned strategies to everyday life situations. / Arts, Faculty of / Psychology, Department of / Graduate
|
398 |
Systems of quantum logic / Hughes, Richard Ieuan Garth (January 1978)
According to quantum mechanics, the pure states of a microsystem
are represented by vectors in a Hilbert space. Sentences of the form "x ∈ L" (where x is the state vector for a system, L a subspace of the appropriate Hilbert space) may be called Q-propositions: such sentences serve to summarise our information about the results of possible experiments on the system. Quantum logic investigates the relations which hold among the Q-propositions about a given physical system.
These logical relations correspond to algebraic relations among the subspaces of Hilbert space. The algebra of this set of subspaces is non-Boolean, and may be regarded either as an orthomodular lattice or as a partial Boolean algebra. With each type of structure we can associate a logic.
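The non-Boolean character of the subspace lattice can be seen already in R², where the distributive law fails. A minimal sketch (not from the thesis; the three lines chosen are arbitrary, and meet dimensions are computed via the modular identity dim(A ∩ B) = dim A + dim B − dim(A + B), valid for vector subspaces):

```python
import numpy as np

# Subspaces are given by lists of spanning vectors; join = span of the union.

def dim(vectors):
    if not vectors:
        return 0
    return int(np.linalg.matrix_rank(np.array(vectors, dtype=float)))

def dim_join(a, b):
    return dim(a + b)

def dim_meet(a, b):
    return dim(a) + dim(b) - dim_join(a, b)

X = [[1.0, 0.0]]        # the x-axis
Y = [[0.0, 1.0]]        # the y-axis
Z = [[1.0, 1.0]]        # the diagonal y = x

# Z ∧ (X ∨ Y): X ∨ Y is all of R^2, so the meet is Z itself, dimension 1.
lhs = dim_meet(Z, X + Y)

# (Z ∧ X) ∨ (Z ∧ Y): both meets are the zero subspace, so the join is {0}.
assert dim_meet(Z, X) == 0 and dim_meet(Z, Y) == 0
rhs = 0

print(lhs, rhs)         # 1 0 -- distributivity fails, so the lattice is not Boolean
```

This failure of distributivity is precisely why the structure must be treated as an orthomodular lattice or a partial Boolean algebra rather than a Boolean one.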
A general approach to the semantics for such a logic is provided in terms of interpretations of a formal language within an algebraic structure; an interpretation maps sentences of the language homomorphically onto elements of the structure. When the structure in question is a Boolean algebra, the resulting logic is classical; here we develop a semantics for the logic associated with partial Boolean algebras.
Two systems of proof, based on the natural deduction systems of Gentzen, are given for this logic. With respect to the given semantics, these calculi are sound and weakly complete. Strong completeness is conjectured.
Quantum logic deals with the logical relations between sentences, and so is properly called a logic. However, it is the logic appropriate to a limited class of sentences: proposals that it should replace classical logic wherever the latter is used should be viewed with suspicion. / Arts, Faculty of / Philosophy, Department of / Graduate
|
399 |
A critical analysis of the thesis of the symmetry between explanation and prediction : including a case study of evolutionary theory / Lee, Robert Wai-Chung (January 1979)
One very significant characteristic of Hempel's covering-law models of scientific explanation, that is, the deductive-nomological model and the inductive-statistical model, is the supposed symmetry between explanation and prediction. In brief, the symmetry thesis asserts that explanation and prediction have the same logical structure; in other words, if an explanation of an event had been taken account of in time, then it could have served as a basis for predicting the event in question, and vice versa. The present thesis is a critical analysis of the validity of this purported symmetry between explanation and prediction.
The substance of the thesis begins with a defence against some common misconceptions of the symmetry thesis, for example, the idea that the symmetry concerns statements but not arguments. Specifically, Grünbaum's interpretation of the symmetry thesis as pertaining to the logical inferability rather than the epistemological symmetry between explanation and prediction is examined.
The first sub-thesis of the symmetry thesis, namely that "Every adequate explanation is a potential prediction," is then analyzed. Purported counterexamples such as evolutionary theory and the paresis case are critically
examined and consequently dismissed. Since there are conflicting views regarding the nature of explanation and prediction in evolutionary theory, a case study of the theory is also presented.
Next, the second sub-thesis of the symmetry thesis, namely that "Every adequate prediction is a potential explanation," is discussed. In particular, the barometer case is discharged as a counterexample to the second sub-thesis when the explanatory power of indicator laws is properly understood.
Finally, Salmon's current causal-relevance model of explanation, which claims to be an alternative to Hempel's inductive-statistical model, is critically analyzed. A modified inductive-statistical model of explanation is also proposed. This modified model retains the nomological ingredient of Hempel's original inductive-statistical model, but it is immune to criticisms raised against the latter.
In conclusion, I maintain that there is indeed a symmetry between explanation and prediction. But since deductive-nomological explanation and prediction are essentially different from inductive-statistical explanation and prediction, the form the symmetry takes between deductive-nomological explanation and prediction differs from the form it exhibits between inductive-statistical explanation and prediction. / Arts, Faculty of / Philosophy, Department of / Graduate
|
400 |
Thought experiments in ethics : a contextualist approach to the grounding problem / Harland, Anne (05 1900)
How can an experiment which occurs only in thought lead to new and accurate
conclusions about the world beyond thought? What makes thought experiments relevant
to the domains they are designed to explore?
One answer is that successful thought experiments are grounded. Explaining the
nature of this grounding relationship, especially as it applies to ethics, is the main task of
this dissertation.
A thought experiment is an experiment that occurs in thought. The "thought"
label distinguishes it from an ordinary physical experiment, while the "experiment" label
distinguishes it from other types of merely analogical, conjectural, or hypothetical
reasoning. Many of the components that are necessary for a successful physical
experiment are also necessary for a successful thought experiment. A thought
experiment, like a physical experiment, must isolate and vary variables in order to answer
a question within a given theoretical context. The result of the experiment has
repercussions for its theoretical context.
The grounding relationship holds between the components of the thought
experiment and the theoretical context of the thought experiment. In order for the
thought experiment to be successful, both the experimental set-up and our responses to it
need to be grounded in the thought experiment's theoretical context.
An experimental set-up will be grounded whenever it meets the following
conditions. The concepts used must be defined normally, dependent and independent
variables must be isolated and relevantly related, and the propositions of the thought
experiment (excepting those describing extraneous particulars) must be relevantly related
to the given theoretical context and the question under examination.
Grounding responses to thought experiments will then be largely a matter of
anticipating and disarming distorting influences. Factors influencing responses include
the individual's knowledge of the theoretical context, the state of development of that
context, the nature of the presentation of the thought experiment, and subjective filters.
It is sometimes difficult to ascertain whether a thought experiment in ethics is
grounded. This is largely due to the nature of the theoretical context of thought
experiments in ethics. In order to assess the relationship of thought experiments in ethics
to their theoretical context, I advocate employing a contextualist methodology involving
the process of wide reflective equilibrium. While contextualists use this approach to
arrive at considered judgements relating to specific ethical problems, I show that wide
reflective equilibrium can also be used to examine the grounding of thought experiments.
I conclude the dissertation with an examination of the relationship of thought
experiments to computer simulations, a study of various common thought experiment
distortions, and some tests and methods designed to aid constructing successful thought
experiments. / Arts, Faculty of / Philosophy, Department of / Graduate
|