201 |
Nonabsolute/relativistic (N/R) thinking: a possible unifying commonality underlying models of postformal reasoning
Yan, Bernice Lai-ting, 05 1900
This dissertation identified and addressed four of the unresolved issues pertaining
to the proposition that nonabsolute/relativistic (N/R) thinking is one of the possible
unifying commonalities underlying the selected models of postformal reasoning, namely
Problem Finding, Dialectical Reasoning, Relativistic Operations and Reflective
Judgment.
A total of 254 participants aged 10 to 48 and attending Grade 5 to doctoral studies
were involved. Each participant was administered eight tests in pencil-and-paper format
to measure eight different constructs of thinking. Different specific hypotheses were
evaluated through different statistical approaches.
The four identified issues were addressed as follows:
Firstly, nonabsolute/relativistic thinking was reconceptualized and operationally
defined as a multidimensional and multilevel construct. Two dimensions were proposed:
the basic form and the epistemic view. Within the basic form dimension, two levels were
proposed: the formal and the postformal forms.
Secondly, a battery of three tests was specifically designed by Arlin and the
author to measure the different dimensions and levels of nonabsolute/relativistic
thinking.
Thirdly, strong empirical evidence was obtained supporting the general
hypothesis that nonabsolute/relativistic thinking is a possible unifying commonality
underlying the four selected postformal models. Within the construct of nonabsolute/
relativistic thinking, two dimensions, the basic form and the epistemic view, can be
differentiated as hypothesized.
Fourthly, empirical evidence was also obtained supporting the general hypothesis
that nonabsolute/relativistic thinking is an instance of both formal and postformal
reasoning. Specifically within the basic form dimension, two qualitatively different
forms, the formal and the postformal, can be differentiated as hypothesized. Findings
also suggested that the development of a nonabsolute epistemic view might play a crucial
role in the development of the postformal form. Therefore, the emergence of the
postformal form can be explained by a paradigm shift from an absolute to a nonabsolute
epistemic view. Performances in the tests of the postformal form and of the epistemic
view in combination were found to be good predictors of performances in the selected
postformal tests.
Significant implications of the findings are that nonabsolute/relativistic thinking
represents a form of metamorphosis from closed-system to open-system thinking, and that it
might serve as a potential springboard in the development of higher-order thinking.
|
202 |
A schema-based model of adaptive problem solving
Turner, Roy Marvin, 12 1900
No description available.
|
203 |
Student designs of experiments as indicators of physics reasoning
Leesinsky, Peter, January 1991
The purpose of this study was the assessment of physics reasoning on the basis of students' understanding of motion on an inclined plane. Subjects were presented with a video tape showing a motion experiment in steps and were asked to formulate hypotheses and design an experiment to test them. Subjects thought aloud while specifying the designs and goals of an experiment. Protocols were analyzed by an original method using schema representation techniques. Adequacy of subjects' reasoning was evaluated by comparison to a composite model built from physics domain principles. As more information was presented to subjects, processing differences were observed. Using a hierarchy of processes from recognition to generation, five groups of subjects were defined. Subjects differed in recognition and inclusion processes, use of incoming information, ability to generate experimental designs, and responses to falsification. Concepts of average velocity and differences in directionality of reasoning were analyzed.
|
204 |
Contrasting associative and statistical theories of contingency judgments
Mehta, Rick R., January 2000
"Blocking" refers to judgments of a moderate contingency being lowered when contrasted with a strong contingency. The Rescorla-Wagner model and causal model theory account for blocking through different mechanisms. To examine the predictions from these two models, seven experiments tested the extent to which "causal scenario" and "causal order" would influence whether blocking was observed in human contingency learning tasks. "Causal scenario" was manipulated by contrasting responses to two causes of one effect or to one cause of two effects; "causal order" was defined as causes preceding effects or effects preceding causes. The four conjunctions of these two factors were investigated separately in Experiments 1 to 5. In Experiments 1 and 2, two causes preceded one effect and two effects preceded one cause, respectively. Blocking was observed regardless of whether the predictors were causes or effects. In Experiments 3, 4 and 5, participants were presented with one antecedent cue and made separate predictions about each of the trial's two outcomes. Blocking was not observed, irrespective of whether the antecedent cue was a cause or an effect. These initial results were consistent with the Rescorla-Wagner model. An alternative explanation was that blocking failed to occur in Experiments 3 to 5 because participants were asked questions between the predictor and two outcomes. Predicting the outcomes might have implicitly led participants to monitor them separately and to report on subsets of the data at the time of judgment. To address this issue, the volunteers in Experiment 6 observed the events on each trial but did not make any predictions about the outcomes. Blocking was observed, signifying that the intervening questions between the antecedent and consequent cues constitute an important variable influencing cue competition effects. In Experiment 7, all four conjunctions of causal scenario and causal order were tested simultaneously. Furthermore, participants w
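The Rescorla-Wagner account of blocking described above can be illustrated with a minimal simulation of the model's learning rule, in which all present cues share a prediction error computed from their summed associative strength. Parameter values, trial counts, and cue names here are illustrative assumptions, not details taken from the thesis.

```python
# A minimal sketch of the Rescorla-Wagner update,
# V <- V + alpha * (lambda - sum of V over present cues),
# used to show why a pretrained cue "blocks" a redundant one.

def rw_trial(V, cues, outcome, alpha=0.3, lam=1.0):
    """One learning trial: every present cue shares the prediction
    error based on the summed associative strength of all cues."""
    total = sum(V[c] for c in cues)
    error = (lam if outcome else 0.0) - total
    for c in cues:
        V[c] += alpha * error
    return V

V = {"A": 0.0, "B": 0.0}

# Phase 1: cue A alone is paired with the outcome (A+ trials).
for _ in range(50):
    rw_trial(V, ["A"], outcome=True)

# Phase 2: A and B together are paired with the outcome (AB+ trials).
for _ in range(50):
    rw_trial(V, ["A", "B"], outcome=True)

# A already predicts the outcome, so the error on AB+ trials is
# near zero and B acquires almost no strength -- blocking.
print(f"V(A) = {V['A']:.2f}, V(B) = {V['B']:.2f}")
```

Under this rule, removing the Phase 1 pretraining lets A and B split the associative strength evenly, which is one way to see why the model predicts blocking only when one cue is trained first.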
|
205 |
Predictive conditionals, nonmonotonicity and reasoning about the future
Bell, J., January 1988
No description available.
|
206 |
A study of the relationship between critical thinking ability and grades in public speaking classes
Friedley, Sheryl Ann, January 1972
Previous research in the field of speech has indicated that critical thinking can be improved through training in discussion, debate, argumentation, and the basic speech class. Critical thinking has also been related to fluency in extemporaneous speaking. The purpose of this study is to test the hypothesis that there is no significant difference between students' grades in Speech 210 and their scores on the Watson-Glaser Critical Thinking Appraisal. Can the Watson-Glaser test predict students' grades in a public speaking class? The hypothesis is also examined with respect to sex, class, selected majors, and delayed-acceptance versus regular students. The study employed seven statistical tests: the Kruskal-Wallis Rank Test of Significance, the Chi Square Test for Two Independent Samples, the Fisher Exact Probability Test, the Median Test, the T Test for Unmatched Pairs, and the Chi Square Test for "Goodness of Fit" with (a) expected values equal and (b) expected values unequal. The tests, programmed on a Monroe 1766 electronic calculator, were applied to the five null hypotheses. The tests indicate significant correlations in two areas: Watson-Glaser raw scores with respect to psychology majors and marketing majors, and Watson-Glaser raw scores with respect to delayed-acceptance and selected sample students. No statistically significant correlation was found between Watson-Glaser scores and final grades given in Speech 210 as a whole or by class, sex, and selected majors.
|
207 |
Individual differences in cognitive plasticity and variability as predictors of cognitive function in older adults
Grand, Jacob Harold Gross, 11 April 2012
Background: With the growth in elderly populations worldwide, there is a pressing need to characterize the changes in cognition and brain function across the adult lifespan. The evolution of cognitive abilities is no longer considered to reflect a universal, cumulative process of decline. Rather, significant inter- and intra-individual differences exist in cognitive trajectories, with the maintenance of functions ultimately determined by multi-dimensional biological and psychological processes. The current study examined the relationship between intra-individual variability, cognitive plasticity, and long-term cognitive function in older adults.
Methods: Data were analyzed from Project Mental Inconsistency in Normals & Dementia (MIND), a 6-year longitudinal burst-design study integrating micro-level weekly assessments (reaction time (RT) tasks) with macro-level annual evaluations (cognitive outcome measures). Participants included 304 community-dwelling adults, ranging in age from 64 to 92 years (M = 74.02, SD = 5.95). Hierarchical multiple regression models were developed to examine long-term cognitive function, along with multilevel modeling (HLM) techniques for the analysis of specific predictors of longitudinal rates of cognitive change.
Results: Baseline intraindividual variability (ISD) emerged as a robust and highly sensitive predictor, with increased variability associated with decreased long-term cognitive performance. Complex baseline cognitive plasticity (1-Back 4-Choice RT Task) uniquely predicted subsequent cognitive function for measures of processing speed, fluid reasoning, episodic memory, and crystallized verbal ability. Multilevel models revealed chronological age to be a significant predictor across cognitive domains, while intraindividual variability selectively predicted rates of change for performance on measures of episodic memory and crystallized verbal ability.
Conclusion: These findings underscore the potential utility of intraindividual variability and cognitive plasticity as dynamic predictors of longitudinal change in older adults.
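As a rough illustration of the intraindividual standard deviation (ISD) measure referred to above: the ISD is a person's trial-to-trial spread in reaction times within a burst of assessments. The RT values below are invented for illustration; the actual Project MIND computation involves additional steps such as partialling out group-level trends and aggregating across bursts.

```python
# Hedged sketch: two hypothetical participants with identical mean
# reaction time (ms) but very different trial-to-trial consistency.
import statistics

def isd(rts):
    """Intraindividual standard deviation across one burst of RT trials."""
    return statistics.stdev(rts)

steady = [520, 515, 525, 518, 522, 520]
variable = [430, 610, 480, 590, 450, 560]

# Same mean, so a mean-only analysis cannot distinguish them;
# the ISD captures the inconsistency the study found predictive.
print(f"means: {statistics.mean(steady)}, {statistics.mean(variable)}")
print(f"steady ISD = {isd(steady):.1f} ms, variable ISD = {isd(variable):.1f} ms")
```

The point of the sketch is only that ISD carries information orthogonal to the mean, which is why it can serve as a distinct predictor in the regression models described above.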
|
208 |
Legal knowledge-based systems: new directions in system design
Aikenhead, Michael, January 2001
This thesis examines and critiques the concept of 'legal knowledge-based' systems. Work on legal knowledge-based systems is dominated by work in 'artificial intelligence and law', which seeks to automate the application of law and the solution of legal problems. Automation, however, has proved elusive. In contrast, this thesis proposes the creation of legal knowledge-based systems based on the concept of augmentation of legal work. Focusing on systems that augment legal work opens new possibilities for system creation and use. To inform how systems might augment legal work, this thesis examines philosophy, psychology and legal theory for the information they provide on how processes of legal reasoning operate. It is argued that, in contrast to conceptions of law adopted in artificial intelligence and law, 'sensemaking' provides a useful perspective from which to create systems. It is argued that visualisation, and particularly diagramming, is an important and under-considered element of reasoning, and that systems supporting the diagramming of processes of legal reasoning would provide useful support for legal work. This thesis reviews techniques for diagramming aspects of sensemaking, in particular standard methods for diagramming arguments and for diagramming reasoning. These techniques are applied in the diagramming of legal judgments. A review is conducted of systems built to support the construction of diagrams of argument and reasoning. Drawing upon these examinations, this thesis highlights the necessity of appropriate representations for supporting reasoning. The literature on diagramming for reasoning support provides little discussion of appropriate representations, so this thesis examines theories of representation for the insight they can provide into the design of such representations.
It is concluded that while the theories of representation examined do not determine what amounts to a good representation, guidelines for the design and choice of representations can be distilled. These guidelines cannot map the class of legal knowledge-based systems that augment legal sensemaking; they can, however, be used to explore this class and to inform the construction of systems.
|
209 |
Integrating Probabilistic Reasoning with Constraint Satisfaction
Hsu, Eric, 09 June 2011
We hypothesize and confirm that probabilistic reasoning is closely related to constraint satisfaction at a formal level, and that this relationship yields effective algorithms for guiding constraint satisfaction and constraint optimization solvers.
By taking a unified view of probabilistic inference and constraint reasoning in terms of graphical models, we first associate a number of formalisms and techniques between the two areas. For instance, we characterize search and inference in constraint reasoning as summation and multiplication (or disjunction and conjunction) in the probabilistic space; necessary but insufficient consistency conditions for solutions to constraint problems (like arc-consistency) mirror approximate objective functions over probability distributions (like the Bethe free energy); and the polytope of feasible points for marginal probabilities represents the linear relaxation of a particular constraint satisfaction problem.
While such insights synthesize an assortment of existing formalisms from varied research communities, they also yield an entirely novel set of “bias estimation” techniques that contribute to a growing body of research on applying probabilistic methods to constraint problems. In practical terms, these techniques estimate the percentage of solutions to a constraint satisfaction or optimization problem wherein a given variable is assigned a given value. By devising search methods that incorporate such information as heuristic guidance for variable and value ordering, we are able to outperform existing solvers on problems of interest from constraint satisfaction and constraint optimization, as represented here by the SAT and MaxSAT problems.
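The bias-estimation idea in the preceding paragraph can be sketched concretely: for each variable, estimate the fraction of satisfying assignments in which it takes a given value, then branch first on the most biased variable. The brute-force enumeration below stands in for the thesis's message-passing estimators, and the toy CNF formula and function names are illustrative assumptions.

```python
# Hedged sketch: exact variable biases over the solutions of a tiny
# CNF formula, used as a variable-ordering heuristic.
from itertools import product

def satisfies(clauses, assign):
    # A clause is a list of signed ints (DIMACS-style literals):
    # literal k means variable k is True, -k means it is False.
    return all(any((lit > 0) == assign[abs(lit)] for lit in clause)
               for clause in clauses)

def biases(clauses, n_vars):
    """Fraction of satisfying assignments in which each variable is True."""
    counts = {v: 0 for v in range(1, n_vars + 1)}
    total = 0
    for bits in product([False, True], repeat=n_vars):
        assign = dict(enumerate(bits, start=1))
        if satisfies(clauses, assign):
            total += 1
            for v in counts:
                counts[v] += assign[v]
    return {v: counts[v] / total for v in counts}

# (x1 or x2) and (not x1 or x3) and (x2 or not x3)
cnf = [[1, 2], [-1, 3], [2, -3]]
b = biases(cnf, 3)

# Branch first on the variable whose bias is furthest from 0.5:
# a strongly biased variable is "nearly decided" by the solution set.
pick = max(b, key=lambda v: abs(b[v] - 0.5))
print(b, f"branch on x{pick}")
```

In a real solver the exponential enumeration is replaced by approximate probabilistic inference, which is precisely the connection between the two fields that the thesis develops.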
Further, for MaxSAT we present an “equivalent transformation” process that normalizes the weights in constraint optimization problems, in order to encourage prunings of the search tree during branch-and-bound search. To control such computationally expensive processes, we determine promising situations for using them throughout the course of an individual search process. We accomplish this using a reinforcement learning-based control module that seeks a principled balance between the exploration of new strategies and the exploitation of existing experiences.
|
210 |
Clinical case similarity and diagnostic reasoning in medicine
Arocha, José F. (José Francisco), January 1991
This thesis describes a study of novice problem solving in the domain of medicine. The study attempts to answer questions pertaining to diagnostic accuracy, the generation and change of diagnostic hypotheses, and the use of clinical findings in the course of solving clinical cases with similar presenting complaints. Two specific issues are addressed: (1) how does an initial case presentation suggesting a common disease schema affect the diagnostic problem-solving process of novice and intermediate subjects? (2) what processes do subjects use in coordinating hypothesis and evidence during diagnostic problem solving?
Medical trainees (students and a resident) were given four clinical cases to solve, and think-aloud protocols were collected. The verbal protocols were analyzed using methods of protocol analysis. The results show that second-year medical students interpreted clinical cases in terms of the more common disease schema, regardless of the initial presentation of the case. More advanced students, although unable to make a correct diagnosis in most instances, were less susceptible to such confusions. Only the resident was able to interpret the cases in terms of different disease schemata, reflecting knowledge of the underlying disease process. The semantic analysis of the protocols revealed that most students, especially at lower levels of training, misinterpreted or ignored evidence that contradicted their initial hypotheses and used a mixture of forward and backward reasoning, a finding consistent with previous research. Implications for educational training and for a theory of novice problem solving in medicine are presented.
|