21

Counterfactual thinking and cognitive consistency

Uldall, Brian Robert 02 December 2005 (has links)
No description available.
22

Exploring the determinants of dual goal facilitation in Wason's 2-4-6 task

Gale, Maggie January 2008 (has links)
The standard paradigm for exploring hypothesis testing behaviour is Wason's (1960) rule discovery task, which exists in two variants: the standard single goal (SG) task, and the logically identical dual goal (DG) form. Despite the close similarity of the two forms of the task, the reported success rates in the two variants vary considerably, with approximately 20% of participants successfully solving the SG variant compared to over 60% correctly announcing the rule in the DG form. It was this disparity between the patterns of performance across the two versions of the task which formed the impetus for this thesis, as it was felt that an explanation for the facilitatory effect of DG instructions would lead to insights into the poor performance in the SG form. Several competing contemporary accounts of the effect are introduced, and predictions derived from them empirically tested across a series of seven experiments. Data analyses showed that no single contemporary theory could provide a wholly adequate account of the DG facilitation effect. However, these analyses led to a novel observation: that it is the production of a contrast class triple which appears to be the key predictor of success on the task, and furthermore, that the DG variant of the task promotes the generation of such a triple. Support for the "contrast class" account of the DG effect was provided by direct manipulation of the information provided to participants. A theoretical account of the critical role of contrast class cue information is developed in the thesis by situating the account within a proposed extension to Oaksford and Chater's (1994) "Iterative Counterfactual Model" of hypothesis testing. It is further suggested that rather than providing mutually exclusive accounts of the DG effect, competing theories (e.g., Vallée-Tourangeau et al.'s, 1995, triple heterogeneity theory, and Wharton et al.'s, 1993, information quantity theory) could be subsumed within this new model, which would then reflect a process whereby participants' strategies change and develop over the course of the hypothesis testing session. Finally, it is suggested that findings from this thesis can be accommodated more generally within Evans' (2006) "hypothetical thinking framework", and thereby within contemporary dual process accounts of reasoning.
23

Treatment heterogeneity and potential outcomes in linear mixed effects models

Richardson, Troy E. January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Gary L. Gadbury / Studies commonly focus on estimating a mean treatment effect in a population. However, in some applications the variability of treatment effects across individual units may help to characterize the overall effect of a treatment across the population. Consider a set of treatments, {T, C}, where T denotes some treatment that might be applied to an experimental unit and C denotes a control. For each of N experimental units, the pair {r_Ti, r_Ci}, i = 1, 2, …, N, represents the potential response of the i-th experimental unit if treatment were applied and the response of the same unit if control were applied, respectively. The causal effect of T compared to C is the difference between the two potential responses, r_Ti − r_Ci. Much work has been done to elucidate the statistical properties of a causal effect, given a set of particular assumptions. Gadbury and others have reported on this for some simple designs, focusing primarily on finite population randomization based inference. When designs become more complicated, the randomization based approach becomes increasingly difficult. Since linear mixed effects models are particularly useful for modeling data from complex designs, their role in modeling treatment heterogeneity is investigated. It is shown that an individual treatment effect can be conceptualized as a linear combination of fixed treatment effects and random effects. The random effects are assumed to have variance components specified in a mixed effects "potential outcomes" model in which both potential outcomes, r_T and r_C, are variables. The variance of the individual causal effect is used to quantify treatment heterogeneity. Post treatment assignment, however, only one of the two potential outcomes is observable for a unit. It is then shown that the variance component for treatment heterogeneity becomes non-estimable in an analysis of observed data. Furthermore, estimable variance components in the observed data model are demonstrated to arise from linear combinations of the non-estimable variance components in the potential outcomes model. Mixed effects models are considered in the context of a particular design in an effort to illuminate the loss of information incurred when moving from a potential outcomes framework to an observed data analysis.
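The identification problem described in this entry can be illustrated with a short simulation (a sketch with illustrative numbers and names, not taken from the thesis): the mean causal effect is estimable from observed group means, but Var(r_T − r_C) = Var(r_T) + Var(r_C) − 2 Cov(r_T, r_C) depends on a covariance that observed data never reveal, since each unit shows only one potential outcome.

```python
import numpy as np

# Illustrative sketch only: the numbers and names here are ours.
# Each unit has two potential outcomes (r_T, r_C) with an unobservable
# correlation rho between them.
rng = np.random.default_rng(0)

def simulate(rho, n=100_000):
    cov = [[1.0, rho], [rho, 1.0]]
    rT, rC = rng.multivariate_normal([2.0, 0.0], cov, size=n).T
    return rT, rC, rT - rC            # delta_i: individual causal effects

for rho in (-0.5, 0.0, 0.9):
    rT, rC, delta = simulate(rho)
    # The mean effect E[delta] = 2 is estimable from observed group means,
    # but Var(delta) = Var(rT) + Var(rC) - 2*Cov(rT, rC) = 2 - 2*rho
    # depends on Cov(rT, rC), which the observed data never identify.
    print(f"rho={rho:+.1f}  mean(delta)={delta.mean():.2f}  var(delta)={delta.var():.2f}")
```

Every choice of rho yields the same observable marginal distributions, yet a different treatment-heterogeneity variance, which is the non-estimability the thesis formalizes in the mixed effects setting.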
24

Pattern recognition in astrophysics and the anthropic principle

Darg, Daniel W. January 2012 (has links)
The role of the Anthropic Principle in astrophysics and cosmology is examined in two principal parts. The first (minor) part takes a chiefly philosophical perspective and examines the manner in which human cognition figures in discussions on cosmic origins. It is shown that the philosophical questions raised by the Anthropic Principle and 'fine-tuning for life' bear resemblances to problems within the philosophy of mind, and we seek a common origin for this surprising parallel. A form of 'epistemic structural realism' is defended and used to critique the physicalist identity thesis. It is argued that equating 'reality' with mathematical structures, which is the basis of the identity thesis, leads to incoherent conclusions. Similar reasoning is used to critique infinite Multiverse theories. In the second (major) part, we gradually transition into mainstream astrophysics, first presenting a new line of research to explore counterfactual universes using semi-analytic models (SAMs) and offering a preliminary study wherein the cosmological constant is varied and the effects on 'advanced civilisations' are examined. The importance of galaxy mergers is highlighted and leads to their study. We first try solving the pattern-recognition problem of locating mergers using the Galaxy Zoo database and produce the largest homogeneous merger catalogue to date. We examine their properties and compare them with the SAMs of the Millennium Simulation, finding good general agreement. We develop the Galaxy Zoo approach with a new visual-interface design and double the size of the catalogue of SDSS mergers in the local Universe.
25

Lewis’ Theory of Counterfactuals and Essentialism

Lippiatt, Ian 12 1900 (has links)
La logique contemporaine a connu de nombreux développements au cours de la seconde moitié du siècle dernier. Le plus sensationnel est celui de la logique modale et de sa sémantique des mondes possibles (SMP) dû à Saul Kripke dans les années soixante. C'est dans ce cadre que David Lewis exposera sa sémantique des contrefactuels (SCF). Celle-ci constitue une véritable excroissance de l'architecture kripkéenne. Mais sur quoi repose finalement l'architecture kripkéenne elle-même ? Il semble bien que la réponse soit celle d'une ontologie raffinée, ultimement basée sur la notion de mondes possibles. Ce mémoire comporte trois objectifs. Dans un premier temps, nous allons étudier ce qui distingue les contrefactuels des autres conditionnels et faire un survol historique de la littérature concernant les contrefactuels et leur application dans différents champs du savoir comme la philosophie des sciences et l'informatique. Dans un deuxième temps, nous ferons un exposé systématique de la théorie de Lewis telle qu'elle est exposée dans son ouvrage Counterfactuals. Finalement, nous allons explorer la fondation métaphysique des mondes possibles de David Lewis dans sa conception du Réalisme Modal. / Modern logic has undergone many developments since the end of the Second World War. Two of the most interesting of these are Kripkean Possible Worlds Semantics and Lewis' system of counterfactuals. The first was developed by Saul Kripke in the 1960s and the second by David Lewis in the 1970s. In some sense, Lewis' system of counterfactuals, or Counterfactual Semantics (CFS), is built on top of the architecture which Kripke created with his Possible Worlds Semantics (PWS). But what is Kripkean Possible Worlds Semantics itself built on? The answer, it seems, is a very finely tuned ontology founded on the notion of possible worlds. This paper will attempt to do the following. First, it will draw a distinction between conditionals on the one hand and counterfactuals on the other, while surveying some of the historical literature on counterfactuals and their application in various fields such as the philosophy of science. Second, it will recapitulate Lewis' system of counterfactual semantics as developed primarily in his book Counterfactuals. Finally, it will explore the metaphysical foundations of the possible worlds account argued for by David Lewis in his conception of Modal Realism.
26

Análise experimental sobre o julgamento da relevância do valor justo em ativos biológicos / Experimental analysis on judgment of the relevance of fair value of biological assets.

Silva, José Marcos da 21 November 2013 (has links)
A utilização do valor justo na avaliação de ativos biológicos, decorrente da adoção de padrões internacionais de contabilidade, tem provocado efeitos econômicos significativos sobre o valor das empresas e, consequentemente, sobre os seus resultados correntes e futuros. Dessa maneira, este trabalho tem como objetivo analisar se os usuários da informação contábil reconhecem a relevância do uso do valor justo na mensuração de ativos biológicos. Por meio de experimentos com alunos de MBA, sob a perspectiva da Teoria do Pensamento Contrafactual, foram considerados os seguintes estímulos sobre o julgamento da relevância do uso do valor justo para ativos biológicos: (i) se o resultado (perdas ou ganhos) decorrente da avaliação a valor justo, (ii) se o tipo de ativo biológico (com ou sem liquidez) e (iii) se a decisão gerencial (manter o ativo até o vencimento ou disponibilizá-lo para a venda) interferem no julgamento da relevância do uso do valor justo. Os resultados apontam que, mesmo com a presença dessas variáveis, o uso do valor justo é relevante para a mensuração dos ativos biológicos. / The use of fair value in the measurement of biological assets, resulting from the adoption of international accounting standards, has had significant economic effects on the value of firms and hence on their current and future results. Thus, this study aims to examine whether users of accounting information recognize the relevance of the use of fair value in the measurement of biological assets. Through experiments with MBA students, conducted from the perspective of the Theory of Counterfactual Thinking, we considered the following stimuli on the judgment of the relevance of the use of fair value for biological assets: (i) whether the result (gain or loss) arising from measurement at fair value, (ii) whether the type of biological asset (with or without liquidity) and (iii) whether the management decision (holding the asset to maturity or making it available for sale) interfere with the judgment of the relevance of the use of fair value. The results show that, even in the presence of these variables, the use of fair value is relevant to the measurement of biological assets.
27

Conceivability and Possibility : Counterfactual Conditionals as Modal Knowledge?

Holmlund, Erik January 2019 (has links)
Hur har vi kunskap om vad som är möjligt? Enligt vad som kan betraktas som det traditionella svaret på den frågan har vi kunskap om modalitet via föreställningsbarhet. Vi föreställer oss ting och tar sedan detta som bevis för möjlighet. Denna uppsats kommer att undersöka tre invändningar mot detta svar angående hur vi har kunskap om möjlighet. Vi kommer sedan att överväga Williamsons förmodan: att vår kognitiva kapacitet för att hantera kontrafaktiska konditionaler bär med sig den kognitiva kapaciteten för oss att även hantera metafysisk modalitet (2007, 136), och undersöka om denna förmodan undviker dessa invändningar. Det kommer här att argumenteras att Williamsons förmodan undviker två av invändningarna men inte tycks kunna svara på den sista. Det kommer även att argumenteras att en invändning mot Williamsons förmodan ser ut att vara särskilt problematisk, och att det inte är klart att Williamsons förmodan är i någon bättre position än den negativa föreställningsbarhetsvyn. / How do we have knowledge of what is possible? On what could be considered the traditional response to this question, we have knowledge of modality by conceivability: we conceive of things and on that basis take this as evidence for possibility. This thesis will consider three objections to this account of how we have knowledge of possibility. We will then consider Williamson's conjecture: that our cognitive capacity to handle counterfactual conditionals carries with it the cognitive capacity to also handle metaphysical modality (2007, 136), and see if this conjecture avoids these objections. It will be argued that Williamson's conjecture avoids two of the objections but does not seem to have a response to the last. It will also be argued that one objection to Williamson's conjecture seems particularly problematic, and that it is not clear that Williamson's conjecture is any better off than the negative conceivability view.
28

Using counterfactual regret minimization to create a competitive multiplayer poker agent

Abou Risk, Nicholas 11 1900 (has links)
Games have been used to evaluate and advance techniques in the field of Artificial Intelligence since before computers were invented. Many of these games have been deterministic perfect information games (e.g. Chess and Checkers). A deterministic game has no chance element, and in a perfect information game all information is visible to all players. However, many real-world scenarios involving competing agents can be more accurately modeled as stochastic (non-deterministic), imperfect information games, and this dissertation investigates such games. Poker is one such game, played by millions of people around the world; it will be used as the testbed of the research presented in this dissertation. For a specific set of games, two-player zero-sum perfect recall games, a recent technique called Counterfactual Regret Minimization (CFR) computes strategies that are provably convergent to an ε-Nash equilibrium. A Nash equilibrium strategy is very useful in two-player games as it maximizes its utility against a worst-case opponent. However, once we move to multiplayer games, we lose all theoretical guarantees for CFR. Furthermore, we have no theoretical guarantees about the performance of a strategy from a multiplayer Nash equilibrium against two arbitrary opponents. Despite the lack of theoretical guarantees, my thesis is that CFR-generated agents may perform well in multiplayer games. I created several 3-player limit Texas Hold'em Poker agents, and the results of the 2009 Computer Poker Competition demonstrate that these are the strongest 3-player computer Poker agents in the world. I also contend that a good strategy can be obtained by grafting a set of two-player subgame strategies onto a 3-player base strategy when one of the players is eliminated.
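The regret-matching update at the core of CFR can be sketched on rock-paper-scissors, the simplest zero-sum simultaneous move game. This is an illustrative self-play sketch with names of our own choosing, not code from the dissertation: each player raises the probability of actions it regrets not having played, and the average strategies drift toward the uniform Nash equilibrium (1/3, 1/3, 1/3).

```python
import numpy as np

# Row player's payoff for rock, paper, scissors; the game is antisymmetric
# (A.T == -A), so the same matrix gives both players' counterfactual values.
A = np.array([[ 0., -1.,  1.],
              [ 1.,  0., -1.],
              [-1.,  1.,  0.]])

rng = np.random.default_rng(1)

def current_strategy(regret):
    # Play in proportion to positive regret; uniform when no regret yet.
    pos = np.maximum(regret, 0.0)
    return pos / pos.sum() if pos.sum() > 0 else np.full(3, 1 / 3)

def train(iters=20_000):
    regrets = [np.zeros(3), np.zeros(3)]
    strat_sums = [np.zeros(3), np.zeros(3)]
    for _ in range(iters):
        strats = [current_strategy(r) for r in regrets]
        acts = [rng.choice(3, p=s) for s in strats]
        for p in (0, 1):
            vals = A[:, acts[1 - p]]             # each action's payoff vs. opponent's play
            regrets[p] += vals - vals[acts[p]]   # regret of not having played each action
            strat_sums[p] += strats[p]
    # The *average* strategy, not the last iterate, converges toward Nash.
    return [s / s.sum() for s in strat_sums]
```

Run `train()` and both averaged strategies come out close to uniform; in full CFR this same update is applied at every information set of an extensive-form game such as poker.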
29

Simultaneous Move Games in General Game Playing

Shafiei Khadem, Mohammad 06 1900 (has links)
General Game Playing (GGP) deals with the design of players that are able to play any discrete, deterministic, complete information game. For many games, like chess, designers develop a player using a specially designed algorithm and tune all the features of the algorithm to play the game as well as possible. However, a general game player knows nothing about the game that is about to be played. When the game begins, the game description is given to the players, who must analyze it and decide on the best way to play. In this thesis, we focus on two-player constant-sum simultaneous move games in GGP and how this class of games can be handled. Rock-paper-scissors can be considered a typical example of a simultaneous move game. We introduce the CFR algorithm to the GGP community for the first time and show its effectiveness in playing simultaneous move games. This is the first implementation of CFR outside the poker world. We also improve the UCT algorithm, which is the state of the art in GGP, to be more robust in simultaneous move games. In addition, we analyze how UCT performs in simultaneous move games and argue that it does not converge to a Nash equilibrium. We also compare the usage of UCT and CFR in this class of games. Finally, we discuss the importance of opponent modeling and how a model of the opponent can be exploited using CFR.
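For contrast with CFR, the UCT baseline discussed in this entry selects actions with the UCB1 rule: the child with the highest mean value plus an exploration bonus is visited next. A minimal, hypothetical sketch (function and parameter names are ours, not from the thesis):

```python
import math

def ucb1_select(children, C=1.4):
    """Pick the index of the child to explore next.

    children: list of (total_value, visit_count) pairs for one node's children.
    C: exploration constant trading off exploitation vs. exploration.
    """
    total = sum(n for _, n in children)
    best, best_score = 0, float("-inf")
    for i, (w, n) in enumerate(children):
        if n == 0:
            return i                          # expand unvisited children first
        # Mean value plus exploration bonus that shrinks as a child is revisited.
        score = w / n + C * math.sqrt(math.log(total) / n)
        if score > best_score:
            best, best_score = i, score
    return best
```

Because this rule deterministically favors the currently best-looking action, plain UCT tends toward near-pure play, which in a simultaneous move game like rock-paper-scissors is exploitable; this is the intuition behind the entry's claim that UCT need not converge to a Nash equilibrium, whereas CFR's averaged mixed strategies do in the two-player zero-sum case.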
30

Goal-Directed Simulation of Past and Future Events: Cognitive and Neuroimaging Approaches

Gerlach, Katrin Daniela 07 June 2014 (has links)
Goal-directed episodic simulation, the imaginative construction of a hypothetical personal event or series of events focused on a specific goal, is essential to our everyday lives. We often imagine how we could solve a problem or achieve a goal in the future, or how we could have avoided a misstep in the past, but many of the behavioral and neural mechanisms underlying such goal-directed simulations have yet to be explored. The three papers of this dissertation investigated the neural correlates of three types of future episodic simulations in Papers 1 and 2 and examined a fourth such simulation directed at past events as an adaptive, constructive process in Paper 3. Some research has associated default network activity with internally-focused, but not with goal-directed cognition. Papers 1 and 2 of this dissertation showed that regions of the default network could form functional networks with regions of the frontoparietal control network while participants imagined solving specific problems or going through a sequence of steps necessary to achieve a personal goal. When participants imagined events they associated with actually attaining a goal, default network regions flexibly coupled with reward-processing regions, providing evidence that the default network can join forces with other networks or components thereof to support goal-directed episodic simulations. Using two distinct paradigms with both young and older adults, Paper 3 focused on episodic counterfactual simulations of how past events could have turned out differently and tested whether counterfactual simulations could affect participants' memory of the original events. Our results revealed that episodic counterfactual simulations can act as a type of internally generated misinformation by causing source confusion between the original event and the imagined counterfactual outcome, especially in older adults. 
The findings of the three papers in this dissertation lay the groundwork for further research on the behavioral and neural mechanisms of goal-directed episodic simulations, as well as their adaptive functions and possible downsides. / Psychology
