1 | Computation of context as a cognitive tool
Sanscartier, Manon Johanne, 09 November 2006
In the field of cognitive science, as well as in Artificial Intelligence (AI), the role of context has been investigated in many forms and for many purposes. It is clear in both areas that consideration of contextual information is important. However, the significance of context has not been emphasized in the Bayesian networks literature. We suggest that consideration of context is necessary for acquiring knowledge about a situation and for refining current representational models that are potentially erroneous due to hidden independencies in the data.

In this thesis, we make several contributions towards the automation of contextual consideration by discovering useful contexts from probability distributions. We show how context-specific independencies in Bayesian networks, and discovery algorithms traditionally used for efficient probabilistic inference, can contribute to the identification of contexts and in turn provide insight into otherwise puzzling situations. Consideration of context can also help clarify counterintuitive puzzles, such as instances of Simpson's paradox. In the social sciences, the branch of attribution theory is context-sensitive. We suggest a method for distinguishing between dispositional causes and situational factors by means of contextual models. Finally, we address the work of Cheng and Novick on causal attribution by human adults. Their probabilistic contrast model makes use of contextual information, called focal sets, that must be determined by a human expert. We suggest a method for discovering complete focal sets from probability distributions, without the human expert.
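The reversal that context can expose is easy to see numerically. The sketch below uses hypothetical counts in the spirit of the classic kidney-stone example (the numbers are not from the thesis): treatment A wins within every context, yet loses once the contexts are pooled, which is exactly the kind of puzzle that surfaces in instances of Simpson's paradox.

```python
# Hypothetical counts illustrating Simpson's paradox.
# context -> {treatment: (successes, trials)}
data = {
    "small": {"A": (81, 87),   "B": (234, 270)},
    "large": {"A": (192, 263), "B": (55, 80)},
}

def rate(successes, trials):
    """Empirical success rate."""
    return successes / trials

# Within each context, A outperforms B ...
for ctx, arms in data.items():
    assert rate(*arms["A"]) > rate(*arms["B"])

# ... yet pooling over contexts reverses the comparison:
# A: 273/350 ~ 0.78 vs. B: 289/350 ~ 0.83.
total = {
    t: [sum(x) for x in zip(*(data[c][t] for c in data))]
    for t in ("A", "B")
}
assert rate(*total["A"]) < rate(*total["B"])
```

Conditioning on the context variable (stone size here) dissolves the apparent contradiction, which is the role contextual models play in the thesis.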
|
3 | Surface Realization Using a Featurized Syntactic Statistical Language Model
Packer, Thomas L., 13 March 2006
An important challenge in natural language surface realization is the generation of grammatical sentences from incomplete sentence plans. Realization can be broken into a two-stage process consisting of an over-generating rule-based module followed by a ranker that outputs the most probable candidate sentence according to a statistical language model. Thus far, an n-gram language model has been evaluated in this context. More sophisticated syntactic knowledge is expected to improve such a ranker. In this thesis, a new language model based on featurized functional dependency syntax was developed and evaluated. Generation accuracies and cross-entropy for the new language model did not surpass those of the comparison bigram language model.
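The two-stage pipeline described above can be sketched in miniature. The toy corpus, the add-one smoothing, and the candidate strings below are illustrative assumptions, not the thesis's actual model or data; the point is only the mechanism of scoring over-generated candidates with a bigram model and keeping the most probable one.

```python
import math
from collections import Counter

# Tiny hypothetical training corpus for the bigram model.
corpus = "the dog barked . the cat slept . a dog slept .".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
vocab = len(unigrams)

def log_prob(sentence):
    """Add-one-smoothed bigram log-probability of a whitespace-tokenized string."""
    tokens = sentence.split()
    return sum(
        math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab))
        for a, b in zip(tokens, tokens[1:])
    )

# The rule-based realizer over-generates; the ranker keeps the best-scoring string.
candidates = ["the dog slept", "dog the slept", "slept the dog"]
best = max(candidates, key=log_prob)
```

Here the grammatical candidate receives the highest score because its bigrams were attested in the corpus; a syntactic language model would replace `log_prob` while the ranking step stays the same.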
|