
Commonsense Knowledge Representation and Reasoning in Statistical Script Learning

<p>A recent surge of research on commonsense knowledge has given the AI community new
opportunities and challenges. Many studies focus on constructing commonsense knowledge
representations from natural language data. However, how to learn such representations from
large-scale text data is still an open question. This thesis addresses the problem through
statistical script learning, which learns event representations from stereotypical event relationships using weak supervision. These event representations serve as an abundant source
of commonsense knowledge to be applied in downstream language tasks. We propose three
script learning models that generalize previous works with new insight. A feature-enriched
model characterizes fine-grained and entity-based event properties to address specific semantics. A multi-relational model generalizes traditional script learning models which rely on
one type of event relationship—co-occurrence—to a multi-relational model that considers
typed event relationships, going beyond simple event similarities. A narrative graph model
leverages a narrative graph to inform an event with a grounded situation to maintain a
global consistency of event states. Also, pretrained language models such as BERT are used
to further improve event semantics.</p><p>Our three script learning models do not rely on annotated datasets, as the cost of creating
these at large scales is unreasonable. Based on weak supervision, we extract events from
large collections of textual data. Although noisy, the learned event representations carry
profound commonsense information, enhancing performance in downstream language tasks.</p>
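To make the weak-supervision setup concrete, the sketch below shows one common way to harvest event tuples and co-occurrence pairs from raw text. The use of spaCy, the function names, and the same-subject heuristic for entity sharing are illustrative assumptions, not the thesis's actual extraction pipeline.

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # parser choice is illustrative; any dependency parser works

def extract_events(text: str):
    """Harvest simple (subject, verb, object) event tuples from one document."""
    events = []
    for tok in nlp(text):
        if tok.pos_ == "VERB":
            subj = next((c.lemma_ for c in tok.children
                         if c.dep_ in ("nsubj", "nsubjpass")), None)
            obj = next((c.lemma_ for c in tok.children
                        if c.dep_ in ("dobj", "obj")), None)
            if subj is not None:
                events.append((subj, tok.lemma_, obj))
    return events

def cooccurrence_pairs(events):
    """Weak supervision: pair events that share a protagonist (crudely
    approximated here by the same subject lemma), with no human annotation."""
    return [(e1, e2)
            for i, e1 in enumerate(events)
            for e2 in events[i + 1:]
            if e1[0] == e2[0]]

events = extract_events("The jury heard the case. The jury returned a verdict.")
print(cooccurrence_pairs(events))
# expected: [(('jury', 'hear', 'case'), ('jury', 'return', 'verdict'))]
```

Pairs like these serve as free training signal: no annotator labels them, yet they encode the stereotypical "what tends to follow what" knowledge that script learning models consume.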
<p>We evaluate their performance with various intrinsic and extrinsic evaluations. In the
intrinsic evaluations, although the three models are evaluated in terms of various aspects,
the shared core task is Multiple Choice Narrative Cloze (MCNC), which measures the
model’s ability to predict what happens next, out of five candidate events, in a given situation. This task facilitates fair comparisons between script learning models for commonsense
inference. The three models were proposed in three consecutive years, from 2018 to 2020,
each outperforming the previous year’s model as well as the competitors’ baselines. Our
best model outperforms EventComp, a widely recognized baseline, by a large margin in
MCNC: i.e., absolute accuracy improvements of 9.73% (53.86% → 63.59%). In the extrinsic evaluations, we use our models for implicit discourse sense classification (IDSC), a challenging task in which two argument spans are annotated with an implicit discourse sense; the
task is to predict the sense type, which requires a deep understanding of common sense between discourse arguments. Moreover, in an additional work we touch on a more interesting
group of tasks about psychological commonsense reasoning. Solving these requires reasoning
about and understanding human mental states such as motivation, emotion, and desire. Our
best model, an enhancement of the narrative graph model, combines the advantages of the
above three works to address entity-based features, typed event relationships, and grounded
context in one model. The model successfully captures the context in which events appear
and interactions between characters’ mental states, outperforming previous works.</p>
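To make the MCNC protocol concrete, the following sketch shows the evaluation loop under an assumed scoring interface. Here `toy_score` is a hypothetical stand-in for any of the thesis's models, and the example data is a fabricated toy case for illustration only.

```python
from typing import Callable, List, Optional, Sequence, Tuple

Event = Tuple[str, str, Optional[str]]  # (subject, predicate, object)

def mcnc_accuracy(
    score: Callable[[Sequence[Event], Event], float],
    examples: List[Tuple[Sequence[Event], List[Event], int]],
) -> float:
    """MCNC: each example is (context event chain, 5 candidates, gold index).
    The prediction is the highest-scoring candidate."""
    correct = 0
    for context, candidates, gold in examples:
        pred = max(range(len(candidates)),
                   key=lambda i: score(context, candidates[i]))
        correct += int(pred == gold)
    return correct / len(examples)

def toy_score(context: Sequence[Event], candidate: Event) -> float:
    """Stand-in scorer: lexical overlap with the context chain."""
    context_words = {w for e in context for w in e if w}
    return float(sum(1 for w in candidate if w and w in context_words))

examples = [
    (
        [("jury", "hear", "case"), ("jury", "deliberate", None)],
        [
            ("jury", "return", "verdict"),  # gold continuation
            ("chef", "bake", "bread"),
            ("dog", "chase", "ball"),
            ("sun", "rise", None),
            ("team", "win", "game"),
        ],
        0,
    ),
]
print(mcnc_accuracy(toy_score, examples))  # 1.0 on this toy example

# The reported gain over EventComp is absolute: 63.59 - 53.86 = 9.73 points.
```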



<p>The main contributions of this thesis are as follows: (1) We identify the importance of entity-based features for representing commonsense knowledge with script learning. (2) We create one of the first, if not the first, script learning models that addresses the
multi-relational nature between events. (3) We publicly release contextualized event representations (models) trained on large-scale newswire data. (4) We develop a script learning model that combines entity-based features, typed event
relationships, and grounded context in one model, and show that it is a good fit for
modeling psychological common sense.</p><p>To conclude, this thesis presents an in-depth exploration of statistical script learning,
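As an illustration of what a combined representation might look like, the sketch below builds a toy narrative graph with typed edges. The relation labels and node attributes are assumptions for exposition, not the thesis's exact schema.

```python
import networkx as nx

# A toy narrative graph: nodes are events, edges carry typed relations.
G = nx.MultiDiGraph()
G.add_node("e1", event=("jury", "hear", "case"))
G.add_node("e2", event=("jury", "deliberate", None))
G.add_node("e3", event=("jury", "return", "verdict"))

# Typed event relationships, e.g. temporal order and shared participants.
G.add_edge("e1", "e2", rel="next")
G.add_edge("e2", "e3", rel="next")
G.add_edge("e1", "e3", rel="same_entity")  # both involve the same protagonist

# Grounded context for an event is its typed neighborhood in the graph.
for _, nbr, data in G.out_edges("e1", data=True):
    print("e1", data["rel"], "->", nbr, G.nodes[nbr]["event"])
```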
To conclude, this thesis presents an in-depth exploration of statistical script learning, enhancing existing models with new insights. Our experimental results show that models informed with the new knowledge aspects significantly outperform previous works in both intrinsic and extrinsic evaluations.

DOI: 10.25394/pgs.13339205.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/13339205
Date: 15 December 2020
Creators: I-Ta Lee (9736907)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/Commonsense_Knowledge_Representation_and_Reasoning_in_Statistical_Script_Learning/13339205
