Towards semantic language processing / Mot semantisk språkbearbetning. Jonsson, Anna. January 2018.
The overall goal of the field of natural language processing is to facilitate communication between humans and computers, and to help humans with natural language tasks such as translation. In this thesis, we focus on semantic language processing. Modelling semantics, the meaning of natural language, requires both a structure to hold the semantic information and a device that can enforce rules on that structure to ensure well-formed semantics, without being too computationally demanding. The devices used in natural language processing are preferably weighted, so that the alternative semantic interpretations output by a device can be compared.

The structure employed here is the abstract meaning representation (AMR). We show that AMRs representing well-formed semantics can be generated while ruling out AMRs that are not semantically well-formed. For this purpose, we use a type of graph grammar called contextual hyperedge replacement grammar (CHRG). Moreover, we argue that a better-known subclass of CHRG, the hyperedge replacement grammar (HRG), is not powerful enough for AMR generation. This is due to the limitation of HRG in handling co-references, which in turn stems from the fact that HRGs only generate graphs of bounded treewidth.

Furthermore, we address the N best problem: given a weighted device, return the N best (here: smallest-weighted or, more intuitively, least erroneous) structures. Our goal is to solve the N best problem for devices capable of expressing sophisticated forms of semantic representation, such as CHRGs. Here, however, we take a first step by developing methods for solving the N best problem for weighted tree automata and some types of weighted acyclic hypergraphs.
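To make the N best problem more concrete, the sketch below enumerates the N cheapest runs of a small weighted tree automaton over the tropical (min, +) semiring, using a best-first search guided by precomputed cheapest-completion weights. The toy automaton, its state names and all its weights are invented for illustration, the transition weights are assumed to be non-negative, and the code is not the method developed in the thesis: it ranks runs, whereas the thesis asks for the N best structures themselves, which is harder because several runs may describe the same structure.

```python
"""
A minimal, hypothetical sketch of N-best enumeration for a weighted tree
automaton (WTA) over the tropical (min, +) semiring.  It enumerates the
N cheapest *runs* by best-first search; it is not the thesis' algorithm.
"""
import heapq
from itertools import count

# A transition (state, symbol, child_states, weight) says: `symbol`,
# whose subtrees were accepted in `child_states`, is accepted in `state`
# at cost `weight`.  Nullary symbols have an empty child tuple.
TRANSITIONS = [
    ("NP", "boy",  (),           1.0),
    ("V",  "want", (),           1.0),
    ("V",  "go",   (),           2.0),
    ("VP", "vp",   ("V", "NP"),  0.5),
    ("VP", "vp",   ("V", "VP"),  1.0),
    ("S",  "s",    ("NP", "VP"), 0.0),
]
FINAL_STATES = {"S"}


def cheapest_completion(transitions):
    """Fixed-point computation of the cheapest run weight per state."""
    best = {}
    changed = True
    while changed:
        changed = False
        for state, _, children, w in transitions:
            if all(c in best for c in children):
                cand = w + sum(best[c] for c in children)
                if cand < best.get(state, float("inf")):
                    best[state] = cand
                    changed = True
    return best


def n_best_runs(transitions, final_states, n):
    """Yield up to n (weight, run) pairs in non-decreasing weight order.

    A run is returned as its preorder list of (symbol, rank) pairs,
    which uniquely encodes the accepted tree.
    """
    best = cheapest_completion(transitions)
    tick = count()  # FIFO tie-breaking so heap items never compare lists
    heap = []
    for q in final_states:
        if q in best:  # skip unreachable final states
            # (lower bound, tie-break, cost so far, run prefix, open states)
            heapq.heappush(heap, (best[q], next(tick), 0.0, [], [q]))
    found = 0
    while heap and found < n:
        _, _, cost, run, frontier = heapq.heappop(heap)
        if not frontier:           # no open states left: the run is complete
            found += 1
            yield cost, run
            continue
        q, rest = frontier[0], frontier[1:]
        for state, symbol, children, w in transitions:
            if state != q or any(c not in best for c in children):
                continue
            new_cost = cost + w
            new_run = run + [(symbol, len(children))]
            new_frontier = list(children) + rest
            # Optimistic bound: cost so far plus the cheapest possible
            # completion of every still-open state.
            bound = new_cost + sum(best[c] for c in new_frontier)
            heapq.heappush(
                heap, (bound, next(tick), new_cost, new_run, new_frontier))


if __name__ == "__main__":
    for weight, run in n_best_runs(TRANSITIONS, FINAL_STATES, 3):
        print(weight, run)
```

On this toy automaton, asking for the 3 best runs yields weights 3.5, 4.5 and 5.5, corresponding to increasingly expensive or more deeply nested verb phrases.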