July 19, 2018

In this thesis we present an investigation of multi-task and transfer learning using the recently introduced task of semantic tagging. First, we employ a number of natural language processing tasks as auxiliaries for semantic tagging. Secondly, going in the other direction, we employ semantic tagging as an auxiliary task for three different NLP tasks: Part-of-Speech Tagging, Universal Dependency parsing, and Natural Language Inference. We compare full neural network sharing, partial neural network sharing, and what we term the learning-what-to-share setting, where negative transfer between tasks is less likely. Finally, we investigate multi-lingual learning framed as a special case of multi-task learning. Our findings show considerable improvements for most experiments, demonstrating a variety of cases where multi-task and transfer learning methods are beneficial.
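The abstract does not specify the network architecture, so the following is only a minimal sketch of the "full neural network sharing" (hard parameter sharing) setup it mentions: a single shared encoder with one task-specific output head per task. The class name, label counts, and hyperparameters are hypothetical, and PyTorch is assumed purely for illustration.

```python
import torch
import torch.nn as nn

class SharedMultiTaskTagger(nn.Module):
    """Hard parameter sharing: one shared BiLSTM encoder, one classifier head per task."""
    def __init__(self, vocab_size, num_labels_per_task, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder parameters are shared by the main task and all auxiliary tasks
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # One task-specific output layer per task (e.g. semantic tags, POS tags)
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, n) for task, n in num_labels_per_task.items()
        })

    def forward(self, token_ids, task):
        states, _ = self.encoder(self.embed(token_ids))
        return self.heads[task](states)  # per-token logits for the requested task

# Hypothetical usage: alternate batches between the main and an auxiliary task
model = SharedMultiTaskTagger(vocab_size=10000,
                              num_labels_per_task={"semtag": 66, "pos": 17})
batch = torch.randint(0, 10000, (8, 20))   # 8 sentences of 20 token ids each
logits = model(batch, task="semtag")       # shape: (8, 20, 66)
```

Partial sharing and the learning-what-to-share setting would differ only in how much of the encoder is shared or in letting the model weight the shared representations per task; those variants are not shown here.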
Identifier | oai:union.ndltd.org:nusl.cz/oai:invenio.nusl.cz:387848 |
Date | January 2018 |
Creators | Abdou, Mostafa |
Contributors | Vidová Hladká, Barbora, Libovický, Jindřich |
Source Sets | Czech ETDs |
Language | English |
Detected Language | English |
Type | info:eu-repo/semantics/masterThesis |
Rights | info:eu-repo/semantics/restrictedAccess |