
Few-shot Question Generation with Prompt-based Learning

Question generation (QG), which automatically generates good-quality questions from a piece of text, can lower the cost of composing questions manually. Question generation has recently attracted increasing interest for its ability to supply large numbers of questions for developing conversation systems and educational applications, as well as for corpus development for natural language processing (NLP) research tasks such as question answering and reading comprehension. Previous neural QG approaches have achieved remarkable performance. However, these approaches require a large amount of data to train neural models properly, limiting the application of question generation in low-resource scenarios, e.g. with only a few hundred training examples. This thesis aims to address the low-resource scenario by investigating a recently emerged paradigm of NLP modelling, prompt-based learning. Prompt-based learning, which makes predictions based on the knowledge of a pre-trained language model and simple textual task descriptions, has shown great effectiveness in various NLP tasks in few-shot and zero-shot settings, where few or no examples are needed to train a model. In this project, we introduce a prompt-based question generation approach that constructs question generation task instructions understandable by a pre-trained sequence-to-sequence language model. Our experimental results show that our approach outperforms previous state-of-the-art question generation models by a large margin of 36.8%, 204.8%, 455.9%, 1083.3%, and 57.9% on the BLEU-1, BLEU-2, BLEU-3, BLEU-4, and ROUGE-L metrics respectively in the few-shot learning setting. We also conducted a quality analysis of the generated questions and found that our approach can generate questions with correct grammar and relevant topical information when trained on as few as 1,000 training examples.
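The abstract reports results on the BLEU-1 through BLEU-4 and ROUGE-L metrics. As a minimal sketch of what two of these metrics measure, the following implements the standard definitions of modified BLEU-1 precision (unigram precision with clipped counts) and the LCS-based ROUGE-L F-score; this is illustrative only, not the thesis's actual evaluation code, and the tokenisation (whitespace split) and `beta` weighting are assumptions.

```python
from collections import Counter

def bleu1_precision(candidate: str, reference: str) -> float:
    # Modified unigram precision: each candidate token's count is
    # clipped by its count in the reference before averaging.
    cand_tokens = candidate.split()
    if not cand_tokens:
        return 0.0
    ref_counts = Counter(reference.split())
    cand_counts = Counter(cand_tokens)
    clipped = sum(min(c, ref_counts[w]) for w, c in cand_counts.items())
    return clipped / len(cand_tokens)

def rouge_l_f(candidate: str, reference: str, beta: float = 1.2) -> float:
    # ROUGE-L: F-score based on the longest common subsequence (LCS)
    # of candidate and reference tokens.
    c, r = candidate.split(), reference.split()
    if not c or not r:
        return 0.0
    # Dynamic-programming table for LCS length.
    dp = [[0] * (len(r) + 1) for _ in range(len(c) + 1)]
    for i in range(1, len(c) + 1):
        for j in range(1, len(r) + 1):
            if c[i - 1] == r[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    lcs = dp[len(c)][len(r)]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(c), lcs / len(r)
    return (1 + beta ** 2) * precision * recall / (recall + beta ** 2 * precision)
```

For example, a generated question identical to the reference scores 1.0 on both metrics, while a candidate that merely repeats a single reference word is penalised by the clipped counts.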

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:uu-484923
Date: January 2022
Creators: Wu, Yongchao
Publisher: Uppsala universitet, Institutionen för lingvistik och filologi
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess