A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies.

Ho-Lam Chung, Ying-Hong Chan, Yao-Chung Fan


Abstract
In this paper, we investigate two limitations of existing distractor generation (DG) methods. First, the quality of existing DG methods is still far from practical use; there remains considerable room for improvement. Second, existing DG designs mainly target single-distractor generation, whereas practical MCQ preparation requires multiple distractors. Aiming at these goals, we present a new distractor generation scheme with multi-tasking and negative answer training strategies for effectively generating multiple distractors. The experimental results show that (1) our model advances the state-of-the-art result from 28.65 to 39.81 (BLEU-1 score) and (2) the generated distractors are diverse and show strong distracting power for multiple-choice questions.
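The "negative answer training" idea in the abstract — discouraging the generator from producing the correct answer as a distractor — can be sketched as an extra loss term. The sketch below is illustrative only, assuming a PyTorch generator whose per-step logits are available; the function name, the unlikelihood-style penalty, and the alpha weight are assumptions, not the paper's exact objective.

import torch
import torch.nn.functional as F

def negative_answer_loss(logits, distractor_ids, answer_ids, alpha=0.5):
    """Standard generation loss on the gold distractor plus an
    unlikelihood-style penalty on the correct-answer tokens.

    logits: (seq_len, vocab) generator scores at the target positions.
    distractor_ids: (seq_len,) gold distractor token ids.
    answer_ids: (ans_len,) token ids of the correct answer (negative example).
    alpha: weight of the negative term (illustrative value).
    """
    # Positive term: maximize likelihood of the gold distractor tokens.
    pos = F.cross_entropy(logits, distractor_ids)

    # Negative term: push down the probability mass assigned to
    # answer tokens at each generation step.
    probs = logits.softmax(dim=-1)               # (seq_len, vocab)
    ans_mass = probs[:, answer_ids].sum(dim=-1)  # mass on any answer token
    neg = -torch.log1p(-ans_mass.clamp(max=1 - 1e-6)).mean()

    return pos + alpha * neg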
Anthology ID:
2020.findings-emnlp.393
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4390–4400
URL:
https://aclanthology.org/2020.findings-emnlp.393
DOI:
10.18653/v1/2020.findings-emnlp.393
Cite (ACL):
Ho-Lam Chung, Ying-Hong Chan, and Yao-Chung Fan. 2020. A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies.. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4390–4400, Online. Association for Computational Linguistics.
Cite (Informal):
A BERT-based Distractor Generation Scheme with Multi-tasking and Negative Answer Training Strategies. (Chung et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.393.pdf
Code:
voidful/BDG
Data:
RACE
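The Data entry points to the RACE reading-comprehension dataset. A minimal sketch of loading it, assuming the copy hosted on the HuggingFace datasets hub:

from datasets import load_dataset

# Configs: "middle", "high", or "all".
race = load_dataset("race", "all")
ex = race["train"][0]
# Each example carries the fields needed for distractor generation:
# a passage, a question, candidate options, and the correct answer key.
print(ex["article"][:200], ex["question"], ex["options"], ex["answer"])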