Neural Generation of Diverse Questions using Answer Focus, Contextual and Linguistic Features

Vrindavan Harrison, Marilyn Walker


Abstract
Question Generation is the task of automatically creating questions from textual input. In this work we present a new attentional encoder-decoder recurrent neural network model for automatic question generation. Our model incorporates linguistic features and an additional sentence embedding to capture meaning at both the sentence and word levels. The linguistic features are designed to capture information related to named entity recognition, word case, and entity coreference resolution. In addition, our model uses a copying mechanism and a special answer signal that enables the generation of numerous diverse questions for a given sentence. Our model achieves a state-of-the-art result of 19.98 BLEU-4 on a benchmark question generation dataset, outperforming all previously published results by a significant margin. A human evaluation also shows that the added features improve the quality of the generated questions.
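
As a rough illustration of the feature-augmentation idea described in the abstract, the sketch below shows one way discrete linguistic features (named-entity tag, word case, coreference flag) and a binary answer-focus signal could be embedded and concatenated with word embeddings before a bidirectional encoder, together with a simple dot-product attention over the encoder states. This is not the authors' implementation: all layer sizes, the feature inventory, and the attention variant (Luong-style dot-product) are assumptions made for illustration, and the copy mechanism and decoder are omitted.

# Illustrative sketch only (not the paper's code): feature-augmented encoder
# with attention, in PyTorch. Dimensions and feature vocabularies are assumed.
import torch
import torch.nn as nn


class FeatureAugmentedEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, feat_dim=16, hidden_dim=256,
                 n_ner=10, n_case=3, n_coref=2, n_answer=2):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # Small separate embeddings for each discrete linguistic feature.
        self.ner_emb = nn.Embedding(n_ner, feat_dim)      # named-entity tag
        self.case_emb = nn.Embedding(n_case, feat_dim)    # word case
        self.coref_emb = nn.Embedding(n_coref, feat_dim)  # coreference flag
        self.ans_emb = nn.Embedding(n_answer, feat_dim)   # answer-focus signal
        self.rnn = nn.LSTM(emb_dim + 4 * feat_dim, hidden_dim,
                           batch_first=True, bidirectional=True)

    def forward(self, words, ner, case, coref, answer):
        # Each input is a LongTensor of shape (batch, seq_len).
        x = torch.cat([self.word_emb(words), self.ner_emb(ner),
                       self.case_emb(case), self.coref_emb(coref),
                       self.ans_emb(answer)], dim=-1)
        outputs, state = self.rnn(x)  # outputs: (batch, seq_len, 2*hidden_dim)
        return outputs, state


class DotProductAttention(nn.Module):
    """One common attention choice: project the decoder state, score encoder states."""
    def __init__(self, dec_dim, enc_dim):
        super().__init__()
        self.proj = nn.Linear(dec_dim, enc_dim, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, dec_dim); enc_outputs: (batch, seq_len, enc_dim)
        scores = torch.bmm(enc_outputs, self.proj(dec_state).unsqueeze(-1))
        weights = torch.softmax(scores.squeeze(-1), dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights


if __name__ == "__main__":
    enc = FeatureAugmentedEncoder(vocab_size=5000)
    attn = DotProductAttention(dec_dim=256, enc_dim=512)
    words = torch.randint(0, 5000, (2, 12))
    feats = [torch.randint(0, n, (2, 12)) for n in (10, 3, 2, 2)]
    enc_out, _ = enc(words, *feats)
    context, weights = attn(torch.zeros(2, 256), enc_out)
    print(context.shape, weights.shape)  # (2, 512) and (2, 12)

A decoder would consume the attention context at each step and, as the abstract notes, combine generation with a copying mechanism over the source words; that part is left out of the sketch above.
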
Anthology ID:
W18-6536
Volume:
Proceedings of the 11th International Conference on Natural Language Generation
Month:
November
Year:
2018
Address:
Tilburg University, The Netherlands
Editors:
Emiel Krahmer, Albert Gatt, Martijn Goudbeek
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
296–306
URL:
https://aclanthology.org/W18-6536
DOI:
10.18653/v1/W18-6536
Cite (ACL):
Vrindavan Harrison and Marilyn Walker. 2018. Neural Generation of Diverse Questions using Answer Focus, Contextual and Linguistic Features. In Proceedings of the 11th International Conference on Natural Language Generation, pages 296–306, Tilburg University, The Netherlands. Association for Computational Linguistics.
Cite (Informal):
Neural Generation of Diverse Questions using Answer Focus, Contextual and Linguistic Features (Harrison & Walker, INLG 2018)
PDF:
https://aclanthology.org/W18-6536.pdf
Data
SQuAD