Attention-Guided Answer Distillation for Machine Reading Comprehension

Minghao Hu, Yuxing Peng, Furu Wei, Zhen Huang, Dongsheng Li, Nan Yang, Ming Zhou


Abstract
Although current reading comprehension systems have achieved significant advancements, their promising performance is often obtained at the cost of ensembling numerous models. Moreover, existing approaches are vulnerable to adversarial attacks. This paper tackles these problems by leveraging knowledge distillation, which aims to transfer knowledge from an ensemble model to a single model. We first demonstrate that vanilla knowledge distillation applied to answer span prediction is effective for reading comprehension systems. We then propose two novel approaches that not only penalize the prediction on confusing answers but also guide the training with alignment information distilled from the ensemble. Experiments show that our best student model has only a slight drop of 0.4% F1 on the SQuAD test set compared to the ensemble teacher, while running 12x faster during inference. It even outperforms the teacher on adversarial SQuAD datasets and the NarrativeQA benchmark.
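The vanilla answer distillation mentioned in the abstract corresponds to a standard soft-target objective over the answer-span start and end distributions. Below is a minimal sketch of such a loss, assuming a PyTorch setup; the function name, the temperature, and the mixing weight alpha are illustrative assumptions, not details or values reported in the paper.

```python
# A minimal sketch (not the paper's exact implementation) of vanilla answer-span
# distillation: the student matches the ensemble teacher's softened start/end
# distributions via KL divergence, mixed with the usual hard-label loss.
import torch
import torch.nn.functional as F

def span_distillation_loss(student_start_logits, student_end_logits,
                           teacher_start_logits, teacher_end_logits,
                           gold_start, gold_end, temperature=2.0, alpha=0.5):
    """Combine soft-target KL loss (teacher) and hard-target NLL loss (gold spans).

    All *_logits tensors have shape (batch, passage_len); gold_start / gold_end
    are (batch,) index tensors. `temperature` and `alpha` are assumed
    hyperparameters, not values from the paper.
    """
    def kl(student_logits, teacher_logits):
        # Soften both distributions; scale by T^2 as in Hinton et al. (2015).
        s = F.log_softmax(student_logits / temperature, dim=-1)
        t = F.softmax(teacher_logits / temperature, dim=-1)
        return F.kl_div(s, t, reduction="batchmean") * temperature ** 2

    soft = kl(student_start_logits, teacher_start_logits) + \
           kl(student_end_logits, teacher_end_logits)
    hard = F.cross_entropy(student_start_logits, gold_start) + \
           F.cross_entropy(student_end_logits, gold_end)
    return alpha * soft + (1.0 - alpha) * hard
```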
Anthology ID:
D18-1232
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2077–2086
URL:
https://aclanthology.org/D18-1232
DOI:
10.18653/v1/D18-1232
Cite (ACL):
Minghao Hu, Yuxing Peng, Furu Wei, Zhen Huang, Dongsheng Li, Nan Yang, and Ming Zhou. 2018. Attention-Guided Answer Distillation for Machine Reading Comprehension. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2077–2086, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Attention-Guided Answer Distillation for Machine Reading Comprehension (Hu et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1232.pdf
Data
NarrativeQA, SQuAD