Efficient and Robust Question Answering from Minimal Context over Documents

Sewon Min, Victor Zhong, Richard Socher, Caiming Xiong


Abstract
Neural models for question answering (QA) over documents have achieved significant performance improvements. Although effective, these models do not scale to large corpora due to their complex modeling of interactions between the document and the question. Moreover, recent work has shown that such models are sensitive to adversarial inputs. In this paper, we study the minimal context required to answer the question, and find that most questions in existing datasets can be answered with a small set of sentences. Inspired by this observation, we propose a simple sentence selector to select the minimal set of sentences to feed into the QA model. Our overall system achieves significant reductions in training time (up to 15 times) and inference time (up to 13 times), with accuracy comparable to or better than the state of the art on SQuAD, NewsQA, TriviaQA and SQuAD-Open. Furthermore, our experimental results and analyses show that our approach is more robust to adversarial inputs.
Anthology ID:
P18-1160
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1725–1735
URL:
https://aclanthology.org/P18-1160
DOI:
10.18653/v1/P18-1160
Cite (ACL):
Sewon Min, Victor Zhong, Richard Socher, and Caiming Xiong. 2018. Efficient and Robust Question Answering from Minimal Context over Documents. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1725–1735, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Efficient and Robust Question Answering from Minimal Context over Documents (Min et al., ACL 2018)
PDF:
https://aclanthology.org/P18-1160.pdf
Note:
P18-1160.Notes.pdf
Poster:
P18-1160.Poster.pdf
Data
NewsQA, SQuAD, TriviaQA