Ruminating Reader: Reasoning with Gated Multi-hop Attention

Yichen Gong, Samuel Bowman


Abstract
To answer a question in the machine comprehension (MC) task, a model needs to establish interactions between the question and the context. To address the problem that a single-pass model cannot reflect on and correct its answer, we present the Ruminating Reader. The Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with the intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that the Ruminating Reader outperforms the BiDAF baseline by 2.1 F1 points and 2.7 EM points. Our analysis shows that the different attention hops play different roles in selecting answers.
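To make the "information fusion" idea concrete, below is a minimal sketch of a highway-style gated fusion layer that combines a first-pass context encoding with a second-pass, query-aware summary, written in PyTorch. The module name `GatedFusion`, the exact sigmoid-gate formulation, and the tensor shapes are illustrative assumptions and not the paper's precise equations or implementation.

```python
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Illustrative highway-style gate: fuses a first-pass encoding with a
    second-pass, query-aware summary of the context (assumed formulation)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Both projections read the concatenation [encoding; summary].
        self.transform = nn.Linear(2 * hidden_size, hidden_size)
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, encoding: torch.Tensor, summary: torch.Tensor) -> torch.Tensor:
        # encoding, summary: (batch, seq_len, hidden_size)
        pair = torch.cat([encoding, summary], dim=-1)
        candidate = torch.tanh(self.transform(pair))   # proposed new representation
        gate = torch.sigmoid(self.gate(pair))          # how much to overwrite
        # Gated interpolation between the new candidate and the original encoding.
        return gate * candidate + (1.0 - gate) * encoding


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 5, 8
    fusion = GatedFusion(hidden)
    enc = torch.randn(batch, seq_len, hidden)     # first-pass context encoding
    summ = torch.randn(batch, seq_len, hidden)    # second-pass query-aware summary
    print(fusion(enc, summ).shape)                # torch.Size([2, 5, 8])
```

The gate lets the model decide, per position and dimension, how much of the second-pass (query-aware) information should replace the original encoding, which is the intuition behind letting the reader "reflect on and correct" its first pass.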
Anthology ID:
W18-2601
Volume:
Proceedings of the Workshop on Machine Reading for Question Answering
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Eunsol Choi, Minjoon Seo, Danqi Chen, Robin Jia, Jonathan Berant
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1–11
URL:
https://aclanthology.org/W18-2601
DOI:
10.18653/v1/W18-2601
Cite (ACL):
Yichen Gong and Samuel Bowman. 2018. Ruminating Reader: Reasoning with Gated Multi-hop Attention. In Proceedings of the Workshop on Machine Reading for Question Answering, pages 1–11, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Ruminating Reader: Reasoning with Gated Multi-hop Attention (Gong & Bowman, ACL 2018)
PDF:
https://aclanthology.org/W18-2601.pdf
Data
SQuAD