Multi-glance Reading Model for Text Understanding

Pengcheng Zhu, Yujiu Yang, Wenqiang Gao, Yi Liu


Abstract
In recent years, a variety of recurrent neural networks have been proposed, e.g., the LSTM. However, existing models read the text only once and therefore cannot describe the repeated reading that occurs in reading comprehension. In fact, when reading or analyzing a text, we may read it several times rather than once if we cannot understand it well at first. How, then, can this kind of reading behavior be modeled? To address this issue, we propose a multi-glance mechanism (MGM) that models this reading habit. In the proposed framework, the actual reading process can be fully simulated, so that the information obtained is consistent with the task. Based on the multi-glance mechanism, we design two types of recurrent neural network models for repeated reading: the Glance Cell Model (GCM) and the Glance Gate Model (GGM). Visualization analysis of the GCM and the GGM demonstrates the effectiveness of the multi-glance mechanism. Experimental results on large-scale datasets show that the proposed methods achieve better performance.
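To make the multi-glance idea concrete, below is a minimal sketch of a multi-pass reader in PyTorch. It is not the paper's GCM or GGM: it simply re-reads the token sequence with a plain LSTM several times, conditioning each pass on a summary vector from the previous pass. All names here (MultiGlanceReader, num_glances, etc.) are illustrative assumptions, not the authors' API.

```python
import torch
import torch.nn as nn

class MultiGlanceReader(nn.Module):
    """Illustrative multi-pass ("multi-glance") reader.

    The same LSTM re-reads the token sequence num_glances times; each pass
    sees, at every time step, a summary vector produced by the previous
    pass. This is a generic sketch of the multi-glance mechanism, not the
    paper's Glance Cell (GCM) or Glance Gate (GGM) architecture.
    """

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, num_glances=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Input at each step: word embedding concatenated with the summary
        # of the previous glance.
        self.lstm = nn.LSTM(embed_dim + hidden_dim, hidden_dim, batch_first=True)
        self.num_glances = num_glances
        self.hidden_dim = hidden_dim

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq_len, embed_dim)
        batch, seq_len, _ = x.shape
        summary = x.new_zeros(batch, self.hidden_dim)  # empty summary before pass 1
        for _ in range(self.num_glances):
            # Broadcast the current summary to every time step and re-read.
            ctx = summary.unsqueeze(1).expand(batch, seq_len, self.hidden_dim)
            out, (h_n, _) = self.lstm(torch.cat([x, ctx], dim=-1))
            summary = h_n[-1]                          # summary from this glance
        return summary                                 # final text representation

# Usage: encode a toy batch of 4 texts, reading each one 3 times.
model = MultiGlanceReader(vocab_size=10_000, num_glances=3)
tokens = torch.randint(0, 10_000, (4, 20))
representation = model(tokens)                         # shape: (4, 256)
print(representation.shape)
```

The design choice worth noting is that later passes receive a whole-text summary before processing any token, so they can, in principle, reweight words whose importance only becomes clear after a first reading, which is the intuition behind repeated reading in the paper.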
Anthology ID:
W18-2804
Volume:
Proceedings of the Eighth Workshop on Cognitive Aspects of Computational Language Learning and Processing
Month:
July
Year:
2018
Address:
Melbourne
Editors:
Marco Idiart, Alessandro Lenci, Thierry Poibeau, Aline Villavicencio
Venue:
CogACLL
Publisher:
Association for Computational Linguistics
Pages:
27–35
URL:
https://aclanthology.org/W18-2804
DOI:
10.18653/v1/W18-2804
Cite (ACL):
Pengcheng Zhu, Yujiu Yang, Wenqiang Gao, and Yi Liu. 2018. Multi-glance Reading Model for Text Understanding. In Proceedings of the Eighth Workshop on Cognitive Aspects of Computational Language Learning and Processing, pages 27–35, Melbourne. Association for Computational Linguistics.
Cite (Informal):
Multi-glance Reading Model for Text Understanding (Zhu et al., CogACLL 2018)
PDF:
https://aclanthology.org/W18-2804.pdf