Integrating Order Information and Event Relation for Script Event Prediction

Zhongqing Wang, Yue Zhang, Ching-Yun Chang


Abstract
There has been a recent line of work automatically learning scripts from unstructured texts by modeling narrative event chains. While the dominant approach groups events using event pair relations, LSTMs have been used to encode full chains of narrative events. The latter has the advantage of learning long-range temporal orders, yet the former is more adaptive to partial orders. We propose a neural model that leverages the advantages of both methods, by using LSTM hidden states as features for event pair modelling. A dynamic memory network is utilized to automatically induce weights on existing events for inferring a subsequent event. Standard evaluation shows that our method significantly outperforms both methods above, giving the best results reported so far.
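The core idea of the abstract can be illustrated with a minimal sketch: context events are encoded into hidden states (standing in for the LSTM outputs), each is paired with a candidate event to produce a relation score, and a softmax over those scores plays the role of the memory network's induced weights. All function names, dimensions, and the dot-product scorer here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def score_candidate(context_hidden, candidate_vec):
    """Score a candidate next event against hidden states of context events.

    context_hidden: (n, d) array, one row per context event (stand-in for
        LSTM hidden states over the event chain).
    candidate_vec: (d,) embedding of the candidate event.

    The dot product is a placeholder for the paper's learned pair-relation
    function; the softmax mimics memory-network attention over events.
    """
    pair_scores = context_hidden @ candidate_vec   # one score per event pair
    weights = softmax(pair_scores)                 # induced weights on events
    return float(weights @ pair_scores)            # weighted relation score

# Toy usage: pick the highest-scoring candidate as the predicted next event.
rng = np.random.default_rng(0)
context = rng.normal(size=(8, 16))      # 8 context events, 16-dim states
candidates = rng.normal(size=(5, 16))   # 5 candidate next events
scores = [score_candidate(context, c) for c in candidates]
predicted = int(np.argmax(scores))
```

In the actual model, the pair scorer and the attention mechanism are trained jointly, and multiple memory-network hops may refine the weights; this sketch only shows how order-aware chain encodings and pairwise relation scores can be combined in a single scoring function.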
Anthology ID:
D17-1006
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
57–67
URL:
https://aclanthology.org/D17-1006
DOI:
10.18653/v1/D17-1006
Cite (ACL):
Zhongqing Wang, Yue Zhang, and Ching-Yun Chang. 2017. Integrating Order Information and Event Relation for Script Event Prediction. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 57–67, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Integrating Order Information and Event Relation for Script Event Prediction (Wang et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1006.pdf
Video:
https://aclanthology.org/D17-1006.mp4