Sequential Cross-Document Coreference Resolution

Emily Allaway, Shuai Wang, Miguel Ballesteros


Abstract
Relating entities and events in text is a key component of natural language understanding. Cross-document coreference resolution, in particular, is important for the growing interest in multi-document analysis tasks. In this work we propose a new model that extends the efficient sequential prediction paradigm for coreference resolution to cross-document settings. The model achieves competitive results for both entity and event coreference and provides strong evidence that sequential models and higher-order inference are effective in cross-document settings. Our model incrementally composes mentions into cluster representations and predicts links between a mention and the already constructed clusters, approximating a higher-order model. In addition, we conduct extensive ablation studies that provide new insights into the importance of various inputs and representation types in coreference.
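As a rough illustration of the sequential mention-to-cluster linking described in the abstract, the minimal Python sketch below processes mentions in order, scores each one against the clusters built so far, and either links it to the best-scoring cluster or starts a new one, updating that cluster's representation incrementally. The scoring function (cosine similarity), the composition function (a running mean over member-mention embeddings), and the threshold are illustrative assumptions, not the paper's learned components.

# Minimal sketch of sequential mention-to-cluster coreference (not the
# authors' implementation): greedy linking of each mention to the best
# already-constructed cluster, with an incremental cluster representation.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    # Placeholder similarity; the paper uses a learned mention-cluster scorer.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def sequential_cluster(mention_embeddings, threshold=0.7):
    """Return a cluster id for each mention, processed sequentially."""
    cluster_reprs = []   # one representation vector per cluster
    cluster_sizes = []   # number of mentions composed into each cluster
    assignments = []

    for m in mention_embeddings:
        # Score the mention against every cluster constructed so far.
        scores = [cosine(m, c) for c in cluster_reprs]
        best = int(np.argmax(scores)) if scores else -1

        if best >= 0 and scores[best] >= threshold:
            # Link to the best cluster and update its representation
            # (incremental mean over member mentions; illustrative only).
            n = cluster_sizes[best]
            cluster_reprs[best] = (cluster_reprs[best] * n + m) / (n + 1)
            cluster_sizes[best] = n + 1
            assignments.append(best)
        else:
            # No sufficiently similar cluster: start a new one.
            cluster_reprs.append(m.copy())
            cluster_sizes.append(1)
            assignments.append(len(cluster_reprs) - 1)

    return assignments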
Anthology ID:
2021.emnlp-main.382
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4659–4671
URL:
https://aclanthology.org/2021.emnlp-main.382
DOI:
10.18653/v1/2021.emnlp-main.382
Cite (ACL):
Emily Allaway, Shuai Wang, and Miguel Ballesteros. 2021. Sequential Cross-Document Coreference Resolution. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 4659–4671, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Sequential Cross-Document Coreference Resolution (Allaway et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.382.pdf
Video:
https://aclanthology.org/2021.emnlp-main.382.mp4
Data
ECB+