Incremental Neural Coreference Resolution in Constant Memory

Patrick Xia, João Sedoc, Benjamin Van Durme

Abstract
We investigate modeling coreference resolution under a fixed memory constraint by extending an incremental clustering algorithm to utilize contextualized encoders and neural components. Given a new sentence, our end-to-end algorithm proposes and scores each mention span against explicit entity representations created from the earlier document context (if any). These spans are then used to update the entity’s representations before being forgotten; we only retain a fixed set of salient entities throughout the document. In this work, we successfully convert a high-performing model (Joshi et al., 2020), asymptotically reducing its memory usage to constant space with only a 0.3% relative loss in F1 on OntoNotes 5.0.
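The loop the abstract describes can be summarized as: propose spans in the new sentence, score each against the retained entity representations, link or create an entity, fold the span into that entity, discard the span vector, and evict entities beyond a fixed budget. Below is a minimal sketch of that loop under stated assumptions; it is not the paper's implementation. Every name and value here (Entity, score(), process_sentence(), MAX_ENTITIES, NEW_ENTITY_THRESHOLD, the update rate alpha, the dot-product scorer) is an illustrative stand-in; the paper uses learned neural components over contextualized encoder representations for span proposal, scoring, and entity updates.

```python
# Minimal sketch (assumed, not the paper's code) of incremental
# coreference clustering in constant memory.
from dataclasses import dataclass, field

import torch


@dataclass
class Entity:
    rep: torch.Tensor                           # running entity representation
    salience: float = 0.0                       # used to pick entities to keep
    spans: list = field(default_factory=list)   # token offsets only, no vectors


MAX_ENTITIES = 20            # fixed memory budget (assumed value)
NEW_ENTITY_THRESHOLD = 0.0   # below this score, a span starts a new entity


def score(span_rep, entity_rep):
    # Placeholder pairwise scorer; the paper trains a neural scorer instead.
    return torch.dot(span_rep, entity_rep)


def process_sentence(sent_idx, span_reps, entities, alpha=0.5):
    """Link each proposed span to its best-scoring entity (or start a new
    one), fold the span into that entity's representation, then forget the
    span vector itself. Only a bounded set of salient entities is kept."""
    for start, end, rep in span_reps:
        if entities:
            scores = torch.stack([score(rep, e.rep) for e in entities])
            best = int(scores.argmax())
            best_score = float(scores[best])
        else:
            best, best_score = -1, float("-inf")

        if best_score > NEW_ENTITY_THRESHOLD:
            ent = entities[best]
            # Gated-average update (assumed form): the span representation
            # is absorbed into the entity and then discarded.
            ent.rep = alpha * ent.rep + (1 - alpha) * rep
            ent.salience = best_score
        else:
            ent = Entity(rep=rep)
            entities.append(ent)
        ent.spans.append((sent_idx, start, end))

    # Evict the least salient entities to stay within the fixed budget,
    # so model memory does not grow with document length.
    if len(entities) > MAX_ENTITIES:
        entities.sort(key=lambda e: e.salience, reverse=True)
        del entities[MAX_ENTITIES:]
```

The eviction step is what bounds memory: across sentences only MAX_ENTITIES entity vectors survive, plus integer span offsets, which are the coreference output itself rather than model state. This is how memory stays constant in document length instead of growing with the number of candidate mention pairs.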
Anthology ID: 2020.emnlp-main.695
Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month: November
Year: 2020
Address: Online
Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8617–8624
URL: https://aclanthology.org/2020.emnlp-main.695
DOI: 10.18653/v1/2020.emnlp-main.695
Cite (ACL): Patrick Xia, João Sedoc, and Benjamin Van Durme. 2020. Incremental Neural Coreference Resolution in Constant Memory. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8617–8624, Online. Association for Computational Linguistics.
Cite (Informal): Incremental Neural Coreference Resolution in Constant Memory (Xia et al., EMNLP 2020)
PDF: https://aclanthology.org/2020.emnlp-main.695.pdf
Video: https://slideslive.com/38939421
Data: OntoNotes 5.0