Attending to Long-Distance Document Context for Sequence Labeling

Matthew Jörke, Jon Gillick, Matthew Sims, David Bamman


Abstract
We present a method for incorporating global context from long documents when making local decisions in sequence labeling problems like NER. Inspired by work on featurized log-linear models (Chieu and Ng, 2002; Sutton and McCallum, 2004), our model learns to attend to multiple mentions of the same word type when generating a representation for each token in context, extending that work to learn representations that can be incorporated into modern neural models. Attending to broader context at test time provides complementary information to pretraining (Gururangan et al., 2020), yields strong gains over equivalently parameterized models lacking such context, and performs best at recognizing entities with high TF-IDF scores (i.e., those that are important within a document).
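To make the mechanism in the abstract concrete, the PyTorch sketch below shows one plausible way to let each token attend over the contextual vectors of other mentions of the same word type in the document, then combine that summary with the token's local representation. This is a minimal illustration under assumed names (SameTypeAttention, token_vecs, word_ids, the combine projection), not the authors' released implementation; see the mjoerke/doc-arc repository for that.

    # Hypothetical sketch of same-word-type attention for document-level context.
    # Not the paper's released code; layer and argument names are illustrative.
    import torch
    import torch.nn as nn

    class SameTypeAttention(nn.Module):
        """Augment each token's contextual vector with an attention-weighted
        summary of the vectors of other occurrences of the same word type
        elsewhere in the document."""
        def __init__(self, d_model: int):
            super().__init__()
            self.query = nn.Linear(d_model, d_model)
            self.key = nn.Linear(d_model, d_model)
            self.combine = nn.Linear(2 * d_model, d_model)

        def forward(self, token_vecs: torch.Tensor, word_ids: torch.Tensor) -> torch.Tensor:
            # token_vecs: (n_tokens, d_model) contextual vectors for one document
            # word_ids:   (n_tokens,) integer word-type id for each token
            n, d = token_vecs.shape
            q = self.query(token_vecs)                     # (n, d)
            k = self.key(token_vecs)                       # (n, d)
            scores = q @ k.t() / d ** 0.5                  # (n, n) pairwise scores

            # Token i may only attend to *other* tokens j of the same word type.
            same_type = word_ids.unsqueeze(0) == word_ids.unsqueeze(1)   # (n, n)
            not_self = ~torch.eye(n, dtype=torch.bool)
            mask = same_type & not_self
            scores = scores.masked_fill(~mask, float("-inf"))

            attn = torch.softmax(scores, dim=-1)
            # Tokens whose type occurs only once yield NaN rows; zero them out.
            attn = torch.nan_to_num(attn, nan=0.0)
            context = attn @ token_vecs                    # (n, d) long-distance summary

            # Fuse local and document-level context before the tagging layer.
            return self.combine(torch.cat([token_vecs, context], dim=-1))

In a setup like this, the fused vectors would then feed a standard sequence-labeling head (e.g., a linear classifier or CRF), so the long-distance signal changes only the token representations, not the decoder.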
Anthology ID:
2020.findings-emnlp.330
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3692–3704
URL:
https://aclanthology.org/2020.findings-emnlp.330
DOI:
10.18653/v1/2020.findings-emnlp.330
Cite (ACL):
Matthew Jörke, Jon Gillick, Matthew Sims, and David Bamman. 2020. Attending to Long-Distance Document Context for Sequence Labeling. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3692–3704, Online. Association for Computational Linguistics.
Cite (Informal):
Attending to Long-Distance Document Context for Sequence Labeling (Jörke et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.330.pdf
Code
 mjoerke/doc-arc
Data
GENIA, LitBank, OntoNotes 5.0