Morphosyntactic Tagging with a Meta-BiLSTM Model over Context Sensitive Token Encodings

Bernd Bohnet, Ryan McDonald, Gonçalo Simões, Daniel Andor, Emily Pitler, Joshua Maynez


Abstract
The rise of neural networks, and particularly recurrent neural networks, has produced significant advances in part-of-speech tagging accuracy. One characteristic common among these models is the presence of rich initial word encodings. These encodings are typically composed of a recurrent character-based representation combined with dynamically trained and pre-trained word embeddings. However, these encodings do not consider a context wider than a single word, and it is only through subsequent recurrent layers that word or sub-word information interacts. In this paper, we investigate models that use recurrent neural networks with sentence-level context for initial character- and word-based representations. In particular, we show that optimal results are obtained by integrating these context-sensitive representations through synchronized training with a meta-model that learns to combine their states.
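To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch of the meta-BiLSTM idea: two base BiLSTM taggers (a character view and a word view), each with its own classifier and loss, plus a meta-BiLSTM that runs over their concatenated states. This is not the authors' implementation; the module names, dimensions, toy data, and the exact gradient flow (detaching base states so the meta loss does not update the base encoders) are assumptions made for illustration, and the character view is simplified to per-token ids rather than the paper's token-boundary readout from a sentence-level character BiLSTM.

```python
# Illustrative sketch only; see the hedging note above.
import torch
import torch.nn as nn

class BaseTagger(nn.Module):
    """A BiLSTM encoder plus its own tag classifier (one per view)."""
    def __init__(self, vocab, dim, n_tags):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.lstm = nn.LSTM(dim, dim, bidirectional=True, batch_first=True)
        self.clf = nn.Linear(2 * dim, n_tags)

    def forward(self, ids):
        states, _ = self.lstm(self.emb(ids))   # sentence-level context
        return states, self.clf(states)        # states for the meta-model, logits for own loss

class MetaTagger(nn.Module):
    """BiLSTM over the concatenated states of the two base views."""
    def __init__(self, in_dim, dim, n_tags):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, dim, bidirectional=True, batch_first=True)
        self.clf = nn.Linear(2 * dim, n_tags)

    def forward(self, char_states, word_states):
        joint = torch.cat([char_states, word_states], dim=-1)
        states, _ = self.lstm(joint)
        return self.clf(states)

# Toy setup: one 5-token sentence, 10 tags (all sizes are arbitrary).
n_tags, dim = 10, 32
char_view = BaseTagger(vocab=100, dim=dim, n_tags=n_tags)
word_view = BaseTagger(vocab=500, dim=dim, n_tags=n_tags)
meta = MetaTagger(in_dim=4 * dim, dim=dim, n_tags=n_tags)
loss_fn = nn.CrossEntropyLoss()

char_ids = torch.randint(0, 100, (1, 5))
word_ids = torch.randint(0, 500, (1, 5))
gold = torch.randint(0, n_tags, (1, 5))

# Synchronized training: each model carries its own loss; detach() keeps
# the meta loss from back-propagating into the base encoders (an assumed
# reading of the paper's separate per-model optimization).
c_states, c_logits = char_view(char_ids)
w_states, w_logits = word_view(word_ids)
m_logits = meta(c_states.detach(), w_states.detach())
loss = sum(loss_fn(l.view(-1, n_tags), gold.view(-1))
           for l in (c_logits, w_logits, m_logits))
loss.backward()
```

At prediction time, only the meta-model's logits would be used; the per-view classifiers exist to give each encoder its own training signal.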
Anthology ID:
P18-1246
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2642–2652
URL:
https://aclanthology.org/P18-1246
DOI:
10.18653/v1/P18-1246
Cite (ACL):
Bernd Bohnet, Ryan McDonald, Gonçalo Simões, Daniel Andor, Emily Pitler, and Joshua Maynez. 2018. Morphosyntactic Tagging with a Meta-BiLSTM Model over Context Sensitive Token Encodings. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2642–2652, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Morphosyntactic Tagging with a Meta-BiLSTM Model over Context Sensitive Token Encodings (Bohnet et al., ACL 2018)
PDF:
https://aclanthology.org/P18-1246.pdf
Poster:
P18-1246.Poster.pdf
Code
additional community code
Data
Penn Treebank