Multi-Cell Compositional LSTM for NER Domain Adaptation

Chen Jia, Yue Zhang


Abstract
Cross-domain NER is a challenging yet practical problem. Entity mentions can differ greatly across domains, but the correlations between entity types tend to be more stable. We investigate a multi-cell compositional LSTM structure for multi-task learning, modeling each entity type using a separate cell state. With the help of entity-typed units, cross-domain knowledge transfer can be made at the entity-type level. Theoretically, the resulting distinct feature distributions for each entity type make the model more powerful for cross-domain transfer. Empirically, experiments on four few-shot and zero-shot datasets show that our method significantly outperforms a series of multi-task learning methods and achieves the best results.
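To make the core idea concrete, below is a minimal PyTorch sketch of an LSTM step that keeps one cell state per entity type and softly composes them into a single hidden state, as the abstract describes. This is an illustration under simplifying assumptions, not the authors' exact formulation; all names here (MultiCellLSTMCell, num_types, type_score, etc.) are hypothetical.

```python
# Hypothetical sketch of a "multi-cell" LSTM step: one cell state per entity
# type, composed by soft type weights. Not the paper's exact equations.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiCellLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int, num_types: int):
        super().__init__()
        self.num_types = num_types
        # Shared input/forget/output gates, as in a standard LSTM cell.
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        # One candidate-cell transform per entity type (the "typed units").
        self.cands = nn.ModuleList(
            nn.Linear(input_size + hidden_size, hidden_size)
            for _ in range(num_types)
        )
        # Scores used to weight (compose) the per-type cell states.
        self.type_score = nn.Linear(input_size + hidden_size, num_types)

    def forward(self, x, h_prev, c_prev):
        # x: (batch, input_size); h_prev: (batch, hidden);
        # c_prev: (batch, num_types, hidden), one cell state per entity type.
        z = torch.cat([x, h_prev], dim=-1)
        i, f, o = torch.sigmoid(self.gates(z)).chunk(3, dim=-1)
        # Update each entity-typed cell with its own candidate values.
        c_new = torch.stack(
            [f * c_prev[:, k] + i * torch.tanh(self.cands[k](z))
             for k in range(self.num_types)],
            dim=1,
        )
        # Compose the typed cells into one cell state via soft type weights.
        alpha = F.softmax(self.type_score(z), dim=-1)       # (batch, num_types)
        c_comp = (alpha.unsqueeze(-1) * c_new).sum(dim=1)   # (batch, hidden)
        h_new = o * torch.tanh(c_comp)
        return h_new, c_new

# Usage: one step over a batch of 4 token embeddings with 5 entity types.
cell = MultiCellLSTMCell(input_size=100, hidden_size=200, num_types=5)
x = torch.randn(4, 100)
h0 = torch.zeros(4, 200)
c0 = torch.zeros(4, 5, 200)
h1, c1 = cell(x, h0, c0)
```

Because each entity type keeps its own cell state, the per-type states can be supervised or transferred separately across domains, which is the intuition behind type-level knowledge transfer in the abstract.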
Anthology ID:
2020.acl-main.524
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5906–5917
URL:
https://aclanthology.org/2020.acl-main.524
DOI:
10.18653/v1/2020.acl-main.524
Cite (ACL):
Chen Jia and Yue Zhang. 2020. Multi-Cell Compositional LSTM for NER Domain Adaptation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5906–5917, Online. Association for Computational Linguistics.
Cite (Informal):
Multi-Cell Compositional LSTM for NER Domain Adaptation (Jia & Zhang, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.524.pdf
Video:
http://slideslive.com/38929285