Continual Lifelong Learning in Natural Language Processing: A Survey

Magdalena Biesialska, Katarzyna Biesialska, Marta R. Costa-jussà


Abstract
Continual learning (CL) aims to enable information systems to learn from a continuous data stream across time. However, it is difficult for existing deep learning architectures to learn a new task without largely forgetting previously acquired knowledge. Furthermore, CL is particularly challenging for language learning, as natural language is ambiguous: it is discrete, compositional, and its meaning is context-dependent. In this work, we look at the problem of CL through the lens of various NLP tasks. Our survey discusses major challenges in CL and current methods applied in neural network models. We also provide a critical review of the existing CL evaluation methods and datasets in NLP. Finally, we present our outlook on future research directions.
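To make the forgetting problem the abstract refers to concrete, here is a minimal, self-contained Python sketch. It is not taken from the paper; the tasks, data, and function names are hypothetical, for illustration only. A simple logistic-regression model is trained sequentially on two synthetic tasks with conflicting labelings, and its accuracy on the first task collapses after training on the second.

    # Minimal sketch (not from the paper): catastrophic forgetting with a
    # tiny logistic-regression model trained sequentially on two synthetic
    # binary-classification "tasks". All names and data are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_task(center):
        """Two Gaussian blobs; `center` locates the positive class."""
        X0 = rng.normal(loc=-center, scale=1.0, size=(200, 2))
        X1 = rng.normal(loc=+center, scale=1.0, size=(200, 2))
        X = np.vstack([X0, X1])
        y = np.concatenate([np.zeros(200), np.ones(200)])
        return X, y

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train(w, X, y, lr=0.1, epochs=200):
        """Plain gradient descent on the logistic (log) loss."""
        for _ in range(epochs):
            p = sigmoid(X @ w)
            w -= lr * (X.T @ (p - y)) / len(y)
        return w

    def accuracy(w, X, y):
        return np.mean((sigmoid(X @ w) > 0.5) == y)

    # Task A and Task B swap the class regions, so fitting B with no
    # access to A's data overwrites what was learned for A.
    XA, yA = make_task(np.array([2.0, 2.0]))
    XB, yB = make_task(np.array([-2.0, -2.0]))

    w = np.zeros(2)
    w = train(w, XA, yA)
    print(f"after task A: acc(A)={accuracy(w, XA, yA):.2f}")

    w = train(w, XB, yB)  # sequential training, task A data unavailable
    print(f"after task B: acc(A)={accuracy(w, XA, yA):.2f}, "
          f"acc(B)={accuracy(w, XB, yB):.2f}")

Run as written, the first line prints near-perfect accuracy on task A, and the second shows accuracy on A dropping to near zero once the model has fit task B; this is the failure mode that the CL methods surveyed in the paper (e.g., rehearsal-, regularization-, and architecture-based approaches) aim to prevent.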
Anthology ID: 2020.coling-main.574
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 6523–6541
URL: https://aclanthology.org/2020.coling-main.574
DOI: 10.18653/v1/2020.coling-main.574
Cite (ACL): Magdalena Biesialska, Katarzyna Biesialska, and Marta R. Costa-jussà. 2020. Continual Lifelong Learning in Natural Language Processing: A Survey. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6523–6541, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Continual Lifelong Learning in Natural Language Processing: A Survey (Biesialska et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.574.pdf
Data: GLUE, decaNLP