Continual Learning for Sentence Representations Using Conceptors

Tianlin Liu, Lyle Ungar, João Sedoc


Abstract
Distributed representations of sentences have become ubiquitous in natural language processing tasks. In this paper, we consider a continual learning scenario for sentence representations: Given a sequence of corpora, we aim to optimize the sentence encoder with respect to the new corpus while maintaining its accuracy on the old corpora. To address this problem, we propose to initialize sentence encoders with the help of corpus-independent features, and then sequentially update sentence encoders using Boolean operations of conceptor matrices to learn corpus-dependent features. We evaluate our approach on semantic textual similarity tasks and show that our proposed sentence encoder can continually learn features from new corpora while retaining its competence on previously encountered corpora.
Anthology ID:
N19-1331
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3274–3279
URL:
https://aclanthology.org/N19-1331
DOI:
10.18653/v1/N19-1331
Cite (ACL):
Tianlin Liu, Lyle Ungar, and João Sedoc. 2019. Continual Learning for Sentence Representations Using Conceptors. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 3274–3279, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Continual Learning for Sentence Representations Using Conceptors (Liu et al., NAACL 2019)
PDF:
https://aclanthology.org/N19-1331.pdf
Supplementary:
N19-1331.Supplementary.pdf
Code:
liutianlin0121/contSentEmbed