Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning

Seth Ebner, Felicity Wang, Benjamin Van Durme


Abstract
Many architectures for multi-task learning (MTL) have been proposed to take advantage of transfer among tasks, often involving complex models and training procedures. In this paper, we ask if the sentence-level representations learned in previous approaches provide significant benefit beyond that provided by simply improving word-based representations. To investigate this question, we consider three techniques that ignore sequence information: a syntactically-oblivious pooling encoder, pre-trained non-contextual word embeddings, and unigram generative regularization. Compared to a state-of-the-art MTL approach to textual inference, the simple techniques we use yield similar performance on a universe of task combinations while reducing training time and model size.
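The abstract's "syntactically-oblivious pooling encoder" over pre-trained non-contextual word embeddings can be illustrated with a minimal sketch. This is a hypothetical PyTorch illustration, not the authors' implementation: the class name MeanPoolEncoder and its parameters are invented here, and the paper's actual encoder and training setup may differ. The idea shown is that each sentence is encoded by averaging its token embeddings, so word order is ignored.

```python
# Hypothetical sketch of a bag-of-words (mean-pooling) sentence encoder.
# Not the authors' code; names and arguments are illustrative only.
import torch
import torch.nn as nn

class MeanPoolEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, pretrained=None):
        super().__init__()
        # Embedding table; index 0 is reserved for padding.
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        if pretrained is not None:
            # Optionally initialize from pre-trained non-contextual vectors
            # (e.g., a (vocab_size, embed_dim) tensor of GloVe embeddings).
            self.embed.weight.data.copy_(pretrained)

    def forward(self, token_ids, mask):
        # token_ids, mask: (batch, seq_len); mask is 1 for real tokens, 0 for padding.
        vecs = self.embed(token_ids) * mask.unsqueeze(-1)
        # Mean over non-padding positions; order of tokens does not matter.
        return vecs.sum(dim=1) / mask.sum(dim=1, keepdim=True).clamp(min=1)

# Usage example on a padded batch of two token-id sequences.
enc = MeanPoolEncoder(vocab_size=100, embed_dim=8)
ids = torch.tensor([[5, 7, 2, 0], [3, 9, 0, 0]])
mask = (ids != 0).float()
print(enc(ids, mask).shape)  # torch.Size([2, 8])
```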
Anthology ID:
D19-6105
Volume:
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Colin Cherry, Greg Durrett, George Foster, Reza Haffari, Shahram Khadivi, Nanyun Peng, Xiang Ren, Swabha Swayamdipta
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
40–46
URL:
https://aclanthology.org/D19-6105
DOI:
10.18653/v1/D19-6105
Cite (ACL):
Seth Ebner, Felicity Wang, and Benjamin Van Durme. 2019. Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), pages 40–46, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning (Ebner et al., 2019)
PDF:
https://aclanthology.org/D19-6105.pdf