Sentence Bottleneck Autoencoders from Transformer Language Models

Ivan Montero, Nikolaos Pappas, Noah A. Smith

Abstract
Representation learning for text via pretraining a language model on a large corpus has become a standard starting point for building NLP systems. This approach stands in contrast to autoencoders, also trained on raw text, but with the objective of learning to encode each input as a vector that allows full reconstruction. Autoencoders are attractive because of their latent space structure and generative properties. We therefore explore the construction of a sentence-level autoencoder from a pretrained, frozen transformer language model. We adapt the masked language modeling objective into a generative, denoising one, while training only a sentence bottleneck and a single-layer modified transformer decoder. We demonstrate that the sentence representations discovered by our model achieve better quality than previous methods that extract representations from pretrained transformers, on text similarity tasks, style transfer (an example of controlled generation), and single-sentence classification tasks in the GLUE benchmark, while using fewer parameters than large pretrained models.
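
The abstract describes the architecture only at a high level. The sketch below is a minimal, unofficial PyTorch rendering of the idea (a frozen pretrained encoder, a learned attention-pooled sentence bottleneck, and a single-layer transformer decoder trained with a denoising reconstruction loss); the class name, pooling mechanism, and masking details are illustrative assumptions, not the authors' released autobot implementation.

```python
# Illustrative sketch (not the authors' AutoBot code): a sentence-bottleneck
# autoencoder on top of a frozen pretrained transformer language model.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SentenceBottleneckAE(nn.Module):
    def __init__(self, model_name="roberta-base"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        for p in self.encoder.parameters():  # keep the pretrained LM frozen
            p.requires_grad = False
        d = self.encoder.config.hidden_size
        # Bottleneck (assumed form): a learned query attends over the token
        # states and pools them into a single sentence vector.
        self.query = nn.Parameter(torch.randn(1, 1, d))
        self.pool = nn.MultiheadAttention(d, num_heads=8, batch_first=True)
        # Single-layer transformer decoder reconstructs the token sequence
        # (the paper's decoder is modified; this uses the stock layer).
        layer = nn.TransformerDecoderLayer(d, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=1)
        self.lm_head = nn.Linear(d, self.encoder.config.vocab_size)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids,
                              attention_mask=attention_mask).last_hidden_state
        q = self.query.expand(input_ids.size(0), -1, -1)
        # z: (batch, 1, d) sentence bottleneck vector.
        z, _ = self.pool(q, hidden, hidden,
                         key_padding_mask=attention_mask.eq(0))
        # Decode conditioned on the bottleneck as the only memory.
        tgt = self.encoder.get_input_embeddings()(input_ids)
        out = self.decoder(tgt, memory=z)
        return self.lm_head(out), z

# Denoising, MLM-style reconstruction: corrupt some input tokens with the
# mask token and train the decoder to predict the original sequence.
tok = AutoTokenizer.from_pretrained("roberta-base")
model = SentenceBottleneckAE()
batch = tok(["A sentence to compress and reconstruct."], return_tensors="pt")
corrupted = batch["input_ids"].clone()
mask = torch.rand_like(corrupted, dtype=torch.float) < 0.15  # sketch ignores special tokens
corrupted[mask] = tok.mask_token_id
logits, z = model(corrupted, batch["attention_mask"])
loss = nn.functional.cross_entropy(
    logits.view(-1, logits.size(-1)), batch["input_ids"].view(-1))
```

After training, `z` serves as the sentence representation for similarity, classification, or controlled generation; only the bottleneck, decoder, and output head receive gradients, which is what keeps the parameter count below that of large pretrained models.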
Anthology ID:
2021.emnlp-main.137
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1822–1831
URL:
https://aclanthology.org/2021.emnlp-main.137
DOI:
10.18653/v1/2021.emnlp-main.137
Cite (ACL):
Ivan Montero, Nikolaos Pappas, and Noah A. Smith. 2021. Sentence Bottleneck Autoencoders from Transformer Language Models. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1822–1831, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Sentence Bottleneck Autoencoders from Transformer Language Models (Montero et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.137.pdf
Video:
https://aclanthology.org/2021.emnlp-main.137.mp4
Code:
ivanmontero/autobot
Data:
GLUE