Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding

Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, Virginia de Sa


Abstract
We propose an asymmetric encoder-decoder structure that keeps an RNN as the encoder and uses a CNN as the decoder; the model uses only the subsequent context as supervision. The asymmetry in both the model architecture and the training pair greatly reduces training time. Our contributions are threefold: 1. We design experiments to show that neither an autoregressive decoder nor an RNN decoder is necessary for encoder-decoder models that learn sentence representations, and from these results we draw two findings. 2. These two findings lead to our final model design, which has an RNN encoder and a CNN decoder and learns to encode the current sentence and decode the subsequent contiguous words all at once. 3. With a suite of techniques, our model performs well on downstream tasks and can be trained efficiently on a large unlabelled corpus.
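To make the architecture concrete, here is a minimal PyTorch sketch, assuming a GRU encoder and a small stack of 1-D convolutions as the non-autoregressive decoder. The class name AsymmetricSeq2Seq, the layer sizes, and the fixed context length ctx_len are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of the asymmetric encoder-decoder described in the abstract.
# Hyperparameters and the exact CNN topology are assumptions for illustration.
import torch
import torch.nn as nn

class AsymmetricSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hid_dim=600,
                 ctx_len=30, kernel_size=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # RNN encoder: reads the current sentence into a fixed-size vector.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Non-autoregressive CNN decoder: expands the sentence vector into
        # ctx_len positions and predicts all subsequent words at once.
        self.expand = nn.Linear(hid_dim, hid_dim * ctx_len)
        self.decoder = nn.Sequential(
            nn.Conv1d(hid_dim, hid_dim, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
            nn.Conv1d(hid_dim, hid_dim, kernel_size, padding=kernel_size // 2),
            nn.ReLU(),
        )
        self.out = nn.Linear(hid_dim, vocab_size)
        self.ctx_len = ctx_len
        self.hid_dim = hid_dim

    def forward(self, sentence):
        # sentence: (batch, seq_len) token ids of the current sentence
        _, h = self.encoder(self.embed(sentence))  # h: (1, batch, hid_dim)
        z = h.squeeze(0)                           # sentence representation
        x = self.expand(z).view(-1, self.hid_dim, self.ctx_len)
        x = self.decoder(x)                        # (batch, hid_dim, ctx_len)
        logits = self.out(x.transpose(1, 2))       # (batch, ctx_len, vocab)
        return logits, z                           # z is reused downstream
```

Training would minimize a token-level cross-entropy between logits and the ctx_len contiguous words following the input sentence, computed for all positions in parallel; removing the sequential, step-by-step generation of an autoregressive RNN decoder is what yields the claimed speed-up.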
Anthology ID:
W18-3009
Volume:
Proceedings of the Third Workshop on Representation Learning for NLP
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Isabelle Augenstein, Kris Cao, He He, Felix Hill, Spandana Gella, Jamie Kiros, Hongyuan Mei, Dipendra Misra
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
69–78
URL:
https://aclanthology.org/W18-3009
DOI:
10.18653/v1/W18-3009
Cite (ACL):
Shuai Tang, Hailin Jin, Chen Fang, Zhaowen Wang, and Virginia de Sa. 2018. Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding. In Proceedings of the Third Workshop on Representation Learning for NLP, pages 69–78, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Speeding up Context-based Sentence Representation Learning with Non-autoregressive Convolutional Decoding (Tang et al., RepL4NLP 2018)
PDF:
https://aclanthology.org/W18-3009.pdf
Notes:
 W18-3009.Notes.pdf
Data
MPQA Opinion Corpus, SICK, SNLI, SST