Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement

Jason Lee, Elman Mansimov, Kyunghyun Cho


Abstract
We propose a conditional non-autoregressive neural sequence model based on iterative refinement. The proposed model is designed based on the principles of latent variable models and denoising autoencoders, and is generally applicable to any sequence generation task. We extensively evaluate the proposed model on machine translation (En-De and En-Ro) and image caption generation, and observe that it significantly speeds up decoding while maintaining generation quality comparable to that of the autoregressive counterpart.
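The decoding scheme the abstract describes can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: `initial_decode` and `refine` are toy stand-ins for the parallel first-pass decoder and the denoising step, which in the paper are learned networks that re-predict every target position in parallel.

```python
def initial_decode(src, length):
    # Stand-in for the parallel first-pass decoder: it emits an entire
    # target hypothesis at once (here, just the source padded to `length`).
    return (src + ["<pad>"] * length)[:length]

def refine(src, prev):
    # Stand-in for one deterministic denoising step: a real model would
    # re-predict every position in parallel, conditioned on `src` and `prev`.
    return [tok.upper() if tok != "<pad>" else tok for tok in prev]

def iterative_refinement_decode(src, length, num_steps=3):
    # Non-autoregressive decoding: one parallel pass, then a fixed number
    # of refinement passes over the model's own previous output.
    hyp = initial_decode(src, length)
    for _ in range(num_steps):
        new_hyp = refine(src, hyp)
        if new_hyp == hyp:  # stop early once the output is a fixed point
            break
        hyp = new_hyp
    return hyp

print(iterative_refinement_decode(["ein", "kleines", "haus"], length=4))
```

Because every position is updated in parallel, the cost per pass is constant in the target length, and the number of passes (not the sentence length) bounds the decoding latency.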
Anthology ID:
D18-1149
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1173–1182
URL:
https://aclanthology.org/D18-1149
DOI:
10.18653/v1/D18-1149
Cite (ACL):
Jason Lee, Elman Mansimov, and Kyunghyun Cho. 2018. Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1173–1182, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement (Lee et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1149.pdf
Attachment:
 D18-1149.Attachment.pdf
Video:
 https://aclanthology.org/D18-1149.mp4
Code:
nyu-dl/dl4mt-nonauto (+ additional community code)
Data:
MS COCO, WMT 2014, WMT 2016, WMT 2016 News