Incorporating a Local Translation Mechanism into Non-autoregressive Translation

Xiang Kong, Zhisong Zhang, Eduard Hovy


Abstract
In this work, we introduce a novel local autoregressive translation (LAT) mechanism into non-autoregressive translation (NAT) models to capture local dependencies among target outputs. Specifically, for each target decoding position, instead of only one token, we predict a short sequence of tokens in an autoregressive way. We further design an efficient merging algorithm to align and merge the output pieces into one final output sequence. We integrate LAT into the conditional masked language model (CMLM) (Ghazvininejad et al., 2019) and similarly adopt iterative decoding. Empirical results on five translation tasks show that, compared with CMLM, our method achieves comparable or better performance with fewer decoding iterations, bringing a 2.5x speedup. Further analysis indicates that our method reduces repeated translations and performs better on longer sentences. Our code will be released to the public.
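The abstract's merging step can be pictured with a minimal, hypothetical sketch: if each decoding position emits a short token "piece" and consecutive pieces overlap, the pieces can be combined by greedily aligning suffix/prefix overlaps. The function name merge_pieces and the overlap heuristic below are illustrative assumptions, not the authors' actual algorithm.

    # Hypothetical sketch of merging LAT output pieces (Python).
    # Assumption: each decoding position yields a short token list ("piece"),
    # and adjacent pieces overlap; the paper's exact algorithm may differ.
    def merge_pieces(pieces):
        """Greedily merge pieces by their longest suffix/prefix overlap."""
        merged = []
        for piece in pieces:
            overlap = 0
            # Find the longest suffix of `merged` equal to a prefix of `piece`.
            for k in range(min(len(merged), len(piece)), 0, -1):
                if merged[-k:] == piece[:k]:
                    overlap = k
                    break
            merged.extend(piece[overlap:])
        return merged

    # Toy usage: three 3-token pieces, each overlapping the next by two tokens.
    pieces = [["the", "black", "cat"],
              ["black", "cat", "sat"],
              ["cat", "sat", "down"]]
    print(merge_pieces(pieces))  # -> ['the', 'black', 'cat', 'sat', 'down']

Collapsing overlaps in this fashion is one way such a scheme could reduce repeated translations, which the abstract reports as an empirical benefit of the method.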
Anthology ID:
2020.emnlp-main.79
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1067–1073
URL:
https://aclanthology.org/2020.emnlp-main.79
DOI:
10.18653/v1/2020.emnlp-main.79
Cite (ACL):
Xiang Kong, Zhisong Zhang, and Eduard Hovy. 2020. Incorporating a Local Translation Mechanism into Non-autoregressive Translation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1067–1073, Online. Association for Computational Linguistics.
Cite (Informal):
Incorporating a Local Translation Mechanism into Non-autoregressive Translation (Kong et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.79.pdf
Video:
https://slideslive.com/38939050
Code:
shawnkx/NAT-with-Local-AT
Data:
WMT 2014, WMT 2016, WMT 2016 News