Neural Machine Translation with Reordering Embeddings

Kehai Chen, Rui Wang, Masao Utiyama, Eiichiro Sumita


Abstract
The reordering model plays an important role in phrase-based statistical machine translation. However, few works exploit reordering information in neural machine translation. In this paper, we propose a reordering mechanism that learns the reordering embedding of a word from its contextual information. These learned reordering embeddings are stacked together with self-attention networks to learn sentence representations for machine translation. The reordering mechanism can be easily integrated into both the encoder and the decoder of the Transformer translation system. Experimental results on WMT’14 English-to-German, NIST Chinese-to-English, and WAT Japanese-to-English translation tasks demonstrate that the proposed methods significantly improve the performance of the Transformer.
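The abstract describes learning a per-word reordering embedding from context and combining it with self-attention. As a rough illustration only (the paper's actual formulation is not reproduced here), one way such a mechanism could look is a soft alignment of each contextual word state over positional embeddings, whose expectation acts as a learned "reordering embedding" fused back into the representation. All function and variable names below are hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def reordering_embedding(H, P, W_q, W_k):
    """Hypothetical sketch: each word attends over all positions,
    yielding a soft reordering distribution; its expected positional
    embedding is added back as a 'reordering embedding'."""
    # H: (n, d) contextual word states; P: (n, d) positional embeddings
    d = H.shape[1]
    scores = (H @ W_q) @ (H @ W_k).T / np.sqrt(d)   # (n, n) pairwise scores
    align = softmax(scores, axis=-1)                # soft position assignment
    R = align @ P                                   # expected positional embedding
    return H + R                                    # fuse reordering info into states

# Toy usage with random parameters
rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.normal(size=(n, d))
P = rng.normal(size=(n, d))
W_q = rng.normal(size=(d, d))
W_k = rng.normal(size=(d, d))
out = reordering_embedding(H, P, W_q, W_k)
print(out.shape)  # (5, 8)
```

In a Transformer, a layer like this could be stacked with the standard self-attention sublayers on both the encoder and decoder sides, as the abstract suggests; the sketch above only conveys the general idea of deriving reordering information from context.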
Anthology ID:
P19-1174
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1787–1799
URL:
https://aclanthology.org/P19-1174
DOI:
10.18653/v1/P19-1174
Cite (ACL):
Kehai Chen, Rui Wang, Masao Utiyama, and Eiichiro Sumita. 2019. Neural Machine Translation with Reordering Embeddings. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1787–1799, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation with Reordering Embeddings (Chen et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1174.pdf
Video:
https://aclanthology.org/P19-1174.mp4
Data
ASPEC