Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation

Jie Zhou, Ying Cao, Xuguang Wang, Peng Li, Wei Xu


Abstract
Neural machine translation (NMT) aims at solving machine translation (MT) problems using neural networks and has exhibited promising results in recent years. However, most existing NMT models are shallow, and there is still a performance gap between a single NMT model and the best conventional MT system. In this work, we introduce a new type of linear connection, named fast-forward connections, based on deep Long Short-Term Memory (LSTM) networks, and an interleaved bi-directional architecture for stacking the LSTM layers. Fast-forward connections play an essential role in propagating gradients and building a deep topology of depth 16. On the WMT’14 English-to-French task, we achieve BLEU=37.7 with a single attention model, which outperforms the corresponding single shallow model by 6.2 BLEU points. This is the first time that a single NMT model achieves state-of-the-art performance and outperforms the best conventional model, by 0.7 BLEU points. We can still achieve BLEU=36.3 even without using an attention mechanism. After special handling of unknown words and model ensembling, we obtain the best score reported to date on this task, BLEU=40.4. Our models are also validated on the more difficult WMT’14 English-to-German task.
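To make the abstract's architectural description concrete, here is a minimal PyTorch sketch. It is illustrative only, not the paper's implementation: it assumes the fast-forward connections are purely linear (non-recurrent) paths concatenated with each LSTM layer's output, and that "interleaved bi-directional" means successive layers scan the sequence in opposite directions. The class and parameter names (FastForwardLSTMStack, hidden_size, depth) are hypothetical.

import torch
import torch.nn as nn

class FastForwardLSTMStack(nn.Module):
    """Stacked LSTMs with linear fast-forward paths and interleaved directions (sketch)."""

    def __init__(self, input_size, hidden_size, depth):
        super().__init__()
        self.lstms = nn.ModuleList()
        self.ffs = nn.ModuleList()  # linear fast-forward paths, one per layer
        size = input_size
        for _ in range(depth):
            self.lstms.append(nn.LSTM(size, hidden_size, batch_first=True))
            self.ffs.append(nn.Linear(size, hidden_size, bias=False))
            size = 2 * hidden_size  # next layer sees [LSTM output ; fast-forward path]

    def forward(self, x):
        for k, (lstm, ff) in enumerate(zip(self.lstms, self.ffs)):
            # Interleaved bi-directionality: odd layers scan right-to-left.
            inp = torch.flip(x, dims=[1]) if k % 2 == 1 else x
            h, _ = lstm(inp)
            if k % 2 == 1:
                h = torch.flip(h, dims=[1])  # realign to forward time order
            # Fast-forward connection: a purely linear path alongside the LSTM;
            # concatenating it into the next layer's input gives gradients a
            # route that skips the recurrent nonlinearity, which is what lets
            # the stack grow to depth 16.
            x = torch.cat([h, ff(x)], dim=-1)
        return x

model = FastForwardLSTMStack(input_size=64, hidden_size=128, depth=4)
y = model(torch.randn(2, 10, 64))  # (batch=2, time=10, features=64)
print(y.shape)                     # torch.Size([2, 10, 256])

Under these assumptions, each layer's output width doubles relative to hidden_size because the linear path is concatenated rather than summed; a residual-style sum would be an equally plausible reading of "linear connections".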
Anthology ID:
Q16-1027
Volume:
Transactions of the Association for Computational Linguistics, Volume 4
Year:
2016
Address:
Cambridge, MA
Editors:
Lillian Lee, Mark Johnson, Kristina Toutanova
Venue:
TACL
Publisher:
MIT Press
Pages:
371–383
URL:
https://aclanthology.org/Q16-1027
DOI:
10.1162/tacl_a_00105
Cite (ACL):
Jie Zhou, Ying Cao, Xuguang Wang, Peng Li, and Wei Xu. 2016. Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation. Transactions of the Association for Computational Linguistics, 4:371–383.
Cite (Informal):
Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation (Zhou et al., TACL 2016)
PDF:
https://aclanthology.org/Q16-1027.pdf
Video:
https://aclanthology.org/Q16-1027.mp4
Data
Europarl
WMT 2014