Sequence-to-Dependency Neural Machine Translation

Shuangzhi Wu, Dongdong Zhang, Nan Yang, Mu Li, Ming Zhou


Abstract
A typical Neural Machine Translation (NMT) model generates translations from left to right as a linear sequence, without explicitly modeling the latent syntactic structure of the target sentence. Inspired by the success of using target-language syntactic knowledge to improve statistical machine translation, in this paper we propose a novel Sequence-to-Dependency Neural Machine Translation (SD-NMT) method, in which the target word sequence and its corresponding dependency structure are jointly constructed and modeled, and this partial structure is used as context to facilitate word generation. Experimental results show that the proposed method significantly outperforms state-of-the-art baselines on Chinese-English and Japanese-English translation tasks.
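The abstract describes a decoder that interleaves target-word generation with construction of the target-side dependency structure. The snippet below is a minimal illustrative sketch, not the paper's implementation: it shows one decoder step that conditions on a hypothetical stack-top word from the partial dependency tree and jointly scores the next word and a shift-reduce transition action. All class, parameter, and variable names are assumptions.

```python
# Illustrative sketch only (hypothetical names); not the SD-NMT implementation.
import torch
import torch.nn as nn


class JointWordActionDecoderStep(nn.Module):
    """One decoding step that jointly predicts the next target word and a
    dependency transition action (e.g., SHIFT / LEFT-ARC / RIGHT-ARC)."""

    def __init__(self, vocab_size, num_actions, emb_dim=256, hid_dim=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # The decoder state is updated from the previously generated word and
        # the stack-top word of the partial dependency structure.
        self.rnn = nn.GRUCell(2 * emb_dim, hid_dim)
        self.word_out = nn.Linear(hid_dim, vocab_size)    # next-word scores
        self.action_out = nn.Linear(hid_dim, num_actions) # transition-action scores

    def forward(self, prev_word, stack_top_word, state):
        # prev_word, stack_top_word: LongTensor of shape (batch,)
        x = torch.cat([self.word_emb(prev_word),
                       self.word_emb(stack_top_word)], dim=-1)
        new_state = self.rnn(x, state)
        return self.word_out(new_state), self.action_out(new_state), new_state


# Toy usage: a single step with random ids and a zero-initialized state.
step = JointWordActionDecoderStep(vocab_size=1000, num_actions=3)
state = torch.zeros(1, 512)
w_logits, a_logits, state = step(torch.tensor([5]), torch.tensor([7]), state)
print(w_logits.shape, a_logits.shape)  # torch.Size([1, 1000]) torch.Size([1, 3])
```

In this sketch the partial dependency tree enters only through the stack-top word embedding; the paper's full model additionally attends over the source sentence and maintains richer structural context.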
Anthology ID:
P17-1065
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
698–707
URL:
https://aclanthology.org/P17-1065
DOI:
10.18653/v1/P17-1065
Cite (ACL):
Shuangzhi Wu, Dongdong Zhang, Nan Yang, Mu Li, and Ming Zhou. 2017. Sequence-to-Dependency Neural Machine Translation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 698–707, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Sequence-to-Dependency Neural Machine Translation (Wu et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1065.pdf
Video:
https://aclanthology.org/P17-1065.mp4
Data:
ASPEC