Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism

Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu


Abstract
Pronouns are frequently omitted in pro-drop languages, such as Chinese, generally leading to significant challenges with respect to the production of complete translations. Recently, Wang et al. (2018) proposed a novel reconstruction-based approach to alleviating dropped pronoun (DP) translation problems for neural machine translation models. In this work, we improve the original model from two perspectives. First, we employ a shared reconstructor to better exploit encoder and decoder representations. Second, we jointly learn to translate and predict DPs in an end-to-end manner, to avoid the errors propagated from an external DP prediction model. Experimental results show that our approach significantly improves both translation performance and DP prediction accuracy.
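The abstract's two ideas can be pictured concretely: one reconstructor module, with a single set of parameters, reads both the encoder's and the decoder's hidden states and is trained to regenerate the source sentence with its dropped pronouns re-inserted, while the normal translation loss is kept. Below is a minimal, illustrative sketch of that training signal, not the authors' implementation: the PyTorch module names, the mean-pooled context standing in for full attention, equal hidden sizes for encoder and decoder states, and the simple weighted sum of losses are all simplifying assumptions made for this sketch.

```python
# Illustrative sketch only (assumed shapes and APIs, not the paper's code):
# a shared reconstructor applied to encoder AND decoder states, trained to
# regenerate the DP-annotated source jointly with the translation loss.
import torch
import torch.nn as nn


class SharedReconstructor(nn.Module):
    """One set of parameters reused for encoder-side and decoder-side states."""

    def __init__(self, hidden_size: int, src_vocab_size: int):
        super().__init__()
        self.embed = nn.Embedding(src_vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, src_vocab_size)

    def forward(self, states: torch.Tensor, rec_inputs: torch.Tensor) -> torch.Tensor:
        # states: (batch, time, hidden) hidden states from encoder OR decoder.
        # rec_inputs: (batch, len) teacher-forced DP-annotated source tokens.
        context = states.mean(dim=1, keepdim=True)      # crude stand-in for attention
        init_h = context.transpose(0, 1).contiguous()   # (1, batch, hidden) initial state
        output, _ = self.gru(self.embed(rec_inputs), init_h)
        return self.proj(output)                        # (batch, len, src_vocab)


def joint_loss(trans_logits, tgt, enc_states, dec_states,
               reconstructor, dp_annotated_src, lambda_rec: float = 1.0):
    """Translation loss plus reconstruction losses from the shared module.

    dp_annotated_src holds source token ids with dropped pronouns re-inserted,
    so reconstructing it is what forces the model to predict DPs jointly.
    """
    ce = nn.CrossEntropyLoss()
    loss_trans = ce(trans_logits.reshape(-1, trans_logits.size(-1)), tgt.reshape(-1))

    # Shifted inputs/targets for teacher-forced reconstruction.
    rec_in, rec_tgt = dp_annotated_src[:, :-1], dp_annotated_src[:, 1:]

    # The *same* reconstructor (shared parameters) reads both representations.
    rec_enc = reconstructor(enc_states, rec_in)
    rec_dec = reconstructor(dec_states, rec_in)
    loss_rec = (ce(rec_enc.reshape(-1, rec_enc.size(-1)), rec_tgt.reshape(-1)) +
                ce(rec_dec.reshape(-1, rec_dec.size(-1)), rec_tgt.reshape(-1)))

    return loss_trans + lambda_rec * loss_rec
```

Under this reading, sharing one reconstructor keeps the parameter count down while supervising both the encoder and decoder representations to retain dropped-pronoun information, and training on the DP-annotated source end-to-end removes the need for a separate, error-prone external DP prediction step.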
Anthology ID:
D18-1333
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2997–3002
URL:
https://aclanthology.org/D18-1333
DOI:
10.18653/v1/D18-1333
Cite (ACL):
Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2018. Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2997–3002, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Learning to Jointly Translate and Predict Dropped Pronouns with a Shared Reconstruction Mechanism (Wang et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1333.pdf