Shared-Private Bilingual Word Embeddings for Neural Machine Translation

Xuebo Liu, Derek F. Wong, Yang Liu, Lidia S. Chao, Tong Xiao, Jingbo Zhu


Abstract
Word embeddings are central to neural machine translation (NMT) and have attracted intensive research interest in recent years. In NMT, the source embedding acts as the entrance of the model while the target embedding acts as the terminal; together, these layers occupy most of the model parameters devoted to representation learning. Moreover, they interact only indirectly through the soft-attention mechanism, which leaves them comparatively isolated from each other. In this paper, we propose shared-private bilingual word embeddings, which establish a closer relationship between the source and target embeddings while also reducing the number of model parameters. Similar source and target words share a portion of their embedding features and cooperatively learn these common representation units. Experiments on 5 language pairs belonging to 6 different language families and written in 5 different alphabets demonstrate that the proposed model yields a significant performance boost over strong baselines with dramatically fewer model parameters.
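The sharing scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a toy bilingual lexicon that aligns each source word with one similar target word, and splits each embedding into a shared block (tied across the aligned pair) and a private block (separate per language). The vocabularies, dimensions, and alignment below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 8      # total embedding size (toy value)
SHARED_DIM = 4   # features shared between aligned source/target words

# Hypothetical toy vocabularies; assume a bilingual lexicon aligns
# src_vocab[i] with tgt_vocab[i] (e.g. by similarity).
src_vocab = ["hello", "cat"]
tgt_vocab = ["hallo", "katze"]

# Shared features: one block per aligned word pair, learned jointly.
shared = rng.normal(size=(len(src_vocab), SHARED_DIM))
# Private features: separate blocks for the source and target sides.
src_private = rng.normal(size=(len(src_vocab), EMB_DIM - SHARED_DIM))
tgt_private = rng.normal(size=(len(tgt_vocab), EMB_DIM - SHARED_DIM))

def src_embedding(i):
    # A source embedding = its pair's shared block + its own private block.
    return np.concatenate([shared[i], src_private[i]])

def tgt_embedding(i):
    return np.concatenate([shared[i], tgt_private[i]])

# An aligned pair shares exactly its first SHARED_DIM features.
assert np.allclose(src_embedding(0)[:SHARED_DIM],
                   tgt_embedding(0)[:SHARED_DIM])

# Parameter count: sharing vs. two fully independent embedding tables.
tied_params = shared.size + src_private.size + tgt_private.size
untied_params = (len(src_vocab) + len(tgt_vocab)) * EMB_DIM
print(tied_params, untied_params)  # sharing uses fewer parameters
```

In this toy setting the shared-private scheme stores 24 parameters instead of 32; the savings grow with vocabulary size and the fraction of features that are shared, which is the source of the parameter reduction the abstract reports.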
Anthology ID:
P19-1352
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3613–3622
URL:
https://aclanthology.org/P19-1352
DOI:
10.18653/v1/P19-1352
Cite (ACL):
Xuebo Liu, Derek F. Wong, Yang Liu, Lidia S. Chao, Tong Xiao, and Jingbo Zhu. 2019. Shared-Private Bilingual Word Embeddings for Neural Machine Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3613–3622, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Shared-Private Bilingual Word Embeddings for Neural Machine Translation (Liu et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1352.pdf
Video:
https://aclanthology.org/P19-1352.mp4