Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation

Nikolaos Pappas, Lesly Miculicich, James Henderson


Abstract
Tying the weights of the target word embeddings with the target word classifiers of neural machine translation models leads to faster training and often to better translation quality. Given the success of this parameter sharing, we investigate other forms of sharing that lie between no sharing and hard equality of parameters. In particular, we propose a structure-aware output layer which captures the semantic structure of the output space of words within a joint input-output embedding. The model is a generalized form of weight tying which shares parameters but allows a more flexible relationship with input word embeddings to be learned and allows the effective capacity of the output layer to be controlled. In addition, the model shares weights across output classifiers and translation contexts, which allows it to better leverage prior knowledge about them. Our evaluation on English-to-Finnish and English-to-German datasets shows the effectiveness of the method against strong encoder-decoder baselines trained with or without weight tying.
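
Based on the abstract, the output layer can be read as projecting both the tied target word embeddings and the decoder context into a shared joint space and scoring output words by similarity in that space. Below is a minimal PyTorch sketch of this idea; it is not the authors' released implementation (see the linked repository), and the module name, the tanh non-linearity, and the dimension names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class JointEmbeddingOutput(nn.Module):
    """Sketch of a joint input-output embedding output layer (assumed form)."""

    def __init__(self, vocab_size, emb_dim, ctx_dim, joint_dim):
        super().__init__()
        # Target word embeddings, shared between the decoder input and the
        # output classifiers, as in weight tying.
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Learned projections into the joint space; joint_dim controls the
        # effective capacity of the output layer (an assumption of this sketch).
        self.word_proj = nn.Linear(emb_dim, joint_dim)
        self.ctx_proj = nn.Linear(ctx_dim, joint_dim)
        self.bias = nn.Parameter(torch.zeros(vocab_size))

    def forward(self, context):
        # context: (batch, ctx_dim) decoder state at the current time step.
        e = torch.tanh(self.word_proj(self.embedding.weight))  # (vocab, joint)
        h = torch.tanh(self.ctx_proj(context))                 # (batch, joint)
        return h @ e.t() + self.bias                           # (batch, vocab)
```

Under these assumptions, standard weight tying is recovered as the special case where the projections are identity maps and the non-linearity is dropped, so the logits reduce to the dot product between the decoder state and the tied embedding matrix; the joint dimension then acts as the knob on the output layer's effective capacity.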
Anthology ID:
W18-6308
Volume:
Proceedings of the Third Conference on Machine Translation: Research Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Ondřej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana Neves, Matt Post, Lucia Specia, Marco Turchi, Karin Verspoor
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
73–83
URL:
https://aclanthology.org/W18-6308
DOI:
10.18653/v1/W18-6308
Cite (ACL):
Nikolaos Pappas, Lesly Miculicich, and James Henderson. 2018. Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation. In Proceedings of the Third Conference on Machine Translation: Research Papers, pages 73–83, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation (Pappas et al., WMT 2018)
PDF:
https://aclanthology.org/W18-6308.pdf
Code:
idiap/joint-embedding-nmt