Zero-Resource Translation with Multi-Lingual Neural Machine Translation

Orhan Firat1, Baskaran Sankaran2, Yaser Al-Onaizan2, Fatos T. Yarman Vural1, Kyunghyun Cho3
1Middle East Technical University, 2IBM, 3New York University


Abstract

In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. We empirically show that, when used together with novel many-to-one translation strategies, this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than the pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.
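
The core mechanism the abstract refers to, adapting a multi-way, multilingual model to an unseen language pair by updating only attention-related parameters, can be illustrated with a minimal sketch. The following PyTorch code is a hypothetical toy illustration, not the paper's implementation: the model class, parameter names, and dimensions are assumptions chosen for brevity. It shows the pattern of freezing all shared encoder/decoder parameters and finetuning only a copy of the attention parameters on the zero-resource pair.

import torch
import torch.nn as nn

class TinyNMT(nn.Module):
    # Hypothetical minimal attention-based encoder-decoder; names are
    # illustrative only, not the architecture used in the paper.
    def __init__(self, vocab=100, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.attention = nn.Linear(dim, dim)  # attention-related parameters
        self.out = nn.Linear(dim, vocab)

    def forward(self, src, tgt):
        enc, _ = self.encoder(self.embed(src))
        dec, _ = self.decoder(self.embed(tgt))
        # dot-product attention of decoder states over encoder states
        scores = torch.bmm(self.attention(dec), enc.transpose(1, 2))
        ctx = torch.bmm(torch.softmax(scores, dim=-1), enc)
        return self.out(dec + ctx)

model = TinyNMT()

# Freeze all shared parameters; only the attention-related parameters
# (the "one additional copy" kept per zero-resource pair) stay trainable.
for name, p in model.named_parameters():
    p.requires_grad = name.startswith("attention")

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-4)

# Finetuning step on a (here random, placeholder) zero-resource batch.
src = torch.randint(0, 100, (2, 7))
tgt = torch.randint(0, 100, (2, 5))
loss = model(src, tgt).mean()  # placeholder objective for the sketch
loss.backward()
optimizer.step()

Only the attention parameters receive gradient updates here; in the paper's setting these would be finetuned on pseudo-parallel or pivoted data for the zero-resource pair while the multilingual encoders and decoders remain shared.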