Document Level NMT of Low-Resource Languages with Backtranslation

Sami Ul Haq, Sadaf Abdul Rauf, Arsalan Shaukat, Abdullah Saeed


Abstract
This paper describes our system submission to the WMT20 shared task on similar language translation. We examined the use of document-level neural machine translation (NMT) systems for the low-resource, similar language pair Marathi−Hindi. Our system extends the state-of-the-art Transformer architecture with hierarchical attention networks to incorporate contextual information. Since NMT requires a large amount of parallel data, which is not available for this task, our approach focuses on utilizing monolingual data with back-translation to train our models. Our experiments reveal that document-level NMT can be a reasonable alternative to sentence-level NMT for improving the translation quality of low-resource languages, even when trained on synthetic data.
Anthology ID:
2020.wmt-1.53
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
442–446
URL:
https://aclanthology.org/2020.wmt-1.53
Cite (ACL):
Sami Ul Haq, Sadaf Abdul Rauf, Arsalan Shaukat, and Abdullah Saeed. 2020. Document Level NMT of Low-Resource Languages with Backtranslation. In Proceedings of the Fifth Conference on Machine Translation, pages 442–446, Online. Association for Computational Linguistics.
Cite (Informal):
Document Level NMT of Low-Resource Languages with Backtranslation (Ul Haq et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.53.pdf
Video:
https://slideslive.com/38939608