CUNI Transformer Neural MT System for WMT18

Martin Popel


Abstract
We describe our NMT system submitted to the WMT2018 shared task in news translation. Our system is based on the Transformer model (Vaswani et al., 2017). We use an improved technique of backtranslation, where we iterate the process of translating monolingual data in one direction and training an NMT model for the opposite direction using synthetic parallel data. We apply a simple but effective filtering of the synthetic data. We pre-process the input sentences using coreference resolution in order to disambiguate the gender of pro-dropped personal pronouns. Finally, we apply two simple post-processing substitutions on the translated output. Our system is significantly (p < 0.05) better than all other English-Czech and Czech-English systems in WMT2018.
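The iterated back-translation described in the abstract alternates between translating monolingual data with the current model for one direction and training the model for the opposite direction on the resulting synthetic pairs, with a filtering step in between. The sketch below illustrates that loop; the function bodies, the `translate`/`train` stand-ins, and the length-ratio filter are illustrative assumptions, not the paper's actual training code or its exact filtering criterion.

```python
def length_ratio_ok(src, tgt, max_ratio=2.0):
    """One plausible 'simple but effective' synthetic-data filter:
    discard sentence pairs whose token-length ratio is implausible.
    (Hypothetical heuristic; the paper's filter may differ.)"""
    n_src, n_tgt = len(src.split()), len(tgt.split())
    if n_src == 0 or n_tgt == 0:
        return False
    return max(n_src, n_tgt) / min(n_src, n_tgt) <= max_ratio


def iterated_backtranslation(mono_src, mono_tgt, translate, train, rounds=2):
    """Hedged sketch of the iteration: use the current tgt->src model to
    back-translate target-side monolingual text into synthetic (src, tgt)
    pairs, filter them, train the src->tgt model on them -- then do the
    same in the opposite direction, and repeat."""
    model_fwd, model_bwd = None, None  # src->tgt and tgt->src models
    for _ in range(rounds):
        # Target monolingual data -> synthetic source side; train forward model.
        synth = [(translate(model_bwd, t, "tgt->src"), t) for t in mono_tgt]
        synth = [(s, t) for s, t in synth if length_ratio_ok(s, t)]
        model_fwd = train(synth, "src->tgt")
        # Source monolingual data -> synthetic target side; train backward model.
        synth = [(s, translate(model_fwd, s, "src->tgt")) for s in mono_src]
        synth = [(s, t) for s, t in synth if length_ratio_ok(s, t)]
        model_bwd = train(synth, "tgt->src")
    return model_fwd, model_bwd
```

The filtering step is what keeps each round from amplifying the previous model's worst outputs: pairs with degenerate lengths (empty or wildly mismatched) are a common symptom of translation failure and are dropped before training.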
Anthology ID:
W18-6424
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Belgium, Brussels
Editors:
Ondřej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana Neves, Matt Post, Lucia Specia, Marco Turchi, Karin Verspoor
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
482–487
URL:
https://aclanthology.org/W18-6424
DOI:
10.18653/v1/W18-6424
Cite (ACL):
Martin Popel. 2018. CUNI Transformer Neural MT System for WMT18. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 482–487, Belgium, Brussels. Association for Computational Linguistics.
Cite (Informal):
CUNI Transformer Neural MT System for WMT18 (Popel, WMT 2018)
PDF:
https://aclanthology.org/W18-6424.pdf