NICT's Submission To WAT 2020: How Effective Are Simple Many-To-Many Neural Machine Translation Models?

Raj Dabre, Abhisek Chakrabarty


Abstract
In this paper we describe our team's (NICT-5) Neural Machine Translation (NMT) models whose translations were submitted to the shared tasks of the 7th Workshop on Asian Translation. We participated in the Indic language multilingual sub-task as well as the NICT-SAP multilingual multi-domain sub-task. We focused on naive many-to-many NMT models, which gave reasonable translation quality despite their simplicity. Our observations are twofold: (a) many-to-many models suffer from a lack of consistency, where the translation quality for some language pairs is very good but for others it is terrible when compared against one-to-many and many-to-one baselines; (b) oversampling smaller corpora does not necessarily give the best translation quality for the language pairs associated with those corpora.
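The two ingredients the abstract refers to, a single many-to-many model and oversampling of smaller corpora, can be illustrated with a short sketch. This is not the authors' code: the target-language tag format follows the common "prepend a language token" convention, temperature-based sampling is assumed as the oversampling scheme, and the corpus names and sizes are hypothetical.

```python
# Minimal sketch of a naive many-to-many NMT data pipeline (assumptions noted above).
import random


def tag_source(src_sentence: str, tgt_lang: str) -> str:
    """Prepend a target-language token so one model can translate in all directions."""
    return f"<2{tgt_lang}> {src_sentence}"


def sampling_probs(corpus_sizes: dict[str, int], temperature: float) -> dict[str, float]:
    """Temperature-based sampling: p_i is proportional to n_i^(1/T).

    T=1 keeps the true corpus proportions; larger T flattens them,
    i.e. oversamples the smaller corpora.
    """
    weights = {pair: n ** (1.0 / temperature) for pair, n in corpus_sizes.items()}
    total = sum(weights.values())
    return {pair: w / total for pair, w in weights.items()}


# Hypothetical corpus sizes (sentence pairs) per language pair.
sizes = {"en-hi": 1_600_000, "en-bn": 400_000, "en-ta": 100_000}

for t in (1.0, 5.0):
    probs = sampling_probs(sizes, temperature=t)
    print(f"T={t}: " + ", ".join(f"{p}={q:.2f}" for p, q in probs.items()))

# Draw the language pair of the next training example from these probabilities.
probs = sampling_probs(sizes, temperature=5.0)
pair = random.choices(list(probs), weights=list(probs.values()), k=1)[0]
print(tag_source("example source sentence", pair.split("-")[1]))
```

At T=1 the largest corpus dominates; raising T trades translation quality on high-resource pairs for low-resource ones, which is exactly the trade-off the abstract's second observation cautions about.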
Anthology ID:
2020.wat-1.9
Volume:
Proceedings of the 7th Workshop on Asian Translation
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Win Pa Pa, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino, Hiroshi Manabe, Katsuhito Sudoh, Sadao Kurohashi, Pushpak Bhattacharyya
Venue:
WAT
Publisher:
Association for Computational Linguistics
Pages:
98–102
URL:
https://aclanthology.org/2020.wat-1.9
Cite (ACL):
Raj Dabre and Abhisek Chakrabarty. 2020. NICT's Submission To WAT 2020: How Effective Are Simple Many-To-Many Neural Machine Translation Models? In Proceedings of the 7th Workshop on Asian Translation, pages 98–102, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
NICT's Submission To WAT 2020: How Effective Are Simple Many-To-Many Neural Machine Translation Models? (Dabre & Chakrabarty, WAT 2020)
PDF:
https://aclanthology.org/2020.wat-1.9.pdf