Neural Machine Translation for Bilingually Scarce Scenarios: a Deep Multi-Task Learning Approach

Poorya Zaremoodi, Gholamreza Haffari


Abstract
Neural machine translation requires a large amount of parallel training text to learn a reasonable-quality translation model. This is particularly problematic for language pairs for which enough parallel text is not available. In this paper, we use monolingual linguistic resources on the source side to address this challenging problem based on a multi-task learning approach. More specifically, we scaffold the machine translation task on auxiliary tasks including semantic parsing, syntactic parsing, and named-entity recognition. This effectively injects semantic and/or syntactic knowledge into the translation model, which would otherwise require a large amount of training bitext to learn from. We empirically analyze and show the effectiveness of our multi-task learning approach on three translation tasks: English-to-French, English-to-Farsi, and English-to-Vietnamese.
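The core idea, one model trained jointly on translation and on auxiliary source-side tasks so that auxiliary supervision shapes the shared representations, can be illustrated with a minimal sketch. The PyTorch code below is an illustrative assumption, not the authors' implementation: it uses a single shared GRU encoder with one decoder head per task, a round-robin task schedule, and toy random batches; all names, dimensions, and the task inventory are hypothetical stand-ins.

```python
# Minimal multi-task seq2seq sketch (assumed setup, not the paper's exact model):
# a shared encoder feeds task-specific decoders for translation and the
# auxiliary tasks (syntactic parsing, NER), so every task's gradient
# updates the shared encoder parameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                  # src: (batch, src_len)
        outputs, state = self.rnn(self.embed(src))
        return outputs, state                 # state is shared across tasks

class TaskDecoder(nn.Module):
    """One decoder head per task; auxiliary targets are linearized
    parse trees or tag sequences over a task-specific vocabulary."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tgt_in, state):         # teacher forcing
        outputs, _ = self.rnn(self.embed(tgt_in), state)
        return self.out(outputs)               # (batch, tgt_len, vocab)

# Hypothetical task inventory with made-up target vocabulary sizes.
tasks = {"translation": 1000, "syntactic_parsing": 200, "ner": 20}
encoder = SharedEncoder(vocab_size=1000)
decoders = {name: TaskDecoder(v) for name, v in tasks.items()}
params = list(encoder.parameters()) + [p for d in decoders.values()
                                       for p in d.parameters()]
optimizer = torch.optim.Adam(params, lr=1e-3)

def train_step(task, src, tgt_in, tgt_out):
    """One update on a batch from `task`; gradients flow into the shared encoder."""
    _, state = encoder(src)
    logits = decoders[task](tgt_in, state)
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           tgt_out.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Round-robin over tasks with toy random batches (stand-ins for real data).
for task, vocab in tasks.items():
    src = torch.randint(0, 1000, (8, 12))
    tgt_in = torch.randint(0, vocab, (8, 10))
    tgt_out = torch.randint(0, vocab, (8, 10))
    print(task, train_step(task, src, tgt_in, tgt_out))
```

Because every task backpropagates through the same encoder, supervision from parsing and named-entity recognition shapes the representations the translation decoder consumes, which is the knowledge-injection mechanism the abstract describes.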
Anthology ID:
N18-1123
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1356–1365
URL:
https://aclanthology.org/N18-1123
DOI:
10.18653/v1/N18-1123
Cite (ACL):
Poorya Zaremoodi and Gholamreza Haffari. 2018. Neural Machine Translation for Bilingually Scarce Scenarios: a Deep Multi-Task Learning Approach. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 1356–1365, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation for Bilingually Scarce Scenarios: a Deep Multi-Task Learning Approach (Zaremoodi & Haffari, NAACL 2018)
PDF:
https://aclanthology.org/N18-1123.pdf