Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation

Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, Masaru Kitsuregawa


Abstract
Neural network methods exhibit strong performance only in a few resource-rich domains. Practitioners therefore employ domain adaptation from resource-rich domains that are, in most cases, distant from the target domain. Domain adaptation between distant domains (e.g., movie subtitles and research papers), however, cannot be performed effectively due to mismatches in vocabulary; the model will encounter many domain-specific words (e.g., “angstrom”) and words whose meanings shift across domains (e.g., “conductor”). In this study, aiming to solve these vocabulary mismatches in domain adaptation for neural machine translation (NMT), we propose vocabulary adaptation, a simple method for effective fine-tuning that adapts the embedding layers of a given pretrained NMT model to the target domain. Prior to fine-tuning, our method replaces the embedding layers of the NMT model with embeddings obtained by projecting general word embeddings, induced from monolingual data in the target domain, onto the source-domain embedding space. Experimental results indicate that our method improves the performance of conventional fine-tuning by 3.86 and 3.28 BLEU points in En-Ja and De-En translation, respectively.
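To make the projection step concrete, the following is a minimal sketch of the general idea described in the abstract, assuming a least-squares linear map fitted on words shared by the pretrained NMT vocabulary and the target-domain vocabulary. All function and variable names (adapt_embeddings, nmt_emb, tgt_emb, etc.) are illustrative and are not taken from the paper or the released code, which may use a different alignment method.

import numpy as np

def adapt_embeddings(nmt_emb, nmt_vocab, tgt_emb, tgt_vocab):
    """Project target-domain word embeddings (tgt_emb) into the pretrained
    NMT model's embedding space, using words shared by both vocabularies
    to fit a least-squares linear map."""
    shared = [w for w in tgt_vocab if w in nmt_vocab]
    A = np.stack([tgt_emb[tgt_vocab[w]] for w in shared])  # target-domain vectors
    B = np.stack([nmt_emb[nmt_vocab[w]] for w in shared])  # NMT-space vectors
    W, *_ = np.linalg.lstsq(A, B, rcond=None)              # solve A @ W ~ B
    # The projected matrix covers the whole target-domain vocabulary, including
    # domain-specific words absent from the NMT vocabulary; it would replace the
    # model's embedding layers before fine-tuning on in-domain parallel data.
    return tgt_emb @ W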
Anthology ID: 2020.findings-emnlp.381
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 4269–4279
URL: https://aclanthology.org/2020.findings-emnlp.381
DOI: 10.18653/v1/2020.findings-emnlp.381
Cite (ACL): Shoetsu Sato, Jin Sakuma, Naoki Yoshinaga, Masashi Toyoda, and Masaru Kitsuregawa. 2020. Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4269–4279, Online. Association for Computational Linguistics.
Cite (Informal): Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation (Sato et al., Findings 2020)
PDF: https://aclanthology.org/2020.findings-emnlp.381.pdf
Code: jack-and-rozz/vocabulary_adaptation
Data: ASPEC, JESC