Improving the Quality Trade-Off for Neural Machine Translation Multi-Domain Adaptation

Eva Hasler, Tobias Domhan, Jonay Trenous, Ke Tran, Bill Byrne, Felix Hieber


Abstract
Building neural machine translation systems to perform well on a specific target domain is a well-studied problem. Optimizing system performance for multiple, diverse target domains, however, remains a challenge. We study this problem in an adaptation setting where the goal is to preserve the existing system quality while incorporating data for domains that were not the focus of the original translation system. We find that we can improve over the performance trade-off offered by Elastic Weight Consolidation with a relatively simple data-mixing strategy. At comparable performance on the new domains, catastrophic forgetting is mitigated significantly on strong WMT baselines. Combining both approaches improves the Pareto frontier on this task.
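The paper itself is linked below rather than reproduced here; as a rough illustration of the two techniques the abstract contrasts, the sketch below shows a diagonal-Fisher Elastic Weight Consolidation penalty and a naive example-level data-mixing sampler in PyTorch. Everything in it (the function names, the per-example mixing scheme, and the hyperparameters `lam` and `new_ratio`) is an assumption for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's code): an EWC penalty with a
# precomputed diagonal Fisher estimate, and a simple data-mixing sampler.
import random

import torch


def ewc_penalty(model, fisher, anchor_params, lam):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    `anchor_params` holds the original (pre-adaptation) weights theta* and
    `fisher` a per-parameter diagonal Fisher estimate; the penalty pulls the
    adapted weights back toward the original system, limiting catastrophic
    forgetting on the domains it already handled well.
    """
    penalty = 0.0
    for name, p in model.named_parameters():
        penalty = penalty + (fisher[name] * (p - anchor_params[name]) ** 2).sum()
    return 0.5 * lam * penalty


def mixed_batch(generic_corpus, new_domain_corpus, new_ratio, batch_size):
    """Data mixing: draw each example from the new-domain corpus with
    probability `new_ratio`, otherwise from the generic corpus, so that
    adaptation batches keep rehearsing the original training distribution.
    """
    return [
        random.choice(new_domain_corpus)
        if random.random() < new_ratio
        else random.choice(generic_corpus)
        for _ in range(batch_size)
    ]
```

Under these assumptions, the two approaches combine naturally in an adaptation step: compute the translation loss on a mixed batch and add `ewc_penalty(model, fisher, anchor_params, lam)` before backpropagating.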
Anthology ID: 2021.emnlp-main.666
Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2021
Address: Online and Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8470–8477
URL: https://aclanthology.org/2021.emnlp-main.666
DOI: 10.18653/v1/2021.emnlp-main.666
Cite (ACL): Eva Hasler, Tobias Domhan, Jonay Trenous, Ke Tran, Bill Byrne, and Felix Hieber. 2021. Improving the Quality Trade-Off for Neural Machine Translation Multi-Domain Adaptation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 8470–8477, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Improving the Quality Trade-Off for Neural Machine Translation Multi-Domain Adaptation (Hasler et al., EMNLP 2021)
PDF: https://aclanthology.org/2021.emnlp-main.666.pdf
Video: https://aclanthology.org/2021.emnlp-main.666.mp4