Parallel Corpus Filtering via Pre-trained Language Models

Boliang Zhang, Ajay Nagesh, Kevin Knight


Abstract
Web-crawled data provides a good source of parallel corpora for training machine translation models. It is automatically obtained, but extremely noisy, and recent work shows that neural machine translation systems are more sensitive to noise than traditional statistical machine translation methods. In this paper, we propose a novel approach to filter out noisy sentence pairs from web-crawled corpora via pre-trained language models. We measure sentence parallelism by leveraging the multilingual capability of BERT and use the Generative Pre-training (GPT) language model as a domain filter to balance data domains. We evaluate the proposed method on the WMT 2018 Parallel Corpus Filtering shared task, and on our own web-crawled Japanese-Chinese parallel corpus, which we make publicly available. Our method significantly outperforms baselines and achieves a new state-of-the-art. In an unsupervised setting, our method achieves comparable performance to the top-1 supervised method.
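The filtering pipeline the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the `embed` function stands in for a multilingual BERT encoder that scores sentence parallelism, the `perplexity` function stands in for the GPT domain language model, and the thresholds are hypothetical placeholders.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def filter_pairs(pairs, embed, perplexity,
                 sim_threshold=0.8, ppl_threshold=500.0):
    """Keep (src, tgt) pairs that look parallel and in-domain.

    embed(sentence) -> vector       # stand-in for a multilingual encoder
    perplexity(sentence) -> float   # stand-in for a GPT domain LM
    Both callables and both thresholds are illustrative assumptions.
    """
    kept = []
    for src, tgt in pairs:
        # Parallelism filter: cross-lingual embedding similarity.
        if cosine(embed(src), embed(tgt)) < sim_threshold:
            continue  # likely not translations of each other
        # Domain filter: high perplexity suggests out-of-domain or noisy text.
        if perplexity(tgt) > ppl_threshold:
            continue
        kept.append((src, tgt))
    return kept
```

In the real system, `embed` and `perplexity` would be backed by pre-trained models; the sketch only shows how the two scores combine into a filtering decision.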
Anthology ID:
2020.acl-main.756
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8545–8554
URL:
https://aclanthology.org/2020.acl-main.756
DOI:
10.18653/v1/2020.acl-main.756
Cite (ACL):
Boliang Zhang, Ajay Nagesh, and Kevin Knight. 2020. Parallel Corpus Filtering via Pre-trained Language Models. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 8545–8554, Online. Association for Computational Linguistics.
Cite (Informal):
Parallel Corpus Filtering via Pre-trained Language Models (Zhang et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.756.pdf
Video:
http://slideslive.com/38928720