The NiuTrans System for the WMT20 Quality Estimation Shared Task

Chi Hu, Hui Liu, Kai Feng, Chen Xu, Nuo Xu, Zefan Zhou, Shiqin Yan, Yingfeng Luo, Chenglong Wang, Xia Meng, Tong Xiao, Jingbo Zhu


Abstract
This paper describes the submissions of the NiuTrans Team to the WMT 2020 Quality Estimation Shared Task. We participated in all tasks and all language pairs, exploring a combination of transfer learning, multi-task learning, and model ensembling. Results across multiple tasks show that deep Transformer machine translation models and multilingual pretraining methods significantly improve translation quality estimation performance. Our system achieved strong results on tasks at multiple levels; for example, our submissions obtained the best results on all tracks of the sentence-level Direct Assessment task.
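The abstract mentions combining multilingual pretraining with multi-task learning for quality estimation. Below is a minimal sketch of that general idea, not the NiuTrans system itself: it assumes XLM-R as the pretrained multilingual encoder, a pooled regression head for sentence-level Direct Assessment scores, and a token-level tagging head for word-level labels; the model names, head designs, and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Sketch of a multilingual-pretraining-based QE model with two task heads.
# Assumptions (not from the paper): XLM-R encoder, mean-pooled regression head
# for sentence-level DA, token classification head for word-level OK/BAD tags.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class MultiTaskQE(nn.Module):
    def __init__(self, encoder_name: str = "xlm-roberta-base"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.sent_head = nn.Linear(hidden, 1)   # sentence-level DA score (regression)
        self.word_head = nn.Linear(hidden, 2)   # word-level OK/BAD tags (classification)

    def forward(self, input_ids, attention_mask):
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Mask-aware mean pooling feeds the sentence-level regression head.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (states * mask).sum(1) / mask.sum(1).clamp(min=1e-6)
        sent_score = self.sent_head(pooled).squeeze(-1)
        word_logits = self.word_head(states)     # per-token logits
        return sent_score, word_logits


# Usage: encode the source sentence and its machine translation as one pair.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = MultiTaskQE()
batch = tokenizer(["Das ist ein Test."], ["This is a test."],
                  return_tensors="pt", padding=True, truncation=True)
score, tags = model(batch["input_ids"], batch["attention_mask"])
print(score.shape, tags.shape)  # torch.Size([1]), torch.Size([1, seq_len, 2])
```

In this sketch, the "model ensemble" mentioned in the abstract would correspond to averaging the sentence-level predictions of several independently trained models of this kind; how the NiuTrans system actually ensembles its models is described in the paper itself.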
Anthology ID:
2020.wmt-1.117
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1018–1023
URL:
https://aclanthology.org/2020.wmt-1.117
Cite (ACL):
Chi Hu, Hui Liu, Kai Feng, Chen Xu, Nuo Xu, Zefan Zhou, Shiqin Yan, Yingfeng Luo, Chenglong Wang, Xia Meng, Tong Xiao, and Jingbo Zhu. 2020. The NiuTrans System for the WMT20 Quality Estimation Shared Task. In Proceedings of the Fifth Conference on Machine Translation, pages 1018–1023, Online. Association for Computational Linguistics.
Cite (Informal):
The NiuTrans System for the WMT20 Quality Estimation Shared Task (Hu et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.117.pdf
Video:
https://slideslive.com/38939601