Further Investigation into Reference Bias in Monolingual Evaluation of Machine Translation

Qingsong Ma, Yvette Graham, Timothy Baldwin, Qun Liu


Abstract
Monolingual evaluation of Machine Translation (MT) aims to simplify human assessment by requiring assessors to compare the meaning of the MT output with a reference translation, opening up the task to a much larger pool of genuinely qualified evaluators. Monolingual evaluation runs the risk, however, of bias in favour of MT systems that happen to produce translations superficially similar to the reference, and, consistent with this intuition, previous investigations have concluded that monolingual assessment is strongly biased in this respect. On re-examination of past analyses, however, we identify a series of potential analytical errors that raise important questions about the reliability of those conclusions. We subsequently carry out further investigation into reference bias via direct human assessment of MT adequacy, using quality-controlled crowd-sourcing. Contrary to both intuition and past conclusions, results show no significant evidence of reference bias in monolingual evaluation of MT.
Anthology ID:
D17-1262
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2476–2485
URL:
https://aclanthology.org/D17-1262
DOI:
10.18653/v1/D17-1262
Cite (ACL):
Qingsong Ma, Yvette Graham, Timothy Baldwin, and Qun Liu. 2017. Further Investigation into Reference Bias in Monolingual Evaluation of Machine Translation. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2476–2485, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Further Investigation into Reference Bias in Monolingual Evaluation of Machine Translation (Ma et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1262.pdf
Code:
qingsongma/percentage-refBias
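The released analysis code is in the repository above. As a rough orientation only, the sketch below shows one simple way a reference-bias check of the kind described in the abstract could look in principle: if monolingual assessment were biased toward the reference, adequacy scores collected monolingually should correlate with surface similarity to the reference more strongly than bilingually collected scores do. Everything in the sketch (the bigram-overlap similarity, the Pearson correlation, and the toy scores) is an illustrative assumption, not the paper's actual method or data.

```python
# Illustrative sketch only, not the analysis released in qingsongma/percentage-refBias.
# Assumes per-segment adequacy scores collected under monolingual and bilingual
# conditions, plus the corresponding MT outputs and reference translations.
import numpy as np
from scipy.stats import pearsonr


def ngram_overlap(hyp: str, ref: str, n: int = 2) -> float:
    """Crude surface similarity: share of hypothesis n-grams also found in the reference."""
    hyp_toks, ref_toks = hyp.split(), ref.split()
    hyp_ngrams = [tuple(hyp_toks[i:i + n]) for i in range(len(hyp_toks) - n + 1)]
    ref_ngrams = {tuple(ref_toks[i:i + n]) for i in range(len(ref_toks) - n + 1)}
    if not hyp_ngrams:
        return 0.0
    return sum(g in ref_ngrams for g in hyp_ngrams) / len(hyp_ngrams)


def reference_bias_check(hyps, refs, mono_scores, bi_scores):
    """If monolingual assessment were reference-biased, its adequacy scores should
    track surface similarity to the reference more closely than bilingual scores do."""
    sim = np.array([ngram_overlap(h, r) for h, r in zip(hyps, refs)])
    r_mono, p_mono = pearsonr(sim, mono_scores)
    r_bi, p_bi = pearsonr(sim, bi_scores)
    return {"r_mono": r_mono, "p_mono": p_mono, "r_bi": r_bi, "p_bi": p_bi}


if __name__ == "__main__":
    # Toy data for demonstration only; real studies use thousands of
    # quality-controlled crowd-sourced judgements per condition.
    hyps = ["the cat sat on the mat",
            "a feline rested upon the rug",
            "the dog barked at the mailman",
            "a dog was barking at the postman"]
    refs = ["the cat sat on the mat"] * 2 + ["the dog barked at the mailman"] * 2
    mono = np.array([95.0, 72.0, 90.0, 68.0])  # hypothetical monolingual adequacy scores
    bi = np.array([93.0, 88.0, 91.0, 86.0])    # hypothetical bilingual adequacy scores
    print(reference_bias_check(hyps, refs, mono, bi))
```

In this toy setup, a markedly stronger similarity-score correlation in the monolingual condition would be the signature of reference bias; the paper's finding is that, under quality-controlled direct assessment, no significant effect of this kind is observed.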