Quality Estimation for Language Output Applications

Carolina Scarton, Gustavo Paetzold, Lucia Specia


Abstract
Quality Estimation (QE) of language output applications is a research area that has been attracting significant attention. The goal of QE is to estimate the quality of language output applications without the need for human references. Instead, machine learning algorithms are used to build supervised models from a set of labelled training instances. Such models are able to generalise over unseen data, making QE a robust method applicable to scenarios where human input is not available or feasible. One such scenario where QE is particularly appealing is Machine Translation, where a predicted quality score can help decide whether a translation is useful (e.g. for post-editing) or reliable (e.g. for gisting). Other potential applications within Natural Language Processing (NLP) include Text Summarisation and Text Simplification. In this tutorial we present the task of QE and its application in NLP, focusing on Machine Translation. We also introduce QuEst++, a toolkit for QE that encompasses feature extraction and machine learning, and propose a practical activity to extend this toolkit in various ways.
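To make the supervised setup described above concrete, the following is a minimal sketch of sentence-level QE as regression, assuming a generic scikit-learn pipeline. It is not the QuEst++ API; the feature set, the toy training pairs, and the quality labels are hypothetical stand-ins for the labelled data and feature extraction the abstract refers to.

```python
# Sketch of sentence-level QE as supervised regression (not the QuEst++ API).
# Features and data below are illustrative assumptions, not real QE resources.
import numpy as np
from sklearn.svm import SVR


def extract_features(source: str, translation: str) -> list:
    """Toy, baseline-style features: source length, target length,
    length ratio, and average target token length."""
    src_tokens = source.split()
    tgt_tokens = translation.split()
    return [
        len(src_tokens),
        len(tgt_tokens),
        len(tgt_tokens) / max(len(src_tokens), 1),
        sum(len(t) for t in tgt_tokens) / max(len(tgt_tokens), 1),
    ]


# Toy labelled instances: (source, MT output, quality label, e.g. an edit-based score).
train = [
    ("the cat sat on the mat", "le chat était assis sur le tapis", 0.10),
    ("he closed the door quietly", "il a fermé la porte", 0.35),
    ("quality estimation needs no references", "estimation qualité pas références", 0.70),
]

X_train = np.array([extract_features(src, tgt) for src, tgt, _ in train])
y_train = np.array([label for _, _, label in train])

# Train a supervised QE model; SVR is one common choice for QE regression.
model = SVR(kernel="rbf")
model.fit(X_train, y_train)

# Predict the quality of an unseen source/translation pair: no human reference needed.
features = extract_features("she opened the window", "elle a ouvert la fenêtre")
print(model.predict(np.array([features])))
```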
Anthology ID: C16-3004
Volume: Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Tutorial Abstracts
Month: December
Year: 2016
Address: Osaka, Japan
Editors: Marcello Federico, Akiko Aizawa
Venue: COLING
Publisher: The COLING 2016 Organizing Committee
Pages: 14–17
URL: https://aclanthology.org/C16-3004
Cite (ACL): Carolina Scarton, Gustavo Paetzold, and Lucia Specia. 2016. Quality Estimation for Language Output Applications. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Tutorial Abstracts, pages 14–17, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal): Quality Estimation for Language Output Applications (Scarton et al., COLING 2016)
PDF: https://aclanthology.org/C16-3004.pdf
Data: WMT 2016