Document-level re-ranking with soft lexical and semantic features for statistical machine translation

Chenchen Ding, Masao Utiyama, Eiichiro Sumita


Abstract
We introduce two document-level features to polish baseline sentence-level translations generated by a state-of-the-art statistical machine translation (SMT) system. One feature uses the word-embedding technique to model the relation between a sentence and its context on the target side; the other feature is a crisp document-level token-type ratio of target-side translations for source-side words, which models lexical consistency in translation. The weights of the introduced features are tuned to optimize the sentence- and document-level metrics simultaneously on the basis of Pareto optimality. Experimental results under two different schemes with different corpora show that the proposed approach efficiently and stably integrates document-level information into a sentence-level SMT system. The best improvements were approximately 0.5 BLEU on test sets, with statistical significance.
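Since only the abstract is reproduced here, the following is a minimal illustrative sketch of how the two document-level re-ranking features could be computed. The function names, the averaging of context embeddings, and the aggregation over source words are assumptions for illustration, not the paper's exact definitions.

import math
from collections import defaultdict


def lexical_consistency(doc_alignments):
    """Document-level token-type ratio over aligned source words (sketch).

    doc_alignments: list of (source_word, target_word) pairs gathered from
    the word alignments of all candidate sentence translations in one
    document.  For each source word, the ratio of distinct target
    translations (types) to aligned occurrences (tokens) is computed; lower
    values indicate more consistent lexical choices across the document.
    The score returned is the average ratio over source words, which may
    differ from the paper's exact aggregation.
    """
    translations = defaultdict(list)
    for src, tgt in doc_alignments:
        translations[src].append(tgt)
    ratios = [len(set(tgts)) / len(tgts) for tgts in translations.values()]
    return sum(ratios) / len(ratios) if ratios else 0.0


def semantic_coherence(sentence_vec, context_vecs):
    """Cosine similarity between a candidate translation's embedding and the
    averaged embedding of its target-side context sentences (sketch)."""
    context = [sum(dims) / len(context_vecs) for dims in zip(*context_vecs)]
    dot = sum(a * b for a, b in zip(sentence_vec, context))
    norm = (math.sqrt(sum(a * a for a in sentence_vec))
            * math.sqrt(sum(b * b for b in context)))
    return dot / norm if norm else 0.0

In a re-ranking setting, these two scores would be added as extra features to each n-best candidate and combined with the baseline model score using weights tuned for Pareto optimality over sentence- and document-level metrics, as described in the abstract.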
Anthology ID:
2014.amta-researchers.9
Volume:
Proceedings of the 11th Conference of the Association for Machine Translation in the Americas: MT Researchers Track
Month:
October 22-26
Year:
2014
Address:
Vancouver, Canada
Editors:
Yaser Al-Onaizan, Michel Simard
Venue:
AMTA
Publisher:
Association for Machine Translation in the Americas
Pages:
110–123
URL:
https://aclanthology.org/2014.amta-researchers.9
Cite (ACL):
Chenchen Ding, Masao Utiyama, and Eiichiro Sumita. 2014. Document-level re-ranking with soft lexical and semantic features for statistical machine translation. In Proceedings of the 11th Conference of the Association for Machine Translation in the Americas: MT Researchers Track, pages 110–123, Vancouver, Canada. Association for Machine Translation in the Americas.
Cite (Informal):
Document-level re-ranking with soft lexical and semantic features for statistical machine translation (Ding et al., AMTA 2014)
PDF:
https://aclanthology.org/2014.amta-researchers.9.pdf