Checkpoint Reranking: An Approach to Select Better Hypothesis for Neural Machine Translation Systems

Vinay Pandramish, Dipti Misra Sharma


Abstract
In this paper, we propose a method of re-ranking the outputs of Neural Machine Translation (NMT) systems. After decoding, we treat the outputs of the last few training checkpoints as an N-best list. Having trained an NMT baseline system, we observe that these checkpoint outputs have oracle scores up to 1.01 BLEU points higher than the final iteration of the trained system. We then propose a ranking mechanism that relies solely on the decoder's ability to generate distinct tokens, without using any language model or additional data. With this method, we achieve a translation improvement of up to +0.16 BLEU points over the baseline. We also evaluate our approach when a coverage penalty is applied during training. With a moderate coverage penalty, the oracle scores are up to +0.99 BLEU points higher than the final iteration, and our algorithm gives an improvement of up to +0.17 BLEU points. With an excessive penalty, translation quality decreases relative to the baseline system; still, oracle scores increase by up to +1.30 BLEU points, and the re-ranking algorithm yields an improvement of up to +0.15 BLEU points. The proposed re-ranking method is generic and can be extended to other language pairs.
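To make the ranking idea concrete, the following is a minimal sketch (not the authors' released code) of checkpoint reranking driven only by the decoder's output tokens. The scoring function used here, a distinct-token ratio per hypothesis, and all names in the sketch are illustrative assumptions; the paper defines the exact criterion and ranks full checkpoint outputs rather than single sentences.

from typing import List

def distinct_token_ratio(hypothesis: str) -> float:
    # Assumed scoring criterion: share of distinct tokens in the hypothesis,
    # which penalises repeated tokens in the decoded output.
    tokens = hypothesis.split()
    return len(set(tokens)) / max(len(tokens), 1)

def rerank_checkpoints(checkpoint_outputs: List[str]) -> str:
    # checkpoint_outputs holds one decoded translation per checkpoint
    # (the N-best list drawn from the last few training iterations).
    # Return the output with the highest distinct-token score.
    return max(checkpoint_outputs, key=distinct_token_ratio)

if __name__ == "__main__":
    # Hypothetical outputs of three consecutive checkpoints for one sentence.
    outputs = [
        "the cat sat on the the mat .",  # repeated token
        "the cat sat on the mat .",
        "cat cat sat on mat mat .",      # heavy repetition
    ]
    print(rerank_checkpoints(outputs))   # -> "the cat sat on the mat ."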
Anthology ID:
2020.acl-srw.38
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2020
Address:
Online
Editors:
Shruti Rijhwani, Jiangming Liu, Yizhong Wang, Rotem Dror
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
286–291
URL:
https://aclanthology.org/2020.acl-srw.38
DOI:
10.18653/v1/2020.acl-srw.38
Cite (ACL):
Vinay Pandramish and Dipti Misra Sharma. 2020. Checkpoint Reranking: An Approach to Select Better Hypothesis for Neural Machine Translation Systems. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 286–291, Online. Association for Computational Linguistics.
Cite (Informal):
Checkpoint Reranking: An Approach to Select Better Hypothesis for Neural Machine Translation Systems (Pandramish & Sharma, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-srw.38.pdf
Software:
2020.acl-srw.38.Software.zip
Video:
http://slideslive.com/38928632