A Stable and Effective Learning Strategy for Trainable Greedy Decoding

Yun Chen, Victor O.K. Li, Kyunghyun Cho, Samuel Bowman


Abstract
Beam search is a widely used approximate search strategy for neural network decoders, and it generally outperforms simple greedy decoding on tasks like machine translation. However, this improvement comes at substantial computational cost. In this paper, we propose a flexible new method that allows us to reap nearly the full benefits of beam search with nearly no additional computational cost. The method revolves around a small neural network actor that is trained to observe and manipulate the hidden state of a previously-trained decoder. To train this actor network, we introduce the use of a pseudo-parallel corpus built using the output of beam search on a base model, ranked by a target quality metric like BLEU. Our method is inspired by earlier work on this problem, but requires no reinforcement learning, and can be trained reliably on a range of models. Experiments on three parallel corpora and three architectures show that the method yields substantial improvements in translation quality and speed over each base system.
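The abstract compresses two mechanisms: a small actor network that perturbs the frozen decoder's hidden state before each greedy step, and a pseudo-parallel corpus built by keeping, for each source sentence, the beam-search candidate that scores best under a metric like BLEU. Below is a minimal sketch of both ideas, assuming a PyTorch-style decoder with hidden states of size hidden_dim; every name here (Actor, make_pseudo_parallel, and the beam_search / sentence_bleu callables) is a hypothetical stand-in, not the authors' released code.

import torch
import torch.nn as nn

class Actor(nn.Module):
    """Small trainable network that observes the hidden state of a
    frozen, pre-trained decoder and adds a learned correction to it."""
    def __init__(self, hidden_dim: int, actor_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden_dim, actor_dim),
            nn.Tanh(),
            nn.Linear(actor_dim, hidden_dim),
        )

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Manipulate, rather than replace, the decoder state: the base
        # model stays fixed, so only this residual update is trained.
        return h + self.net(h)

def make_pseudo_parallel(beam_search, sentence_bleu, sources, references):
    """Build the actor's training set: pair each source sentence with
    the beam-search candidate that best matches the reference."""
    corpus = []
    for src, ref in zip(sources, references):
        candidates = beam_search(src)  # list of hypothesis translations
        best = max(candidates, key=lambda hyp: sentence_bleu(hyp, ref))
        corpus.append((src, best))
    return corpus

Under these assumptions, the actor would then be trained with ordinary cross-entropy on the (source, best-candidate) pairs while the base decoder stays frozen, which is consistent with the abstract's claim that no reinforcement learning is required.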
Anthology ID:
D18-1035
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
380–390
URL:
https://aclanthology.org/D18-1035
DOI:
10.18653/v1/D18-1035
Bibkey:
chen-etal-2018-stable
Cite (ACL):
Yun Chen, Victor O.K. Li, Kyunghyun Cho, and Samuel Bowman. 2018. A Stable and Effective Learning Strategy for Trainable Greedy Decoding. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 380–390, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Stable and Effective Learning Strategy for Trainable Greedy Decoding (Chen et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1035.pdf