Non-Monotonic Sequential Text Generation

Kiante Brantley, Kyunghyun Cho, Hal Daumé, Sean Welleck


Abstract
Standard sequential generation methods assume a pre-specified generation order, such as text generation methods which generate words from left to right. In this work, we propose a framework for training models of text generation that operate in non-monotonic orders; the model directly learns good orders, without any additional annotation. Our framework operates by generating a word at an arbitrary position, and then recursively generating words to its left and then words to its right, yielding a binary tree. Learning is framed as imitation learning, including a coaching method which moves from imitating an oracle to reinforcing the policy’s own preferences. Experimental results demonstrate that using the proposed method, it is possible to learn policies which generate text without pre-specifying a generation order while achieving competitive performance with conventional left-to-right generation.
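The generation procedure described in the abstract (emit a word at some position, then recursively fill in everything to its left, then everything to its right) amounts to constructing a binary tree whose in-order traversal is the final sentence. Below is a minimal Python sketch of that recursion, assuming a hypothetical `policy` callable that maps the partial context to the next word or an end symbol; all names here are illustrative rather than taken from the paper's code.

```python
# Minimal sketch of non-monotonic, tree-structured generation.
# `policy` is a hypothetical stand-in for a learned model that, given
# the context generated so far, returns the next word or END.

END = "<end>"

class Node:
    def __init__(self, word):
        self.word = word
        self.left = None   # subtree generated to this word's left
        self.right = None  # subtree generated to this word's right

def generate(policy, context):
    """Recursively generate a binary tree of words.

    The policy first emits a word for the current position, then the
    left subtree is generated, then the right subtree. Emitting END
    terminates the current branch.
    """
    word = policy(context)
    if word == END:
        return None
    node = Node(word)
    node.left = generate(policy, context + [(word, "left")])
    node.right = generate(policy, context + [(word, "right")])
    return node

def in_order(node):
    """An in-order traversal of the tree recovers the final sentence."""
    if node is None:
        return []
    return in_order(node.left) + [node.word] + in_order(node.right)

# Toy deterministic "policy" that produces "a b c d e" by first
# emitting the middle word "c", then recursing left and right.
script = iter(["c", "b", "a", END, END, END, "d", END, "e", END, END])
tree = generate(lambda ctx: next(script), [])
print(" ".join(in_order(tree)))  # -> "a b c d e"
```

In the paper itself, the policy is trained with imitation learning against an oracle that is indifferent among many valid orders; the sketch above covers only the decoding-time recursion, not the training procedure.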
Anthology ID: W19-3620
Volume: Proceedings of the 2019 Workshop on Widening NLP
Month: August
Year: 2019
Address: Florence, Italy
Editors: Amittai Axelrod, Diyi Yang, Rossana Cunha, Samira Shaikh, Zeerak Waseem
Venue: WiNLP
Publisher: Association for Computational Linguistics
Pages: 57–59
URL: https://aclanthology.org/W19-3620
Cite (ACL): Kiante Brantley, Kyunghyun Cho, Hal Daumé, and Sean Welleck. 2019. Non-Monotonic Sequential Text Generation. In Proceedings of the 2019 Workshop on Widening NLP, pages 57–59, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): Non-Monotonic Sequential Text Generation (Brantley et al., WiNLP 2019)