Automatic Poetry Generation with Mutual Reinforcement Learning

Xiaoyuan Yi, Maosong Sun, Ruoyu Li, Wenhao Li


Abstract
Poetry is one of the most beautiful forms of human language art. As a crucial step towards computer creativity, automatic poetry generation has drawn researchers' attention for decades. In recent years, some neural models have made remarkable progress on this task. However, they are all based on maximum likelihood estimation, which only learns common patterns of the corpus and results in a loss-evaluation mismatch: human experts evaluate poetry in terms of specific criteria rather than word-level likelihood. To address this problem, we directly model these criteria and use them as explicit rewards to guide gradient updates via reinforcement learning, motivating the model to pursue higher scores. In addition, inspired by writing theories, we propose a novel mutual reinforcement learning schema: we simultaneously train two learners (generators) which learn not only from the teacher (rewarder) but also from each other to further improve performance. We conduct experiments on Chinese poetry generation. Built on a strong base model, our method achieves better results and outperforms the current state-of-the-art method.
Anthology ID:
D18-1353
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3143–3153
URL:
https://aclanthology.org/D18-1353
DOI:
10.18653/v1/D18-1353
Cite (ACL):
Xiaoyuan Yi, Maosong Sun, Ruoyu Li, and Wenhao Li. 2018. Automatic Poetry Generation with Mutual Reinforcement Learning. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3143–3153, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Automatic Poetry Generation with Mutual Reinforcement Learning (Yi et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1353.pdf
Video:
https://aclanthology.org/D18-1353.mp4