Autoregressive Text Generation Beyond Feedback Loops

Florian Schmidt, Stephan Mandt, Thomas Hofmann


Abstract
Autoregressive state transitions, where predictions are conditioned on past predictions, are the predominant choice for both deterministic and stochastic sequential models. However, autoregressive feedback exposes the evolution of the hidden state trajectory to potential biases from well-known train-test discrepancies. In this paper, we combine a latent state space model with a CRF observation model. We argue that such autoregressive observation models form an interesting middle ground that expresses local correlations on the word level but keeps the state evolution non-autoregressive. On unconditional sentence generation, we show performance improvements over RNN and GAN baselines while avoiding some prototypical failure modes of autoregressive models.
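
To make the architectural idea concrete, the following is a minimal sketch (not the authors' exact model) of the two ingredients named in the abstract: a state trajectory driven purely by latent noise, with no feedback from previously generated words, combined with a linear-chain CRF observation model whose pairwise potentials capture local word-level correlations. All module names, dimensions, and the choice of a GRU transition are illustrative assumptions.

```python
# Sketch only: non-autoregressive latent state evolution + CRF observation model.
import torch
import torch.nn as nn

class NonAutoregressiveCRFGenerator(nn.Module):
    def __init__(self, vocab_size=1000, latent_dim=32, hidden_dim=64):
        super().__init__()
        # State transition driven by latent noise only, never by past words.
        self.transition_rnn = nn.GRU(latent_dim, hidden_dim, batch_first=True)
        # Per-position emission potentials over the vocabulary.
        self.emission = nn.Linear(hidden_dim, vocab_size)
        # CRF pairwise potentials between adjacent words (shared across positions).
        self.pairwise = nn.Parameter(torch.zeros(vocab_size, vocab_size))

    def log_partition(self, emissions):
        # Forward algorithm for a linear-chain CRF; emissions: (T, V).
        alpha = emissions[0]                               # (V,)
        for t in range(1, emissions.size(0)):
            # Sum out the previous word, adding pairwise and emission scores.
            alpha = torch.logsumexp(alpha.unsqueeze(1) + self.pairwise, dim=0) + emissions[t]
        return torch.logsumexp(alpha, dim=0)

    def sequence_log_prob(self, noise, words):
        # noise: (1, T, latent_dim) latent trajectory; words: (T,) token ids.
        states, _ = self.transition_rnn(noise)             # (1, T, hidden_dim)
        emissions = self.emission(states.squeeze(0))       # (T, V)
        T = words.size(0)
        score = emissions[torch.arange(T), words].sum()
        score = score + self.pairwise[words[:-1], words[1:]].sum()
        return score - self.log_partition(emissions)       # log p(words | noise)

model = NonAutoregressiveCRFGenerator()
z = torch.randn(1, 5, 32)                                  # latent trajectory of length 5
w = torch.randint(0, 1000, (5,))                           # a candidate sentence
print(model.sequence_log_prob(z, w))
```

The key property the sketch illustrates is that `transition_rnn` never receives generated words as input, so the state trajectory cannot be biased by its own sampling errors; word-level dependencies enter only through the CRF pairwise potentials.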
Anthology ID:
D19-1338
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3400–3406
URL:
https://aclanthology.org/D19-1338
DOI:
10.18653/v1/D19-1338
Cite (ACL):
Florian Schmidt, Stephan Mandt, and Thomas Hofmann. 2019. Autoregressive Text Generation Beyond Feedback Loops. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 3400–3406, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Autoregressive Text Generation Beyond Feedback Loops (Schmidt et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1338.pdf
Attachment:
 D19-1338.Attachment.pdf
Code:
schmiflo/crf-generation