Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders

Yu Duan, Canwen Xu, Jiaxin Pei, Jialong Han, Chenliang Li


Abstract
Conditional text generation has drawn much attention as a topic in Natural Language Generation (NLG), as it enables humans to control the properties of generated content. Current conditional generation models cannot handle emerging conditions due to their joint end-to-end learning fashion. When a new condition is added, these techniques require full retraining. In this paper, we present a new framework named Pre-train and Plug-in Variational Auto-Encoder (PPVAE) for flexible conditional text generation. PPVAE decouples the text generation module from the condition representation module to allow “one-to-many” conditional generation. When a fresh condition emerges, only a lightweight network needs to be trained; it then works as a plug-in for PPVAE, which is efficient and desirable for real-world applications. Extensive experiments demonstrate the superiority of PPVAE over existing alternatives, with better conditionality and diversity but less training effort.
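The decoupling the abstract describes can be illustrated with a minimal PyTorch sketch (this is not the authors' released implementation; the class name, layer sizes, and training flow below are illustrative assumptions): a global text VAE is pre-trained once and frozen, and each new condition trains only a small plug-in VAE that maps between the frozen global latent space and a condition-specific latent space.

import torch
import torch.nn as nn

class PluginVAE(nn.Module):
    # Lightweight per-condition network: encodes a frozen global latent
    # vector into a small conditional latent space and decodes it back,
    # so the pre-trained text VAE itself never needs retraining.
    def __init__(self, global_dim=128, cond_dim=20, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(global_dim, hidden), nn.ReLU())
        self.to_mu = nn.Linear(hidden, cond_dim)
        self.to_logvar = nn.Linear(hidden, cond_dim)
        self.decoder = nn.Sequential(
            nn.Linear(cond_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, global_dim),
        )

    def forward(self, z_global):
        h = self.encoder(z_global)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample the conditional latent code.
        z_cond = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z_cond), mu, logvar

# Hypothetical training flow: encode condition-labeled sentences with the
# frozen pre-trained VAE encoder, then fit only the plug-in on those
# latent vectors (reconstruction + KL losses).
# Generation: sample z_cond ~ N(0, I), map it to the global latent space
# with the plug-in decoder, then decode text with the frozen global VAE.

The split sketched here is the point of the framework: the expensive text VAE is trained once on unlabeled data, while each emerging condition costs only this small latent-to-latent network.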
Anthology ID:
2020.acl-main.23
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
253–262
URL:
https://aclanthology.org/2020.acl-main.23
DOI:
10.18653/v1/2020.acl-main.23
Cite (ACL):
Yu Duan, Canwen Xu, Jiaxin Pei, Jialong Han, and Chenliang Li. 2020. Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 253–262, Online. Association for Computational Linguistics.
Cite (Informal):
Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders (Duan et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.23.pdf
Video:
http://slideslive.com/38928857
Code:
WHUIR/PPVAE