Neural Text Style Transfer via Denoising and Reranking

Joseph Lee, Ziang Xie, Cindy Wang, Max Drach, Dan Jurafsky, Andrew Ng


Abstract
We introduce a simple method for text style transfer that frames style transfer as denoising: we synthesize a noisy corpus and treat the source style as a noisy version of the target style. To control for aspects such as preserving meaning while modifying style, we propose a reranking approach in the data synthesis phase. We evaluate our method on three novel style transfer tasks: transferring between British and American varieties of English, between text genres (formal vs. casual), and between lyrics from different musical genres. By measuring style transfer quality, meaning preservation, and the fluency of generated outputs, we demonstrate that our method produces high-quality output while maintaining the flexibility to suggest syntactically rich stylistic edits.
Anthology ID:
W19-2309
Volume:
Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Antoine Bosselut, Asli Celikyilmaz, Marjan Ghazvininejad, Srinivasan Iyer, Urvashi Khandelwal, Hannah Rashkin, Thomas Wolf
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
74–81
URL:
https://aclanthology.org/W19-2309
DOI:
10.18653/v1/W19-2309
Cite (ACL):
Joseph Lee, Ziang Xie, Cindy Wang, Max Drach, Dan Jurafsky, and Andrew Ng. 2019. Neural Text Style Transfer via Denoising and Reranking. In Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation, pages 74–81, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Neural Text Style Transfer via Denoising and Reranking (Lee et al., NAACL 2019)
PDF:
https://aclanthology.org/W19-2309.pdf