How to Avoid Sentences Spelling Boring? Towards a Neural Approach to Unsupervised Metaphor Generation

Zhiwei Yu, Xiaojun Wan


Abstract
Metaphor generation attempts to replicate human creativity with language, which makes it an attractive but challenging text generation task. Previous efforts mainly focus on template-based or rule-based methods, which lack linguistic subtlety. In order to create novel metaphors, we propose a neural approach to metaphor generation and explore the shared inferential structure between a metaphorical usage and a literal usage of a verb. Our approach does not require any manually annotated metaphors for training. We extract metaphorically used verbs together with their metaphorical senses in an unsupervised way and train a neural language model on a Wikipedia corpus. We then generate metaphors conveying the assigned metaphorical senses with an improved decoding algorithm. Automatic metrics and human evaluations demonstrate that our approach can generate metaphors with good readability and creativity.
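The abstract describes a pipeline that forces an extracted metaphorical verb into a sentence produced by a language model. The sketch below is a minimal, hypothetical illustration of that decoding idea as lexically constrained beam search; the toy bigram scorer, the vocabulary, and names such as constrained_beam_search are illustrative assumptions, not the paper's implementation, which trains a neural language model on Wikipedia and uses its own improved decoding algorithm.

```python
# Hypothetical sketch: beam search constrained to contain a target verb
# used in a metaphorical sense (e.g., "devoured" for "read eagerly").
# The bigram table below is a toy stand-in for a trained neural LM.
BIGRAM_LOGP = {
    ("<s>", "she"): -0.5, ("she", "devoured"): -1.2, ("she", "read"): -0.8,
    ("devoured", "the"): -0.4, ("read", "the"): -0.4,
    ("the", "novel"): -0.6, ("novel", "</s>"): -0.3,
}

def score(prev: str, word: str) -> float:
    """Bigram log-probability, with a flat penalty for unseen pairs."""
    return BIGRAM_LOGP.get((prev, word), -5.0)

def constrained_beam_search(vocab, target_verb, max_len=4, beam=4):
    """Beam search that only accepts hypotheses containing target_verb."""
    beams = [(["<s>"], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, lp in beams:
            for w in vocab:
                candidates.append((seq + [w], lp + score(seq[-1], w)))
        # Keep the highest-scoring partial hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam]
    # Close each hypothesis and filter by the lexical constraint.
    finished = [(seq, lp + score(seq[-1], "</s>"))
                for seq, lp in beams if target_verb in seq]
    return max(finished, key=lambda c: c[1], default=None)

vocab = ["she", "devoured", "read", "the", "novel"]
best = constrained_beam_search(vocab, target_verb="devoured")
if best:
    seq, logp = best
    print(" ".join(seq[1:]), f"(log-prob {logp:.1f})")
```

Running this prints "she devoured the novel": the literal alternative "she read the novel" scores higher under the toy model but is discarded because it violates the constraint, which is the essential behavior of decoding toward an assigned metaphorical verb.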
Anthology ID:
N19-1092
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
861–871
URL:
https://aclanthology.org/N19-1092
DOI:
10.18653/v1/N19-1092
Cite (ACL):
Zhiwei Yu and Xiaojun Wan. 2019. How to Avoid Sentences Spelling Boring? Towards a Neural Approach to Unsupervised Metaphor Generation. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 861–871, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
How to Avoid Sentences Spelling Boring? Towards a Neural Approach to Unsupervised Metaphor Generation (Yu & Wan, NAACL 2019)
PDF:
https://aclanthology.org/N19-1092.pdf