Sentence Simplification with Memory-Augmented Neural Networks

Tu Vu, Baotian Hu, Tsendsuren Munkhdalai, Hong Yu


Abstract
Sentence simplification aims to simplify the content and structure of complex sentences, thereby making them easier for human readers to interpret and easier for downstream NLP applications to process. Recent advances in neural machine translation have paved the way for novel approaches to the task. In this paper, we adapt an architecture with augmented memory capacities called Neural Semantic Encoders (Munkhdalai and Yu, 2017) for sentence simplification. Our experiments demonstrate the effectiveness of our approach on different simplification datasets, both in terms of automatic evaluation measures and human judgments.
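For orientation, below is a minimal sketch of the read-compose-write memory update that characterizes a Neural Semantic Encoder, assuming simplified linear projections in place of the read/compose/write LSTMs of the original architecture; the names (nse_step, W_read, W_compose, W_write) are illustrative and not taken from the authors' implementation.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nse_step(x_t, memory, W_read, W_compose, W_write):
    """One read-compose-write step of a memory-augmented encoder (simplified).

    x_t:     (d,)   current token representation
    memory:  (T, d) one memory slot per input token
    W_*:     linear maps standing in for the read/compose/write LSTMs.
    """
    # Read: attend over memory slots using the current token as query.
    key = W_read @ x_t                     # (d,)
    z = softmax(memory @ key)              # (T,) attention over slots
    m_t = z @ memory                       # (d,) retrieved memory vector

    # Compose: combine the token with the retrieved memory content.
    c_t = np.tanh(W_compose @ np.concatenate([x_t, m_t]))

    # Write: blend the composed state back into the attended slots.
    h_t = np.tanh(W_write @ c_t)
    memory = memory * (1.0 - z[:, None]) + z[:, None] * h_t
    return h_t, memory

# Toy usage: encode a 5-token "sentence" of 8-dimensional embeddings.
rng = np.random.default_rng(0)
d, T = 8, 5
tokens = rng.normal(size=(T, d))
memory = tokens.copy()                     # memory initialized with the input
W_read = rng.normal(size=(d, d))
W_compose = rng.normal(size=(d, 2 * d))
W_write = rng.normal(size=(d, d))
for t in range(T):
    h_t, memory = nse_step(tokens[t], memory, W_read, W_compose, W_write)

This is a sketch of the general mechanism only; the paper's model embeds such an encoder in a sequence-to-sequence simplification system.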
Anthology ID:
N18-2013
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
79–85
URL:
https://aclanthology.org/N18-2013
DOI:
10.18653/v1/N18-2013
Cite (ACL):
Tu Vu, Baotian Hu, Tsendsuren Munkhdalai, and Hong Yu. 2018. Sentence Simplification with Memory-Augmented Neural Networks. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 79–85, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Sentence Simplification with Memory-Augmented Neural Networks (Vu et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-2013.pdf
Data
Newsela, TurkCorpus, WikiLarge