Zero-Shot Crosslingual Sentence Simplification

Jonathan Mallinson, Rico Sennrich, Mirella Lapata


Abstract
Sentence simplification aims to make sentences easier to read and understand. Recent approaches have shown promising results with encoder-decoder models trained on large amounts of parallel data, which is often available only in English. We propose a zero-shot modeling framework which transfers simplification knowledge from English to another language (for which no parallel simplification corpus exists) while generalizing across languages and tasks. A shared transformer encoder constructs language-agnostic representations, with a combination of task-specific encoder layers added on top (e.g., for translation and simplification). Empirical results using both human and automatic metrics show that our approach produces better simplifications than unsupervised and pivot-based methods.
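The abstract's architecture (a shared encoder producing language-agnostic representations, topped by task-specific encoder stacks) can be illustrated with a minimal PyTorch sketch. All hyperparameters, layer counts, and the class/task names below are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class SharedTaskEncoder(nn.Module):
    """Minimal sketch (hypothetical): a shared Transformer encoder whose
    output feeds small task-specific encoder stacks, e.g. one for
    translation and one for simplification."""

    def __init__(self, vocab_size=32000, d_model=512, nhead=8,
                 shared_layers=4, task_layers=2,
                 tasks=("translate", "simplify")):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        shared_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        # Shared encoder: intended to build language-agnostic representations.
        self.shared = nn.TransformerEncoder(shared_layer, num_layers=shared_layers)
        # One small task-specific encoder stack per task, added on top.
        self.task_stacks = nn.ModuleDict({
            t: nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
                num_layers=task_layers)
            for t in tasks
        })

    def forward(self, token_ids, task):
        x = self.embed(token_ids)           # (batch, seq, d_model)
        x = self.shared(x)                  # shared, language-agnostic encoding
        return self.task_stacks[task](x)    # task-specific refinement


if __name__ == "__main__":
    # Usage: the same shared encoder serves both tasks; only the top layers differ.
    model = SharedTaskEncoder()
    ids = torch.randint(0, 32000, (2, 10))
    print(model(ids, "translate").shape)    # torch.Size([2, 10, 512])
    print(model(ids, "simplify").shape)
```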
Anthology ID:
2020.emnlp-main.415
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5109–5126
URL:
https://aclanthology.org/2020.emnlp-main.415
DOI:
10.18653/v1/2020.emnlp-main.415
Cite (ACL):
Jonathan Mallinson, Rico Sennrich, and Mirella Lapata. 2020. Zero-Shot Crosslingual Sentence Simplification. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 5109–5126, Online. Association for Computational Linguistics.
Cite (Informal):
Zero-Shot Crosslingual Sentence Simplification (Mallinson et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.415.pdf
Video:
https://slideslive.com/38938888