INSET: Sentence Infilling with INter-SEntential Transformer

Yichen Huang, Yizhe Zhang, Oussama Elachqar, Yu Cheng


Abstract
Missing sentence generation (or sentence infilling) enables a wide range of applications in natural language generation, such as document auto-completion and meeting note expansion. The task asks a model to generate intermediate missing sentences that bridge the surrounding context both syntactically and semantically. Solving it requires natural language processing techniques spanning understanding, discourse-level planning, and generation. In this paper, we propose a framework that decouples the challenge into these three aspects and addresses each in turn, leveraging the power of existing large-scale pre-trained models such as BERT and GPT-2. We empirically demonstrate that our model learns a sentence representation suitable for generation and produces missing sentences that fit the context.
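The decoupled pipeline sketched in the abstract (encode each context sentence into a vector, plan over those vectors at the sentence level, then decode the predicted vector back into text) can be illustrated with a minimal sketch. The model names, layer sizes, placeholder vector, and variable names below are illustrative assumptions rather than the authors' released configuration; in the full model, a GPT-2-based decoder would additionally be trained to map the predicted sentence embedding back to a token sequence.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

# Step 1: a BERT-based sentence encoder (pre-trained checkpoint assumed).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")

def embed(sentence: str) -> torch.Tensor:
    """Encode one sentence into a single vector (here: the [CLS] hidden state)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, 768)
    return hidden[:, 0]  # (1, 768)

# Step 2: an inter-sentential transformer operating on sentence embeddings,
# not tokens. Depth and head count are arbitrary choices for this sketch.
sent_transformer = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True),
    num_layers=3,
)

context = [
    "She stepped off the train into the rain.",
    "[MISSING]",  # position of the sentence to infill
    "By the time she reached the hotel, she was soaked.",
]
mask_vec = torch.zeros(1, 768)  # a learned mask embedding in practice; zeros for brevity
vecs = [mask_vec if s == "[MISSING]" else embed(s) for s in context]
seq = torch.stack(vecs, dim=1)            # (1, 3, 768) sentence-level sequence
predicted = sent_transformer(seq)[:, 1]   # (1, 768) embedding predicted at the gap

# Step 3 (not shown): condition a GPT-2-based decoder on `predicted`
# to generate the missing sentence token by token.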
Anthology ID:
2020.acl-main.226
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2502–2515
URL:
https://aclanthology.org/2020.acl-main.226
DOI:
10.18653/v1/2020.acl-main.226
Cite (ACL):
Yichen Huang, Yizhe Zhang, Oussama Elachqar, and Yu Cheng. 2020. INSET: Sentence Infilling with INter-SEntential Transformer. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 2502–2515, Online. Association for Computational Linguistics.
Cite (Informal):
INSET: Sentence Infilling with INter-SEntential Transformer (Huang et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.226.pdf
Video:
http://slideslive.com/38929392
Code:
dreasysnail/INSET