Can Syntax Help? Improving an LSTM-based Sentence Compression Model for New Domains

Liangguo Wang, Jing Jiang, Hai Leong Chieu, Chen Hui Ong, Dandan Song, Lejian Liao


Abstract
In this paper, we study how to improve the domain adaptability of a deletion-based Long Short-Term Memory (LSTM) neural network model for sentence compression. We hypothesize that syntactic information helps make such models more robust across domains. We propose two major changes to the model: using explicit syntactic features and introducing syntactic constraints through Integer Linear Programming (ILP). Our evaluation shows that the proposed model outperforms both the original model and a traditional non-neural-network-based model in a cross-domain setting.
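The second idea in the abstract, enforcing syntactic constraints on the deletion decisions via ILP, can be illustrated with a small sketch. The snippet below is not the authors' code: the sentence, dependency heads, per-token keep-scores (stand-ins for the LSTM's output probabilities), and the length budget are all made-up placeholders, and the single constraint shown (a token may be kept only if its dependency head is kept) is just one plausible example of a syntactic constraint.

```python
# Minimal sketch of deletion-based compression as a 0/1 ILP, using PuLP.
# All data below is illustrative, not from the paper.
import pulp

tokens = ["Alan", "Turing", "was", "born", "in", "London", "in", "1912"]
# heads[i] = index of token i's dependency head (-1 marks the root);
# an illustrative parse, not output of a real parser.
heads = [1, 3, 3, -1, 5, 3, 7, 3]
# Placeholder keep-scores in [0, 1], standing in for LSTM keep probabilities.
scores = [0.6, 0.9, 0.7, 0.95, 0.75, 0.8, 0.3, 0.5]

prob = pulp.LpProblem("sentence_compression", pulp.LpMaximize)
keep = [pulp.LpVariable(f"keep_{i}", cat="Binary") for i in range(len(tokens))]

# Objective: total score of the tokens that are kept.
prob += pulp.lpSum(scores[i] * keep[i] for i in range(len(tokens)))

# Syntactic constraint: keeping a dependent implies keeping its head.
for i, h in enumerate(heads):
    if h >= 0:
        prob += keep[i] <= keep[h]

# A length budget so the solver actually compresses the sentence.
prob += pulp.lpSum(keep) <= 5

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(" ".join(t for t, k in zip(tokens, keep) if k.value() == 1))
```

With these placeholder scores the solver keeps "Turing was born in London": the low-scoring tokens are dropped, and every kept token's dependency head is also kept, which is the kind of structural guarantee the hard ILP constraints provide regardless of domain.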
Anthology ID:
P17-1127
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1385–1393
URL:
https://aclanthology.org/P17-1127
DOI:
10.18653/v1/P17-1127
Cite (ACL):
Liangguo Wang, Jing Jiang, Hai Leong Chieu, Chen Hui Ong, Dandan Song, and Lejian Liao. 2017. Can Syntax Help? Improving an LSTM-based Sentence Compression Model for New Domains. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1385–1393, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Can Syntax Help? Improving an LSTM-based Sentence Compression Model for New Domains (Wang et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1127.pdf
Data
Google Sentence Compression