An Operation Network for Abstractive Sentence Compression

Naitong Yu, Jie Zhang, Minlie Huang, Xiaoyan Zhu


Abstract
Sentence compression condenses a sentence while preserving its most important content. Delete-based models are effective at removing undesired words, while generate-based models can reorder or rephrase words, which is closer to how humans compress sentences. In this paper, we propose Operation Network, a neural network approach for abstractive sentence compression that combines the advantages of both delete-based and generate-based sentence compression models. The central idea of Operation Network is to model the sentence compression process as an editing procedure: first, unnecessary words are deleted from the source sentence; then, new words are either generated from a large vocabulary or copied directly from the source sentence. A compressed sentence is obtained by a series of such edit operations (delete, copy, and generate). Experiments show that Operation Network outperforms state-of-the-art baselines.
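The editing procedure described in the abstract can be illustrated with a short sketch. The snippet below shows, under assumed semantics, how a sequence of delete/copy/generate operations could rewrite a source sentence; it is a hypothetical illustration, not the authors' implementation, and all names (apply_operations, the example operation sequence) are invented for this example.

```python
# Minimal, hypothetical sketch of applying edit operations (delete,
# copy, generate) to a source sentence, as described in the abstract.
# This is NOT the paper's implementation; operation semantics here are
# assumptions: DELETE/COPY consume one source token, GENERATE does not.

from typing import List, Optional, Tuple

DELETE, COPY, GENERATE = "delete", "copy", "generate"

def apply_operations(source: List[str],
                     ops: List[Tuple[str, Optional[str]]]) -> List[str]:
    """Apply a sequence of edit operations to the source tokens.

    - DELETE drops the current source token.
    - COPY keeps the current source token verbatim.
    - GENERATE emits a new word (the operation's payload) drawn from
      the vocabulary, without consuming a source token.
    """
    output, i = [], 0
    for op, word in ops:
        if op == DELETE:
            i += 1                    # skip this source token
        elif op == COPY:
            output.append(source[i])  # keep this source token
            i += 1
        elif op == GENERATE:
            output.append(word)       # emit a vocabulary word
        else:
            raise ValueError(f"unknown operation: {op}")
    return output

source = "the quick brown fox jumped over the extremely lazy dog".split()
ops = [(COPY, None), (DELETE, None), (DELETE, None), (COPY, None),
       (GENERATE, "leapt"), (DELETE, None), (COPY, None), (COPY, None),
       (DELETE, None), (DELETE, None), (COPY, None)]
print(" ".join(apply_operations(source, ops)))
# -> "the fox leapt over the dog"
```

In this toy trace, deletion removes modifiers ("quick", "brown", "extremely", "lazy"), copying retains content words, and generation rephrases "jumped" as "leapt", mirroring the delete-then-generate-or-copy pipeline the abstract describes.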
Anthology ID:
C18-1091
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
1065–1076
URL:
https://aclanthology.org/C18-1091
Cite (ACL):
Naitong Yu, Jie Zhang, Minlie Huang, and Xiaoyan Zhu. 2018. An Operation Network for Abstractive Sentence Compression. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1065–1076, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
An Operation Network for Abstractive Sentence Compression (Yu et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1091.pdf
Data
Sentence Compression