Discourse-Aware Unsupervised Summarization for Long Scientific Documents

Yue Dong, Andrei Mircea, Jackie Chi Kit Cheung


Abstract
We propose an unsupervised graph-based ranking model for extractive summarization of long scientific documents. Our method assumes a two-level hierarchical graph representation of the source document, and exploits asymmetrical positional cues to determine sentence importance. Results on the PubMed and arXiv datasets show that our approach outperforms strong unsupervised baselines by wide margins in automatic metrics and human evaluation. In addition, it achieves performance comparable to many state-of-the-art supervised approaches which are trained on hundreds of thousands of examples. These results suggest that patterns in the discourse structure are a strong signal for determining importance in scientific articles.
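The abstract's core idea, ranking sentences by centrality in a graph with direction-dependent edge weights, can be illustrated with a minimal sketch. This is not the authors' HipoRank implementation: the word-overlap similarity, the flat (non-hierarchical) graph, and the `lambda_fwd`/`lambda_bwd` weights are illustrative assumptions standing in for the paper's actual similarity function and two-level document graph.

```python
# Hedged sketch of directed, position-aware sentence ranking.
# Assumptions (not from the paper): Jaccard word overlap as similarity,
# a single flat sentence graph, and hand-picked directional weights.

def word_overlap(a: str, b: str) -> float:
    """Jaccard similarity over lowercase word sets."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / (len(sa | sb) or 1)

def rank_sentences(sentences, lambda_fwd=1.0, lambda_bwd=0.5):
    """Return sentence indices sorted by asymmetric centrality score."""
    n = len(sentences)
    scores = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            sim = word_overlap(sentences[i], sentences[j])
            # Asymmetric positional cue: edges pointing toward earlier
            # sentences count more, encoding the intuition that sentences
            # near section/document boundaries tend to be more important.
            w = lambda_fwd if j < i else lambda_bwd
            scores[j] += w * sim
    return sorted(range(n), key=lambda k: -scores[k])

sentences = [
    "graph ranking model for summarization",
    "the model uses a graph",
    "unrelated filler text here",
]
print(rank_sentences(sentences))  # earlier, well-connected sentence ranks first
```

In the paper's full method this asymmetry is applied at two levels (sentences within sections, and sections within the document), which this flat sketch does not capture.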
Anthology ID:
2021.eacl-main.93
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1089–1102
URL:
https://aclanthology.org/2021.eacl-main.93
DOI:
10.18653/v1/2021.eacl-main.93
Bibkey:
Cite (ACL):
Yue Dong, Andrei Mircea, and Jackie Chi Kit Cheung. 2021. Discourse-Aware Unsupervised Summarization for Long Scientific Documents. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1089–1102, Online. Association for Computational Linguistics.
Cite (Informal):
Discourse-Aware Unsupervised Summarization for Long Scientific Documents (Dong et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.93.pdf
Code
 mirandrom/HipoRank
Data
CNN/Daily Mail
New York Times Annotated Corpus