ARMAN: Pre-training with Semantically Selecting and Reordering of Sentences for Persian Abstractive Summarization

Alireza Salemi, Emad Kebriaei, Ghazal Neisi Minaei, Azadeh Shakery


Abstract
Abstractive text summarization is one of the areas influenced by the emergence of pre-trained language models. Current pre-training approaches for abstractive summarization assign higher scores to summaries that share more words with the main text and pay less attention to the semantic similarity between the generated sentences and the original document. We propose ARMAN, a Transformer-based encoder-decoder model pre-trained with three novel objectives to address this issue. In ARMAN, salient sentences from a document are selected according to a modified semantic score, then masked to form a pseudo summary. To summarize more accurately and more closely follow human writing patterns, we apply a modified sentence-reordering objective. We evaluate our proposed models on six downstream Persian summarization tasks. Experimental results show that our proposed model achieves state-of-the-art performance on all six summarization tasks as measured by ROUGE and BERTScore. Our models also outperform prior work on textual entailment, question paraphrasing, and multiple-choice question answering. Finally, we conducted a human evaluation and show that using the semantic score significantly improves summarization results.
Anthology ID:
2021.emnlp-main.741
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
9391–9407
URL:
https://aclanthology.org/2021.emnlp-main.741
DOI:
10.18653/v1/2021.emnlp-main.741
Cite (ACL):
Alireza Salemi, Emad Kebriaei, Ghazal Neisi Minaei, and Azadeh Shakery. 2021. ARMAN: Pre-training with Semantically Selecting and Reordering of Sentences for Persian Abstractive Summarization. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 9391–9407, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
ARMAN: Pre-training with Semantically Selecting and Reordering of Sentences for Persian Abstractive Summarization (Salemi et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.741.pdf
Video:
https://aclanthology.org/2021.emnlp-main.741.mp4
Code
alirezasalemi7/arman
Data
CC100, ParsiNLU, PerKey, pn-summary