Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

Junnan Zhu, Yu Zhou, Jiajun Zhang, Chengqing Zong


Abstract
Cross-lingual summarization aims at summarizing a document in one language (e.g., Chinese) into another language (e.g., English). In this paper, we propose a novel method inspired by the translation pattern in the process of obtaining a cross-lingual summary. We first attend to some words in the source text, then translate them into the target language, and summarize them to obtain the final summary. Specifically, we first employ the encoder-decoder attention distribution to attend to the source words. Second, we present three strategies to acquire the translation probability, which helps obtain the translation candidates for each source word. Finally, each summary word is generated either from the neural distribution or from the translation candidates of source words. Experimental results on Chinese-to-English and English-to-Chinese summarization tasks show that our proposed method significantly outperforms the baselines, achieving performance comparable to the state-of-the-art.
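The generation step described in the abstract resembles a pointer-generator-style soft switch: at each decoding step, the model either emits a word from the neural vocabulary distribution or copies a translation candidate of an attended source word, weighted by the encoder-decoder attention and the translation probability. The sketch below illustrates this mixing idea only; it is not the authors' implementation, and all names (mix_distributions, trans_probs, p_gen, etc.) and the PyTorch framing are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def mix_distributions(vocab_logits, attention, trans_probs, trans_candidates, p_gen):
    """Hypothetical sketch of combining a neural vocabulary distribution with
    translation candidates of attended source words (not the paper's code).

    vocab_logits:     (batch, vocab_size)   decoder output logits
    attention:        (batch, src_len)      encoder-decoder attention weights
    trans_probs:      (batch, src_len, k)   translation probability of each source
                                            word's top-k target-language candidates
    trans_candidates: (batch, src_len, k)   target-vocabulary ids of those candidates
    p_gen:            (batch, 1)            probability of generating from the
                                            neural distribution rather than translating
    """
    batch = attention.size(0)

    # Neural (generation) distribution over the target vocabulary.
    neural_dist = F.softmax(vocab_logits, dim=-1)

    # Translation distribution: weight each candidate by the attention paid to its
    # source word and by its translation probability, then scatter-add the weights
    # into a target-vocabulary-sized distribution.
    weights = attention.unsqueeze(-1) * trans_probs            # (batch, src_len, k)
    trans_dist = torch.zeros_like(neural_dist)
    trans_dist.scatter_add_(1,
                            trans_candidates.view(batch, -1),
                            weights.view(batch, -1))

    # Final distribution: soft switch between generating and translating.
    return p_gen * neural_dist + (1.0 - p_gen) * trans_dist
```

Each summary word would then be drawn from this mixed distribution, so source words with strong attention and confident translation candidates can surface directly in the target-language summary.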
Anthology ID:
2020.acl-main.121
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1309–1321
URL:
https://aclanthology.org/2020.acl-main.121
DOI:
10.18653/v1/2020.acl-main.121
Cite (ACL):
Junnan Zhu, Yu Zhou, Jiajun Zhang, and Chengqing Zong. 2020. Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 1309–1321, Online. Association for Computational Linguistics.
Cite (Informal):
Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization (Zhu et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.121.pdf
Video:
http://slideslive.com/38928726