Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization

Yue Cao, Hui Liu, Xiaojun Wan

Abstract
Cross-lingual summarization is the task of generating a summary in one language given a text in a different language. Previous work on cross-lingual summarization has mainly focused on pipeline methods or on training an end-to-end model with translated parallel data. However, learning cross-lingual summarization directly is challenging, because the model must learn to understand different languages and learn how to summarize at the same time. In this paper, we propose to ease cross-lingual summarization training by jointly learning to align and summarize. We design relevant loss functions to train this framework and propose several methods to enhance the isomorphism and cross-lingual transfer between languages. Experimental results show that our model outperforms competitive models in most cases. In addition, we show that our model can even generate cross-lingual summaries without access to any cross-lingual corpus.
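The abstract does not include implementation details, but a minimal sketch of the joint training idea, a summarization loss combined with a weighted alignment loss over parallel texts, may help make the setup concrete. The tiny Transformer below, the cosine-based alignment term, and the weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: the model size, the cosine alignment objective,
# and the loss weighting are assumptions, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAlignSummarize(nn.Module):
    """A toy encoder-decoder whose encoder is shared across languages."""
    def __init__(self, vocab_size=32000, d_model=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=2,
        )
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=2,
        )
        self.out_proj = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_in_ids):
        memory = self.encoder(self.embed(src_ids))
        causal = nn.Transformer.generate_square_subsequent_mask(tgt_in_ids.size(1))
        dec = self.decoder(self.embed(tgt_in_ids), memory, tgt_mask=causal)
        return self.out_proj(dec), memory

def joint_loss(model, src_ids, par_ids, tgt_in_ids, tgt_out_ids, lam=0.5):
    """Summarization cross-entropy plus an alignment term that pulls the
    pooled encodings of parallel source/target-language texts together
    (a common alignment objective, assumed here for illustration)."""
    logits, src_mem = model(src_ids, tgt_in_ids)
    sum_loss = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt_out_ids.reshape(-1)
    )
    # Encode the parallel text in the other language with the same encoder.
    par_mem = model.encoder(model.embed(par_ids))
    align_loss = 1 - F.cosine_similarity(
        src_mem.mean(dim=1), par_mem.mean(dim=1)
    ).mean()
    return sum_loss + lam * align_loss

# Usage with random token ids, just to show the shapes involved.
model = JointAlignSummarize()
B, S, T = 2, 16, 8
loss = joint_loss(
    model,
    torch.randint(0, 32000, (B, S)),   # source-language article
    torch.randint(0, 32000, (B, S)),   # its parallel text in the target language
    torch.randint(0, 32000, (B, T)),   # summary tokens, shifted right
    torch.randint(0, 32000, (B, T)),   # summary tokens, gold targets
)
loss.backward()
```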
Anthology ID:
2020.acl-main.554
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6220–6231
URL:
https://aclanthology.org/2020.acl-main.554
DOI:
10.18653/v1/2020.acl-main.554
Cite (ACL):
Yue Cao, Hui Liu, and Xiaojun Wan. 2020. Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6220–6231, Online. Association for Computational Linguistics.
Cite (Informal):
Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization (Cao et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.554.pdf
Video:
http://slideslive.com/38928847