Multilingual Hierarchical Attention Networks for Document Classification

Nikolaos Pappas, Andrei Popescu-Belis


Abstract
Hierarchical attention networks have recently achieved remarkable performance for document classification in a given language. However, when multilingual document collections are considered, training such models separately for each language entails linear parameter growth and lack of cross-language transfer. Learning a single multilingual model with fewer parameters is therefore a challenging but potentially beneficial objective. To this end, we propose multilingual hierarchical attention networks for learning document structures, with shared encoders and/or shared attention mechanisms across languages, using multi-task learning and an aligned semantic space as input. We evaluate the proposed models on multilingual document classification with disjoint label sets, on a large dataset which we provide, with 600k news documents in 8 languages, and 5k labels. The multilingual models outperform monolingual ones in low-resource as well as full-resource settings, and use fewer parameters, thus confirming their computational efficiency and the utility of cross-language transfer.
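To make the described architecture concrete, below is a minimal sketch of the shared-attention variant: a hierarchical attention network whose word- and sentence-level attention layers are tied across languages, while the GRU encoders and the classifiers (one per language, since label sets are disjoint) remain language-specific, over aligned multilingual word embeddings. This is an illustrative PyTorch sketch, not the authors' implementation (their released code is in the idiap/mhan repository linked below); all module names and dimensions are assumptions.

```python
# Illustrative sketch (not the authors' code): hierarchical attention network
# with attention layers shared across languages and per-language encoders
# and classifiers. Inputs are pre-computed, cross-lingually aligned embeddings.
import torch
import torch.nn as nn


class Attention(nn.Module):
    """Additive attention pooling over a sequence of hidden states."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.context = nn.Linear(dim, 1, bias=False)

    def forward(self, h):                                   # h: (batch, seq, dim)
        scores = self.context(torch.tanh(self.proj(h)))     # (batch, seq, 1)
        alpha = torch.softmax(scores, dim=1)                 # attention weights
        return (alpha * h).sum(dim=1)                        # (batch, dim)


class MultilingualHAN(nn.Module):
    def __init__(self, languages, emb_dim, hid_dim, n_labels):
        super().__init__()
        # Language-specific GRU encoders at the word and sentence levels.
        self.word_enc = nn.ModuleDict(
            {l: nn.GRU(emb_dim, hid_dim, batch_first=True) for l in languages})
        self.sent_enc = nn.ModuleDict(
            {l: nn.GRU(hid_dim, hid_dim, batch_first=True) for l in languages})
        # Attention layers shared by all languages (the cross-lingual tying).
        self.word_attn = Attention(hid_dim)
        self.sent_attn = Attention(hid_dim)
        # Per-language classifiers, since label sets may be disjoint.
        self.classify = nn.ModuleDict(
            {l: nn.Linear(hid_dim, n_labels[l]) for l in languages})

    def forward(self, docs, lang):
        # docs: (batch, n_sents, n_words, emb_dim) pre-looked-up embeddings.
        b, s, w, e = docs.shape
        words, _ = self.word_enc[lang](docs.reshape(b * s, w, e))
        sent_vecs = self.word_attn(words).view(b, s, -1)     # sentence vectors
        sents, _ = self.sent_enc[lang](sent_vecs)
        doc_vec = self.sent_attn(sents)                      # document vector
        return self.classify[lang](doc_vec)                  # label scores


# Hypothetical usage: alternating mini-batches per language (multi-task
# learning) would let every language's gradients update the shared attention.
model = MultilingualHAN(["en", "de"], emb_dim=300, hid_dim=100,
                        n_labels={"en": 5000, "de": 1200})
docs = torch.randn(4, 6, 20, 300)      # 4 docs, 6 sentences, 20 words each
logits = model(docs, lang="en")        # shape: (4, 5000)
```

Sharing only the attention (or only the encoders) keeps most parameters language-specific while still allowing cross-language transfer through the tied components, which is the trade-off the paper evaluates.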
Anthology ID:
I17-1102
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Editors:
Greg Kondrak, Taro Watanabe
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
1015–1025
URL:
https://aclanthology.org/I17-1102
Cite (ACL):
Nikolaos Pappas and Andrei Popescu-Belis. 2017. Multilingual Hierarchical Attention Networks for Document Classification. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 1015–1025, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
Multilingual Hierarchical Attention Networks for Document Classification (Pappas & Popescu-Belis, IJCNLP 2017)
PDF:
https://aclanthology.org/I17-1102.pdf
Code
idiap/mhan (plus additional community code)