A Hierarchical Neural Attention-based Text Classifier

Koustuv Sinha, Yue Dong, Jackie Chi Kit Cheung, Derek Ruths


Abstract
Deep neural networks have displayed superior performance over traditional supervised classifiers in text classification. They learn to extract useful features automatically when a sufficient amount of data is available. However, along with the growth in the number of documents comes an increase in the number of categories, which often results in poor performance of multiclass classifiers. In this work, we use external knowledge in the form of topic category taxonomies to aid classification by introducing a deep hierarchical neural attention-based classifier. Our model performs better than or comparably to state-of-the-art hierarchical models at significantly lower computational cost while maintaining high interpretability.
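A minimal sketch of the idea described in the abstract, not the authors' exact architecture: an attention-weighted document encoding drives a level-1 (parent) category prediction, and the level-2 (child) prediction is conditioned on an embedding of the predicted parent, following the taxonomy top-down. All class names, dimensions, and the 7/134 level sizes (WOS-style) are illustrative assumptions.

    import torch
    import torch.nn as nn

    class HierarchicalAttentionClassifier(nn.Module):
        """Two-level hierarchical attention classifier (sketch).

        A BiLSTM encodes the document, attention pools the hidden
        states into a document vector, a first head predicts the
        parent category, and the child head is conditioned on an
        embedding of the predicted parent.
        """

        def __init__(self, vocab_size, embed_dim, hidden_dim,
                     n_parents, n_children):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            self.encoder = nn.LSTM(embed_dim, hidden_dim,
                                   batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden_dim, 1)
            self.parent_head = nn.Linear(2 * hidden_dim, n_parents)
            self.parent_embed = nn.Embedding(n_parents, hidden_dim)
            self.child_head = nn.Linear(2 * hidden_dim + hidden_dim,
                                        n_children)

        def forward(self, tokens):
            # tokens: (batch, seq_len) integer word ids
            states, _ = self.encoder(self.embed(tokens))       # (B, T, 2H)
            weights = torch.softmax(self.attn(states), dim=1)  # (B, T, 1)
            doc = (weights * states).sum(dim=1)                # (B, 2H)
            parent_logits = self.parent_head(doc)              # level 1
            parent = parent_logits.argmax(dim=-1)
            # condition the level-2 prediction on the predicted parent
            child_in = torch.cat([doc, self.parent_embed(parent)], dim=-1)
            child_logits = self.child_head(child_in)
            return parent_logits, child_logits

    # Usage with random data; 7 parents / 134 children are illustrative
    model = HierarchicalAttentionClassifier(
        vocab_size=1000, embed_dim=32, hidden_dim=64,
        n_parents=7, n_children=134)
    tokens = torch.randint(0, 1000, (2, 50))
    parent_logits, child_logits = model(tokens)

Conditioning the child head on the parent prediction keeps the level-2 softmax informed by the taxonomy without requiring a separate classifier per parent, which is one way such a hierarchy can reduce computational cost relative to a flat multiclass model.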
Anthology ID:
D18-1094
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
817–823
URL:
https://aclanthology.org/D18-1094
DOI:
10.18653/v1/D18-1094
Cite (ACL):
Koustuv Sinha, Yue Dong, Jackie Chi Kit Cheung, and Derek Ruths. 2018. A Hierarchical Neural Attention-based Text Classifier. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 817–823, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Hierarchical Neural Attention-based Text Classifier (Sinha et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1094.pdf
Attachment:
D18-1094.Attachment.zip
Code
koustuvsinha/hier-class
Data
WOS