A Deep Neural Network Sentence Level Classification Method with Context Information

Xingyi Song, Johann Petrak, Angus Roberts


Abstract
In the sentence classification task, context formed from sentences adjacent to the sentence being classified can provide important information for classification. This context is, however, often ignored. Where methods do make use of context, only small amounts are considered, making it difficult to scale. We present a new method for sentence classification, Context-LSTM-CNN, that makes use of potentially large contexts. The method also utilizes long-range dependencies within the sentence being classified, using an LSTM, and short-span features, using a stacked CNN. Our experiments demonstrate that this approach consistently improves over previous methods on two different datasets.
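To make the abstract's description concrete, below is a minimal, hypothetical PyTorch sketch of a sentence classifier that combines an LSTM over the focus sentence (long-range dependencies), convolutional features over the same sentence (short-span features), and encodings of the adjacent context sentences. The layer sizes, the use of parallel multi-width convolutions in place of the paper's stacked CNN, the LSTM-based context encoder, and the concatenation-based fusion are all illustrative assumptions, not the authors' exact Context-LSTM-CNN model (see the linked PDF for the actual architecture).

```python
# Hypothetical sketch of a context-aware LSTM+CNN sentence classifier.
# Layer sizes, the context encoder, and the fusion strategy are assumptions.
import torch
import torch.nn as nn


class ContextLSTMCNN(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, lstm_dim=128,
                 cnn_filters=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)

        # LSTM over the focus sentence captures long-range dependencies.
        self.focus_lstm = nn.LSTM(emb_dim, lstm_dim, batch_first=True,
                                  bidirectional=True)

        # Parallel 1-D convolutions capture short-span n-gram features
        # (standing in for the paper's stacked CNN).
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, cnn_filters, k, padding=k // 2)
             for k in kernel_sizes]
        )

        # Context windows are summarised with a shared LSTM; the paper's
        # exact context encoder may differ.
        self.context_lstm = nn.LSTM(emb_dim, lstm_dim, batch_first=True,
                                    bidirectional=True)

        fused_dim = (2 * lstm_dim                      # focus LSTM
                     + cnn_filters * len(kernel_sizes) # focus CNN
                     + 2 * 2 * lstm_dim)               # left + right context
        self.classifier = nn.Sequential(
            nn.Dropout(0.5),
            nn.Linear(fused_dim, num_classes),
        )

    def _encode_context(self, ids):
        # ids: (batch, seq_len) token ids of one concatenated context window.
        _, (h, _) = self.context_lstm(self.embed(ids))
        return torch.cat([h[-2], h[-1]], dim=-1)       # (batch, 2*lstm_dim)

    def forward(self, focus_ids, left_ids, right_ids):
        emb = self.embed(focus_ids)                    # (batch, len, emb)

        # Long-range features from the bidirectional LSTM's final states.
        _, (h, _) = self.focus_lstm(emb)
        lstm_feat = torch.cat([h[-2], h[-1]], dim=-1)  # (batch, 2*lstm_dim)

        # Short-span features via convolution + max-over-time pooling.
        conv_in = emb.transpose(1, 2)                  # (batch, emb, len)
        cnn_feat = torch.cat(
            [torch.relu(c(conv_in)).max(dim=2).values for c in self.convs],
            dim=-1)                                    # (batch, filters * K)

        left = self._encode_context(left_ids)
        right = self._encode_context(right_ids)

        fused = torch.cat([lstm_feat, cnn_feat, left, right], dim=-1)
        return self.classifier(fused)


if __name__ == "__main__":
    # Toy forward pass with random token ids, checking output shapes only.
    model = ContextLSTMCNN(vocab_size=5000)
    focus = torch.randint(1, 5000, (4, 20))
    left = torch.randint(1, 5000, (4, 60))
    right = torch.randint(1, 5000, (4, 60))
    print(model(focus, left, right).shape)             # torch.Size([4, 2])
```

Training such a model would follow the usual supervised setup (cross-entropy loss over sentence labels); the toy forward pass above only verifies tensor shapes.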
Anthology ID:
D18-1107
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
900–904
URL:
https://aclanthology.org/D18-1107
DOI:
10.18653/v1/D18-1107
Cite (ACL):
Xingyi Song, Johann Petrak, and Angus Roberts. 2018. A Deep Neural Network Sentence Level Classification Method with Context Information. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 900–904, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Deep Neural Network Sentence Level Classification Method with Context Information (Song et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1107.pdf
Video:
https://aclanthology.org/D18-1107.mp4
Data:
IEMOCAP