Exploring Different Dimensions of Attention for Uncertainty Detection

Heike Adel, Hinrich Schütze


Abstract
Neural networks with attention have proven effective for many natural language processing tasks. In this paper, we develop attention mechanisms for uncertainty detection. In particular, we generalize standardly used attention mechanisms by introducing external attention and sequence-preserving attention. These novel architectures differ from standard approaches in that they use external resources to compute attention weights and preserve sequence information. We compare them to other configurations along different dimensions of attention. Our novel architectures set the new state of the art on a Wikipedia benchmark dataset and perform similarly to the state-of-the-art model, which uses a large set of linguistic features, on a biomedical benchmark.
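The abstract contrasts standard attention, which scores each position from the model's own hidden states, with external attention, which derives the scores from outside resources. The following is a minimal sketch of that contrast, not the authors' implementation: the scoring vector, the toy hedge-cue lexicon, and all function names are illustrative assumptions, with a lexicon-membership indicator standing in for whatever external resource supplies the scores.

```python
# Sketch only: contrasts internally computed attention scores with scores
# taken from an external resource (here, a toy uncertainty-cue lexicon).
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def standard_attention(H, w):
    """H: (T, d) hidden states; w: (d,) learned scoring vector.
    Scores come from the hidden states themselves."""
    scores = H @ w              # one internal score per time step
    alpha = softmax(scores)     # attention weights over the sequence
    return alpha @ H            # weighted average of hidden states

def external_attention(H, ext_scores):
    """ext_scores: (T,) per-token scores from an external resource,
    e.g. membership in a hedge-cue lexicon, rather than from H."""
    alpha = softmax(ext_scores)
    return alpha @ H

T, d = 5, 4
rng = np.random.default_rng(0)
H = rng.normal(size=(T, d))
tokens = ["the", "results", "may", "suggest", "improvement"]
cue_lexicon = {"may", "might", "suggest"}   # hypothetical cue list
ext = np.array([1.0 if t in cue_lexicon else 0.0 for t in tokens])

print(standard_attention(H, rng.normal(size=d)))
print(external_attention(H, ext))
```

In this sketch the external weights concentrate attention on the hedge cues "may" and "suggest" regardless of what the hidden states encode, which is the intuition behind computing attention weights from external resources.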
Anthology ID:
E17-1003
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Mirella Lapata, Phil Blunsom, Alexander Koller
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
22–34
URL:
https://aclanthology.org/E17-1003
Cite (ACL):
Heike Adel and Hinrich Schütze. 2017. Exploring Different Dimensions of Attention for Uncertainty Detection. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 1, Long Papers, pages 22–34, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Exploring Different Dimensions of Attention for Uncertainty Detection (Adel & Schütze, EACL 2017)
PDF:
https://aclanthology.org/E17-1003.pdf
Data:
SST, SST-2