Fine-grained Entity Typing through Increased Discourse Context and Adaptive Classification Thresholds

Sheng Zhang, Kevin Duh, Benjamin Van Durme

Abstract
Fine-grained entity typing is the task of assigning fine-grained semantic types to entity mentions. We propose a neural architecture which learns a distributional semantic representation that leverages a greater amount of semantic context – both document and sentence level information – than prior work. We find that additional context improves performance, with further improvements gained by utilizing adaptive classification thresholds. Experiments show that our approach, without reliance on hand-crafted features, achieves state-of-the-art results on three benchmark datasets.
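The "adaptive classification thresholds" mentioned in the abstract refer, broadly, to replacing a single fixed decision cutoff (e.g., 0.5) with per-type cutoffs tuned on held-out data for multi-label type prediction. The sketch below is a minimal illustration of that general idea only; the function names, the candidate-threshold grid, and the use of NumPy are assumptions made for illustration and are not taken from the authors' implementation (see sheng-z/figet for the actual code).

import numpy as np

def tune_thresholds(dev_probs, dev_labels, candidates=np.linspace(0.05, 0.95, 19)):
    """Pick, for each type, the decision threshold that maximizes F1 on dev data.

    dev_probs:  (n_mentions, n_types) predicted probabilities  [hypothetical inputs]
    dev_labels: (n_mentions, n_types) binary gold labels
    """
    n_types = dev_probs.shape[1]
    thresholds = np.full(n_types, 0.5)  # fall back to the usual fixed cutoff
    for t in range(n_types):
        best_f1 = -1.0
        for c in candidates:
            pred = dev_probs[:, t] >= c
            gold = dev_labels[:, t].astype(bool)
            tp = np.sum(pred & gold)
            fp = np.sum(pred & ~gold)
            fn = np.sum(~pred & gold)
            denom = 2 * tp + fp + fn
            f1 = 2 * tp / denom if denom else 0.0
            if f1 > best_f1:
                best_f1, thresholds[t] = f1, c
    return thresholds

def predict_types(probs, thresholds):
    """Assign every type whose probability clears its per-type threshold."""
    return probs >= thresholds  # broadcasts over the type dimension

At inference time, a mention receives every type whose predicted probability clears that type's tuned threshold, rather than a single global cutoff.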
Anthology ID:
S18-2022
Volume:
Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Malvina Nissim, Jonathan Berant, Alessandro Lenci
Venue:
*SEM
SIGs:
SIGLEX | SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
173–179
URL:
https://aclanthology.org/S18-2022
DOI:
10.18653/v1/S18-2022
Cite (ACL):
Sheng Zhang, Kevin Duh, and Benjamin Van Durme. 2018. Fine-grained Entity Typing through Increased Discourse Context and Adaptive Classification Thresholds. In Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics, pages 173–179, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Fine-grained Entity Typing through Increased Discourse Context and Adaptive Classification Thresholds (Zhang et al., *SEM 2018)
PDF:
https://aclanthology.org/S18-2022.pdf
Code:
sheng-z/figet
Data:
FIGER