Mixing Context Granularities for Improved Entity Linking on Question Answering Data across Entity Categories

Daniil Sorokin, Iryna Gurevych


Abstract
The first stage of every knowledge base question answering approach is to link entities in the input question. We investigate entity linking in the context of the question answering task and present a jointly optimized neural architecture for entity mention detection and entity disambiguation that models the surrounding context on different levels of granularity. We use the Wikidata knowledge base and available question answering datasets to create benchmarks for entity linking on question answering data. Our approach outperforms the previous state-of-the-art system on this data, resulting in an average 8% improvement in the final score. We further demonstrate that our model delivers strong performance across different entity categories.
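The abstract describes the architecture only at a high level. Purely as an illustration, and not the authors' actual model (see the linked UKPLab repository for that), the sketch below shows one way mixing context granularities can look: a character-level encoding of the mention, a token-level encoding of the question, and knowledge-base-level features of a candidate entity are encoded separately and combined into a single candidate score. Every module name, dimension, and the combination scheme here are assumptions.

import torch
import torch.nn as nn

class MixedGranularityRanker(nn.Module):
    """Hypothetical sketch of mixing context granularities for entity
    linking; not the architecture from the paper."""

    def __init__(self, char_vocab=100, word_vocab=10000, kb_feats=16, dim=64):
        super().__init__()
        # Character-level view of the entity mention string.
        self.char_emb = nn.Embedding(char_vocab, dim)
        self.char_enc = nn.Conv1d(dim, dim, kernel_size=3, padding=1)
        # Token-level view of the whole question.
        self.word_emb = nn.Embedding(word_vocab, dim)
        self.word_enc = nn.GRU(dim, dim, batch_first=True)
        # Knowledge-base-level features of the candidate entity.
        self.kb_proj = nn.Linear(kb_feats, dim)
        # Mix all granularities into one plausibility score.
        self.scorer = nn.Linear(3 * dim, 1)

    def forward(self, char_ids, word_ids, kb_vec):
        # (batch, chars) -> (batch, dim) via convolution + max pooling.
        c = self.char_enc(self.char_emb(char_ids).transpose(1, 2)).max(dim=2).values
        # (batch, tokens) -> (batch, dim) via the final GRU hidden state.
        _, h = self.word_enc(self.word_emb(word_ids))
        # (batch, kb_feats) -> (batch, dim).
        k = torch.relu(self.kb_proj(kb_vec))
        return self.scorer(torch.cat([c, h.squeeze(0), k], dim=1))

# Toy usage: score one candidate entity for one question.
ranker = MixedGranularityRanker()
score = ranker(
    torch.randint(0, 100, (1, 12)),    # character ids of the mention
    torch.randint(0, 10000, (1, 8)),   # token ids of the question
    torch.rand(1, 16),                 # KB features of the candidate
)
print(score.shape)  # torch.Size([1, 1])

In a complete linker, each candidate retrieved from the knowledge base would receive such a score, and candidates would be ranked by it.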
Anthology ID: S18-2007
Volume: Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics
Month: June
Year: 2018
Address: New Orleans, Louisiana
Editors: Malvina Nissim, Jonathan Berant, Alessandro Lenci
Venue: *SEM
SIGs: SIGLEX | SIGSEM
Publisher: Association for Computational Linguistics
Pages: 65–75
URL: https://aclanthology.org/S18-2007
DOI: 10.18653/v1/S18-2007
Bibkey: sorokin-gurevych-2018-mixing
Cite (ACL): Daniil Sorokin and Iryna Gurevych. 2018. Mixing Context Granularities for Improved Entity Linking on Question Answering Data across Entity Categories. In Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics, pages 65–75, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal): Mixing Context Granularities for Improved Entity Linking on Question Answering Data across Entity Categories (Sorokin & Gurevych, *SEM 2018)
PDF: https://aclanthology.org/S18-2007.pdf
Code: UKPLab/starsem2018-entity-linking
Data: GraphQuestions, WebQuestions, WebQuestionsSP