Augmenting Named Entity Recognition with Commonsense Knowledge

Gaith Dekhili, Tan Ngoc Le, Fatiha Sadat


Abstract
Commonsense knowledge can be vital in applications such as Natural Language Understanding (NLU), where it is often needed to resolve ambiguity arising from implicit knowledge and underspecification. Despite the remarkable success of neural network approaches on a variety of Natural Language Processing tasks, many of them struggle in cases that require commonsense knowledge. In the present research, we take advantage of the open multilingual knowledge graph ConceptNet by using it as an additional external resource for Named Entity Recognition (NER). Our proposed architecture combines BiLSTM layers with a CRF layer, augmented with pre-trained word embeddings and dropout. Moreover, in addition to word representations, we also use character-based representations to capture morphological and orthographic information. Our experiments and evaluations showed an improvement of +2.86 in the overall F1-measure. Commonsense reasoning has been employed in other studies and NLP tasks, but to the best of our knowledge, no prior study has integrated a commonsense knowledge base into NER.
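The abstract describes augmenting token representations with ConceptNet-derived information before the BiLSTM-CRF layers. A minimal sketch of that idea is shown below; the tiny `KNOWLEDGE` dictionary is a hypothetical stand-in for real ConceptNet edge lookups, and the chosen feature concepts are illustrative assumptions, not the paper's actual feature set.

```python
# Hedged sketch: per-token commonsense features for NER, assuming a
# toy knowledge store in place of real ConceptNet queries.

# Hypothetical stand-in for ConceptNet: token -> set of (relation, concept).
KNOWLEDGE = {
    "paris":  {("IsA", "city"), ("PartOf", "france")},
    "google": {("IsA", "company")},
}

# Concepts whose presence we encode as binary features per token (assumed).
FEATURE_CONCEPTS = ["city", "company", "person"]

def commonsense_features(token):
    """Binary vector: 1 if the token has an IsA edge to the concept."""
    relations = KNOWLEDGE.get(token.lower(), set())
    isa_targets = {concept for (rel, concept) in relations if rel == "IsA"}
    return [1 if concept in isa_targets else 0 for concept in FEATURE_CONCEPTS]

def featurize(sentence):
    """Per-token feature vectors, to be concatenated with word and
    character embeddings before the BiLSTM-CRF layers."""
    return [commonsense_features(tok) for tok in sentence.split()]

print(featurize("Google opened an office in Paris"))
# -> [[0, 1, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0], [0, 0, 0], [1, 0, 0]]
```

In a full system these vectors would extend each token's embedding, leaving the rest of the BiLSTM-CRF architecture unchanged.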
Anthology ID:
W19-3644
Volume:
Proceedings of the 2019 Workshop on Widening NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Amittai Axelrod, Diyi Yang, Rossana Cunha, Samira Shaikh, Zeerak Waseem
Venue:
WiNLP
Publisher:
Association for Computational Linguistics
Pages:
142
URL:
https://aclanthology.org/W19-3644
Cite (ACL):
Gaith Dekhili, Tan Ngoc Le, and Fatiha Sadat. 2019. Augmenting Named Entity Recognition with Commonsense Knowledge. In Proceedings of the 2019 Workshop on Widening NLP, page 142, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Augmenting Named Entity Recognition with Commonsense Knowledge (Dekhili et al., WiNLP 2019)