Gorynych Transformer at SemEval-2020 Task 6: Multi-task Learning for Definition Extraction

Adis Davletov, Nikolay Arefyev, Alexander Shatilov, Denis Gordeev, Alexey Rey


Abstract
This paper describes our approach to the “DeftEval: Extracting Definitions from Free Text in Textbooks” competition held as part of SemEval-2020. The task was devoted to finding and labeling definitions in texts. DeftEval was split into three subtasks: sentence classification, sequence labeling, and relation classification. Our solution ranked 5th in the first subtask, 23rd in the second, and 21st in the third. We applied simultaneous multi-task learning with Transformer-based models for subtasks 1 and 3, and a single BERT-based model for named entity recognition.
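The abstract describes the multi-task setup only at a high level. As a purely illustrative sketch (not the authors' code), a shared Transformer encoder with one classification head per subtask, trained jointly, might look as follows; the model name and label counts are assumptions chosen for the example.

```python
import torch.nn as nn
from transformers import AutoModel


class MultiTaskDefinitionModel(nn.Module):
    """Hypothetical multi-task model: a shared Transformer encoder with
    separate heads for sentence classification (subtask 1) and relation
    classification (subtask 3), trained simultaneously."""

    def __init__(self, model_name="bert-base-cased",
                 num_sentence_labels=2, num_relation_labels=7):
        # model_name and label counts are illustrative assumptions
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.sentence_head = nn.Linear(hidden, num_sentence_labels)
        self.relation_head = nn.Linear(hidden, num_relation_labels)

    def forward(self, input_ids, attention_mask, task):
        # Use the [CLS] token representation as a pooled summary of the input.
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = outputs.last_hidden_state[:, 0]
        if task == "sentence":
            return self.sentence_head(cls)
        return self.relation_head(cls)
```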
Anthology ID:
2020.semeval-1.59
Volume:
Proceedings of the Fourteenth Workshop on Semantic Evaluation
Month:
December
Year:
2020
Address:
Barcelona (online)
Editors:
Aurelie Herbelot, Xiaodan Zhu, Alexis Palmer, Nathan Schneider, Jonathan May, Ekaterina Shutova
Venue:
SemEval
SIG:
SIGLEX
Publisher:
International Committee for Computational Linguistics
Pages:
487–493
URL:
https://aclanthology.org/2020.semeval-1.59
DOI:
10.18653/v1/2020.semeval-1.59
Cite (ACL):
Adis Davletov, Nikolay Arefyev, Alexander Shatilov, Denis Gordeev, and Alexey Rey. 2020. Gorynych Transformer at SemEval-2020 Task 6: Multi-task Learning for Definition Extraction. In Proceedings of the Fourteenth Workshop on Semantic Evaluation, pages 487–493, Barcelona (online). International Committee for Computational Linguistics.
Cite (Informal):
Gorynych Transformer at SemEval-2020 Task 6: Multi-task Learning for Definition Extraction (Davletov et al., SemEval 2020)
PDF:
https://aclanthology.org/2020.semeval-1.59.pdf