Scientific Keyphrase Identification and Classification by Pre-Trained Language Models Intermediate Task Transfer Learning

Seoyeon Park, Cornelia Caragea


Abstract
Scientific keyphrase identification and classification is the task of detecting keyphrases in scholarly text and classifying them with types from a set of predefined classes. This task has a wide range of benefits, but it remains challenging due to the lack of the large amounts of labeled data required to train deep neural models. To overcome this challenge, we explore the pre-trained language models BERT and SciBERT with intermediate task transfer learning, using 42 data-rich related intermediate-target task combinations. We show that intermediate task transfer learning on SciBERT induces a better starting point for target task fine-tuning than BERT and achieves competitive performance in scientific keyphrase identification and classification compared to both previous works and strong baselines. Interestingly, we observe that BERT with intermediate task transfer learning fails to improve performance on scientific keyphrase identification and classification, potentially due to significant catastrophic forgetting. This result highlights that the scientific knowledge acquired during the pre-training of language models on large scientific collections plays an important role in the target tasks. We also observe that sequence tagging related intermediate tasks, especially syntactic structure learning tasks such as POS tagging, tend to work best for scientific keyphrase identification and classification.
Anthology ID:
2020.coling-main.472
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5409–5419
URL:
https://aclanthology.org/2020.coling-main.472
DOI:
10.18653/v1/2020.coling-main.472
Cite (ACL):
Seoyeon Park and Cornelia Caragea. 2020. Scientific Keyphrase Identification and Classification by Pre-Trained Language Models Intermediate Task Transfer Learning. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5409–5419, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Scientific Keyphrase Identification and Classification by Pre-Trained Language Models Intermediate Task Transfer Learning (Park & Caragea, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.472.pdf
Data
MultiNLI, SemEval-2017 Task 10
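The intermediate task transfer learning described in the abstract can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: a single linear classifier stands in for BERT/SciBERT, the two synthetic binary tasks stand in for the data-rich intermediate task and the low-resource target task, and all names and data are invented for illustration. The core idea it shows is the paper's two-stage recipe: fine-tune first on the intermediate task, then use the resulting weights to initialize fine-tuning on the target task.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16

def train_logreg(W, b, X, y, lr=0.1, epochs=200):
    """Plain logistic-regression training by gradient descent."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid predictions
        grad = X.T @ (p - y) / len(y)            # gradient of the log loss
        W = W - lr * grad
        b = b - lr * np.mean(p - y)
    return W, b

def accuracy(W, b, X, y):
    p = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return float(np.mean((p > 0.5) == y))

# Two *related* binary tasks: labels depend on the same latent direction,
# mimicking an intermediate task that shares structure with the target.
direction = rng.normal(size=DIM)
def make_task(n):
    X = rng.normal(size=(n, DIM))
    y = (X @ direction > 0).astype(float)
    return X, y

X_inter, y_inter = make_task(2000)   # data-rich intermediate task
X_tgt, y_tgt = make_task(40)         # low-resource target task
X_test, y_test = make_task(500)      # held-out target-task evaluation

W0, b0 = rng.normal(size=DIM) * 0.01, 0.0

# Baseline: fine-tune on the target task directly.
W_base, b_base = train_logreg(W0.copy(), b0, X_tgt, y_tgt)

# Intermediate task transfer: intermediate task first, then target task,
# starting target fine-tuning from the intermediate-task weights.
W_int, b_int = train_logreg(W0.copy(), b0, X_inter, y_inter)
W_tr, b_tr = train_logreg(W_int, b_int, X_tgt, y_tgt)

acc_base = accuracy(W_base, b_base, X_test, y_test)
acc_tr = accuracy(W_tr, b_tr, X_test, y_test)
print(f"target-only accuracy: {acc_base:.2f}")
print(f"with transfer:        {acc_tr:.2f}")
```

Because the intermediate task supplies far more labeled data for a shared underlying structure, the transferred weights form a better starting point for target fine-tuning; the paper's negative BERT result corresponds to the case where this transfer instead erases useful pre-trained knowledge (catastrophic forgetting).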