Compositional and Lexical Semantics in RoBERTa, BERT and DistilBERT: A Case Study on CoQA

Ieva Staliūnaitė, Ignacio Iacobacci
Abstract
Many NLP tasks have benefited from transferring knowledge from contextualized word embeddings; however, the picture of what type of knowledge is transferred is incomplete. This paper studies the types of linguistic phenomena accounted for by language models in the context of a Conversational Question Answering (CoQA) task. Through systematic error analysis we identify the problematic areas for the fine-tuned RoBERTa, BERT and DistilBERT models: basic arithmetic (counting phrases), compositional semantics (negation and Semantic Role Labeling), and lexical semantics (surprisal and antonymy). When enhanced with the relevant linguistic knowledge through multitask learning, the models improve in performance. Ensembles of the enhanced models yield a boost of between 2.2 and 2.7 points in overall F1 score, and up to 42.1 points in F1 on the hardest question classes. The results show differences between RoBERTa, BERT and DistilBERT in their ability to represent compositional and lexical information.
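
The multitask setup the abstract describes can be pictured roughly as follows. This is a minimal sketch, not the authors' code: a shared pretrained encoder feeds both a QA span-prediction head and an auxiliary token-level linguistic head (e.g., SRL tags), and the two losses are combined during fine-tuning. The encoder name, head sizes, number of auxiliary labels, and loss weight are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import AutoModel

class MultitaskQA(nn.Module):
    """Shared encoder with a QA span head and an auxiliary tagging head."""

    def __init__(self, encoder_name="distilbert-base-uncased", num_aux_labels=20):
        super().__init__()
        # Shared contextualized encoder (encoder_name is an assumption).
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        self.qa_head = nn.Linear(hidden, 2)             # start/end span logits
        self.aux_head = nn.Linear(hidden, num_aux_labels)  # per-token linguistic tags

    def forward(self, input_ids, attention_mask):
        states = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        start_logits, end_logits = self.qa_head(states).split(1, dim=-1)
        return (start_logits.squeeze(-1),   # (batch, seq_len)
                end_logits.squeeze(-1),     # (batch, seq_len)
                self.aux_head(states))      # (batch, seq_len, num_aux_labels)

def multitask_loss(start_logits, end_logits, aux_logits,
                   start_pos, end_pos, aux_labels, aux_weight=0.5):
    """Main QA loss plus a down-weighted auxiliary loss (weight is an assumption)."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)
    qa_loss = (ce(start_logits, start_pos) + ce(end_logits, end_pos)) / 2
    # CrossEntropyLoss expects (batch, classes, seq_len) for per-token targets.
    aux_loss = ce(aux_logits.transpose(1, 2), aux_labels)
    return qa_loss + aux_weight * aux_loss

During training, batches carrying auxiliary annotations contribute both terms, while plain QA batches contribute only the span loss; at test time the auxiliary head is simply discarded.
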
Anthology ID:
2020.emnlp-main.573
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7046–7056
URL:
https://aclanthology.org/2020.emnlp-main.573
DOI:
10.18653/v1/2020.emnlp-main.573
Cite (ACL):
Ieva Staliūnaitė and Ignacio Iacobacci. 2020. Compositional and Lexical Semantics in RoBERTa, BERT and DistilBERT: A Case Study on CoQA. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7046–7056, Online. Association for Computational Linguistics.
Cite (Informal):
Compositional and Lexical Semantics in RoBERTa, BERT and DistilBERT: A Case Study on CoQA (Staliūnaitė & Iacobacci, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.573.pdf
Video:
https://slideslive.com/38939019
Data
CoQA