Calibration of Pre-trained Transformers

Shrey Desai, Greg Durrett


Abstract
Pre-trained Transformers are now ubiquitous in natural language processing, but despite their high end-task performance, little is known empirically about whether they are calibrated. Specifically, do these models’ posterior probabilities provide an accurate empirical measure of how likely the model is to be correct on a given example? We focus on BERT and RoBERTa in this work, and analyze their calibration across three tasks: natural language inference, paraphrase detection, and commonsense reasoning. For each task, we consider in-domain as well as challenging out-of-domain settings, where models face more examples they should be uncertain about. We show that: (1) when used out-of-the-box, pre-trained models are calibrated in-domain, and compared to baselines, their calibration error out-of-domain can be as much as 3.5x lower; (2) temperature scaling is effective at further reducing calibration error in-domain, and using label smoothing to deliberately increase empirical uncertainty helps calibrate posteriors out-of-domain.
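
The two quantities the abstract leans on, expected calibration error (ECE) and temperature scaling, can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch illustration, not the authors' released implementation (see the `shreydesai/calibration` repository for that); the function names and the equal-width binning scheme are assumptions for exposition.

```python
import torch

def expected_calibration_error(probs, labels, n_bins=10):
    # ECE sketch: bin predictions by confidence, then average the gap
    # between per-bin accuracy and per-bin mean confidence, weighted by
    # the fraction of examples falling in each bin. Binning details
    # (equal-width, 10 bins) are an assumption, not the paper's setup.
    confidences, predictions = probs.max(dim=1)
    accuracies = predictions.eq(labels).float()
    bin_edges = torch.linspace(0, 1, n_bins + 1)
    ece = 0.0
    for lower, upper in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lower) & (confidences <= upper)
        if in_bin.any():
            prop_in_bin = in_bin.float().mean()
            gap = (accuracies[in_bin].mean() - confidences[in_bin].mean()).abs()
            ece += (prop_in_bin * gap).item()
    return ece

def apply_temperature(logits, temperature):
    # Post-hoc temperature scaling: divide logits by a scalar T > 0
    # (fit on held-out data, typically by minimizing NLL) before the
    # softmax. T > 1 softens overconfident posteriors; T < 1 sharpens.
    return torch.softmax(logits / temperature, dim=1)
```

A perfectly calibrated model has ECE of zero: among examples predicted with, say, 80% confidence, 80% are correct. Temperature scaling leaves the argmax prediction unchanged, so it can reduce calibration error without affecting task accuracy.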
Anthology ID:
2020.emnlp-main.21
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
295–302
URL:
https://aclanthology.org/2020.emnlp-main.21
DOI:
10.18653/v1/2020.emnlp-main.21
Cite (ACL):
Shrey Desai and Greg Durrett. 2020. Calibration of Pre-trained Transformers. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 295–302, Online. Association for Computational Linguistics.
Cite (Informal):
Calibration of Pre-trained Transformers (Desai & Durrett, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.21.pdf
Video:
https://slideslive.com/38939157
Code:
shreydesai/calibration
Data:
HellaSwag, MultiNLI, SNLI, SWAG