Keep Learning: Self-supervised Meta-learning for Learning from Inference

Akhil Kedia, Sai Chetan Chinthakindi


Abstract
Many machine learning algorithms first perform self-supervised learning on large unlabeled data and then fine-tune on downstream tasks to further improve performance. A recent approach for language modelling, called dynamic evaluation, further fine-tunes a trained model during inference using the trivially present ground-truth labels, yielding a large improvement in performance. However, this approach does not easily extend to classification tasks, where ground-truth labels are absent during inference. We propose to solve this issue with self-training: back-propagating the loss from the model’s own class-balanced predictions (pseudo-labels), adapting the Reptile algorithm from meta-learning, combined with an inductive bias towards the pre-trained weights to improve generalization. Our method improves the performance of standard backbones such as BERT, Electra, and ResNet-50 on a wide variety of tasks: question answering on SQuAD and NewsQA, the SuperGLUE benchmark, conversation response selection on the Ubuntu Dialog corpus v2.0, and image classification on MNIST and ImageNet, without any changes to the underlying models. Our method outperforms previous approaches, enables self-supervised fine-tuning of any classifier during inference to better adapt to target domains, and is also effective in online and transfer-learning settings.
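The abstract describes the mechanism only at a high level. Below is a minimal PyTorch sketch of that idea (not the authors' code): at inference time, fine-tune a trained classifier on its own class-balanced pseudo-labels, regularize toward the pre-trained weights, and apply a Reptile-style interpolated update. All function names, hyperparameters, and the specific balancing/anchoring formulas are illustrative assumptions.

```python
# Sketch of self-supervised test-time adaptation with pseudo-labels,
# an L2 anchor to pre-trained weights, and a Reptile-style update.
# Assumes `model(inputs)` returns class logits. Illustrative only.
import copy
import torch
import torch.nn.functional as F


def adapt_on_batch(model, inputs, inner_steps=3, inner_lr=1e-5,
                   reptile_eps=0.5, anchor_weight=0.1):
    """One self-supervised adaptation step on an unlabeled batch."""
    # Snapshot of the pre-trained weights, used as an L2 anchor.
    anchor = [p.detach().clone() for p in model.parameters()]
    # Inner-loop copy of the model, trained on its own pseudo-labels.
    fast = copy.deepcopy(model)
    opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)

    for _ in range(inner_steps):
        logits = fast(inputs)
        pseudo = logits.argmax(dim=-1)  # model's own predictions as labels

        # Class-balanced weighting: down-weight over-predicted classes so
        # self-training does not collapse onto a single label.
        counts = torch.bincount(pseudo, minlength=logits.size(-1)).float()
        weights = 1.0 / counts.clamp(min=1.0)
        loss = F.cross_entropy(logits, pseudo, reduction="none")
        loss = (loss * weights[pseudo]).mean()

        # Inductive bias toward the pre-trained weights.
        for p, p0 in zip(fast.parameters(), anchor):
            loss = loss + anchor_weight * (p - p0).pow(2).sum()

        opt.zero_grad()
        loss.backward()
        opt.step()

    # Reptile-style update: move the deployed weights a fraction of the
    # way toward the adapted weights, rather than adopting them outright.
    with torch.no_grad():
        for p, pf in zip(model.parameters(), fast.parameters()):
            p.add_(reptile_eps * (pf - p))
    return model
```

Interpolating toward the adapted weights (rather than replacing them) is what makes the update Reptile-like: it keeps the deployed model close to its pre-trained solution while still drifting toward the target domain.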
Anthology ID:
2021.eacl-main.6
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
63–77
URL:
https://aclanthology.org/2021.eacl-main.6
DOI:
10.18653/v1/2021.eacl-main.6
Cite (ACL):
Akhil Kedia and Sai Chetan Chinthakindi. 2021. Keep Learning: Self-supervised Meta-learning for Learning from Inference. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 63–77, Online. Association for Computational Linguistics.
Cite (Informal):
Keep Learning: Self-supervised Meta-learning for Learning from Inference (Kedia & Chinthakindi, EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.6.pdf
Data
BoolQ, COPA, ImageNet, MultiRC, NewsQA, ReCoRD, SuperGLUE, UDC, WiC