Joint Training with Semantic Role Labeling for Better Generalization in Natural Language Inference

Cemil Cengiz, Deniz Yuret


Abstract
End-to-end models trained on natural language inference (NLI) datasets show poor generalization on out-of-distribution evaluation sets. The models tend to learn shallow heuristics due to dataset biases, and their performance drops dramatically on diagnostic sets measuring compositionality or robustness against simple heuristics. Existing solutions to this problem employ dataset augmentation, which has the drawbacks of being applicable only to a limited set of adversaries and, at worst, hurting model performance on adversaries not included in the augmentation set. Instead, our proposed solution is to improve sentence understanding (and hence out-of-distribution generalization) with joint learning of explicit semantics. We show that a BERT-based model trained jointly on English semantic role labeling (SRL) and NLI achieves significantly higher performance on external evaluation sets measuring generalization performance.
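The joint setup described above can be pictured as a single shared BERT encoder feeding two task heads: a sentence-level classifier for NLI and a token-level tagger for SRL, trained together on batches from both tasks. The sketch below is an illustration of that general multi-task pattern in PyTorch, not the authors' implementation; the model name, label/tag counts, batch fields, and the equal-weight loss sum are all assumptions for the example.

```python
# Minimal sketch of joint NLI + SRL training with a shared BERT encoder.
# Illustrative only; hyperparameters, label counts, and batch fields are assumed.
import torch
import torch.nn as nn
from transformers import AutoModel

class JointNliSrlModel(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased",
                 num_nli_labels=3, num_srl_tags=20):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)  # shared encoder
        hidden = self.encoder.config.hidden_size
        # Sentence-level head for NLI (entailment / neutral / contradiction).
        self.nli_head = nn.Linear(hidden, num_nli_labels)
        # Token-level head for SRL (e.g. BIO-style argument tags).
        self.srl_head = nn.Linear(hidden, num_srl_tags)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]            # [CLS] vector for NLI
        return self.nli_head(pooled), self.srl_head(out.last_hidden_state)

def joint_step(model, nli_batch, srl_batch, optimizer):
    """One joint update: sum the NLI and SRL losses on a batch from each task."""
    ce = nn.CrossEntropyLoss(ignore_index=-100)         # -100 masks padding tokens
    nli_logits, _ = model(nli_batch["input_ids"], nli_batch["attention_mask"])
    _, srl_logits = model(srl_batch["input_ids"], srl_batch["attention_mask"])
    loss = ce(nli_logits, nli_batch["labels"]) + \
           ce(srl_logits.view(-1, srl_logits.size(-1)), srl_batch["tags"].view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this kind of setup, the shared encoder is pushed to represent predicate-argument structure (useful for SRL) while still solving NLI, which is one plausible way joint training with explicit semantics could improve out-of-distribution generalization.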
Anthology ID:
2020.repl4nlp-1.11
Volume:
Proceedings of the 5th Workshop on Representation Learning for NLP
Month:
July
Year:
2020
Address:
Online
Editors:
Spandana Gella, Johannes Welbl, Marek Rei, Fabio Petroni, Patrick Lewis, Emma Strubell, Minjoon Seo, Hannaneh Hajishirzi
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
78–88
URL:
https://aclanthology.org/2020.repl4nlp-1.11
DOI:
10.18653/v1/2020.repl4nlp-1.11
Cite (ACL):
Cemil Cengiz and Deniz Yuret. 2020. Joint Training with Semantic Role Labeling for Better Generalization in Natural Language Inference. In Proceedings of the 5th Workshop on Representation Learning for NLP, pages 78–88, Online. Association for Computational Linguistics.
Cite (Informal):
Joint Training with Semantic Role Labeling for Better Generalization in Natural Language Inference (Cengiz & Yuret, RepL4NLP 2020)
PDF:
https://aclanthology.org/2020.repl4nlp-1.11.pdf
Video:
http://slideslive.com/38929777