Identifying inherent disagreement in natural language inference

Xinliang Frederick Zhang, Marie-Catherine de Marneffe


Abstract
Natural language inference (NLI) is the task of determining whether a piece of text is entailed by, contradicted by, or unrelated to another piece of text. In this paper, we investigate how to tease systematic inferences (i.e., items for which people agree on the NLI label) apart from disagreement items (i.e., items that lead to different annotations), which most prior work has overlooked. To distinguish systematic inferences from disagreement items, we propose Artificial Annotators (AAs) to simulate the uncertainty in the annotation process by capturing the modes in annotations. Results on the CommitmentBank, a corpus of naturally occurring discourses in English, confirm that our approach performs statistically significantly better than all baselines. We further show that AAs learn linguistic patterns and context-dependent reasoning.
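The distinction the abstract draws, systematic inferences (annotators converge on one NLI label) versus disagreement items (annotations split across labels), can be illustrated at the data level with a minimal sketch. This is not the paper's AAs model; the function name, label strings, and 0.8 dominance threshold are hypothetical and serve only to show what "capturing the modes in annotations" refers to.

```python
# Illustrative sketch only (not the paper's AA model): flag an NLI item as a
# "disagreement item" when no single label clearly dominates its annotations,
# and as a "systematic inference" when annotators converge on one label.
from collections import Counter

def is_disagreement_item(annotations, dominance_threshold=0.8):
    """Return True if no single NLI label clearly dominates the annotations.

    `annotations` is a list of labels such as ["entailment", "neutral", ...];
    the 0.8 threshold is a hypothetical cutoff, not a value from the paper.
    """
    counts = Counter(annotations)
    _top_label, top_count = counts.most_common(1)[0]
    return top_count / len(annotations) < dominance_threshold

# Example: CommitmentBank-style items annotated by several people.
print(is_disagreement_item(["entailment"] * 8))                    # False: systematic inference
print(is_disagreement_item(["entailment"] * 4 + ["neutral"] * 4))  # True: inherent disagreement
```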
Anthology ID:
2021.naacl-main.390
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4908–4915
URL:
https://aclanthology.org/2021.naacl-main.390
DOI:
10.18653/v1/2021.naacl-main.390
Cite (ACL):
Xinliang Frederick Zhang and Marie-Catherine de Marneffe. 2021. Identifying inherent disagreement in natural language inference. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4908–4915, Online. Association for Computational Linguistics.
Cite (Informal):
Identifying inherent disagreement in natural language inference (Zhang & de Marneffe, NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.390.pdf
Video:
https://aclanthology.org/2021.naacl-main.390.mp4