Inferring symmetry in natural language

Chelsea Tanchip, Lei Yu, Aotao Xu, Yang Xu


Abstract
We present a methodological framework for inferring symmetry of verb predicates in natural language. Empirical work on predicate symmetry has taken two main approaches. The feature-based approach focuses on linguistic features pertaining to symmetry. The context-based approach denies the existence of absolute symmetry but instead argues that such inference is context dependent. We develop methods that formalize these approaches and evaluate them against a novel symmetry inference sentence (SIS) dataset comprising 400 naturalistic usages of literature-informed verbs spanning the spectrum of symmetry-asymmetry. Our results show that a hybrid transfer learning model that integrates linguistic features with contextualized language models most faithfully predicts the empirical data. Our work integrates existing approaches to symmetry in natural language and suggests how symmetry inference can improve systematicity in state-of-the-art language models.
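The hybrid approach described in the abstract can be sketched in outline: hand-coded linguistic features are concatenated with a contextualized sentence representation, and the combined vector is scored by a linear classifier. The feature names, the stand-in encoder, and all weights below are illustrative assumptions, not the paper's implementation (in the paper, the contextualized representation comes from a pretrained language model).

```python
# Minimal sketch of a hybrid symmetry classifier, assuming:
# - two toy symmetry-related linguistic features (reciprocal marker,
#   conjoined-subject cue), and
# - a deterministic stand-in for a contextualized encoder.
# None of these specifics come from the paper itself.
import math

def linguistic_features(sentence: str) -> list[float]:
    """Toy symmetry-related features (assumed, not from the paper)."""
    lower = sentence.lower()
    tokens = lower.split()
    return [
        1.0 if "each other" in lower else 0.0,  # reciprocal marker
        1.0 if "and" in tokens else 0.0,        # conjoined-subject cue
    ]

def contextual_embedding(sentence: str, dim: int = 4) -> list[float]:
    """Stand-in for a contextualized encoder (e.g., a pretrained LM).
    Here: a deterministic bag-of-characters vector, purely for illustration."""
    vec = [0.0] * dim
    for tok in sentence.lower().split():
        vec[sum(map(ord, tok)) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def symmetry_score(sentence: str, weights: list[float], bias: float) -> float:
    """Logistic score over the concatenated feature vector."""
    x = contextual_embedding(sentence) + linguistic_features(sentence)
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Toy weights: the last two entries weight the linguistic features.
w = [0.1, 0.1, 0.1, 0.1, 2.0, 0.5]
score = symmetry_score("John and Mary married each other", w, -1.0)
```

In this sketch, a sentence with reciprocal cues receives a higher symmetry score than one without them; in the actual model, both the encoder and the classifier weights are learned rather than hand-set.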
Anthology ID:
2020.findings-emnlp.259
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2877–2886
URL:
https://aclanthology.org/2020.findings-emnlp.259
DOI:
10.18653/v1/2020.findings-emnlp.259
Cite (ACL):
Chelsea Tanchip, Lei Yu, Aotao Xu, and Yang Xu. 2020. Inferring symmetry in natural language. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2877–2886, Online. Association for Computational Linguistics.
Cite (Informal):
Inferring symmetry in natural language (Tanchip et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.259.pdf
Code
jadeleiyu/symmetry_inference
Data
SIS