Understanding Unnatural Questions Improves Reasoning over Text

Xiaoyu Guo, Yuan-Fang Li, Gholamreza Haffari


Abstract
Complex question answering (CQA) over raw text is a challenging task. A prominent approach is based on the programmer-interpreter framework, in which a programmer maps the question into a sequence of reasoning actions and an interpreter then executes these actions on the raw text. Learning an effective CQA model requires large amounts of human-annotated data consisting of ground-truth sequences of reasoning actions, which are time-consuming and expensive to collect at scale. In this paper, we address the challenge of learning a high-quality programmer (parser) by projecting natural human-generated questions onto unnatural machine-generated questions, which are more convenient to parse. We first generate synthetic (question, action sequence) pairs with a data generator and train a semantic parser that associates synthetic questions with their corresponding action sequences. To handle the diversity of natural questions, we learn a projection model that maps each natural question to the most similar unnatural question on which the parser works well. Without any natural training data, our projection model provides high-quality action sequences for the CQA task. Experimental results show that the QA model trained exclusively with synthetic data outperforms its state-of-the-art counterpart trained on human-labeled data.
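The pipeline described in the abstract (a synthetic data generator, a semantic parser trained on unnatural questions, and a projection model from natural to unnatural questions) can be illustrated with a rough Python sketch. The templates, action names, string-similarity projection, and every identifier below are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the projection idea, under assumed templates and actions.
from difflib import SequenceMatcher

# 1) Synthetic (question, action sequence) pairs from a hypothetical data generator.
SYNTHETIC_DATA = [
    ("how many yards was the longest field goal", ["FIND(field goal)", "MAX(yards)"]),
    ("how many yards was the shortest touchdown", ["FIND(touchdown)", "MIN(yards)"]),
    ("who kicked the longest field goal", ["FIND(field goal)", "MAX(yards)", "ARG(kicker)"]),
]

# 2) Stand-in "parser" over unnatural questions: a lookup table here, in place of
#    a trained semantic parser that maps synthetic questions to action sequences.
PARSER = {question: actions for question, actions in SYNTHETIC_DATA}

def project(natural_question: str) -> str:
    """Map a natural question to its most similar unnatural question.
    Plain string similarity stands in for the learned projection model."""
    return max(
        PARSER,
        key=lambda q: SequenceMatcher(None, natural_question.lower(), q).ratio(),
    )

def action_sequence(natural_question: str) -> list:
    """Return the action sequence of the projected unnatural question."""
    return PARSER[project(natural_question)]

if __name__ == "__main__":
    q = "What was the length in yards of the longest field goal?"
    print(project(q))          # nearest synthetic question
    print(action_sequence(q))  # reasoning actions, executed downstream by the interpreter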
Anthology ID: 2020.coling-main.434
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 4949–4955
URL: https://aclanthology.org/2020.coling-main.434
DOI: 10.18653/v1/2020.coling-main.434
Cite (ACL): Xiaoyu Guo, Yuan-Fang Li, and Gholamreza Haffari. 2020. Understanding Unnatural Questions Improves Reasoning over Text. In Proceedings of the 28th International Conference on Computational Linguistics, pages 4949–4955, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Understanding Unnatural Questions Improves Reasoning over Text (Guo et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.434.pdf
Data: DROP