Dual Inference for Improving Language Understanding and Generation

Shang-Yu Su, Yung-Sung Chuang, Yun-Nung Chen


Abstract
Natural language understanding (NLU) and natural language generation (NLG) hold a strong dual relationship: NLU predicts semantic labels from natural language utterances, while NLG does the opposite. Prior work mainly exploited this duality during model training to obtain better-performing models. However, given the fast-growing scale of models in NLP, retraining whole NLU and NLG models is often impractical. To address this issue, this paper proposes leveraging the duality in the inference stage, without any retraining. Experiments on three benchmark datasets demonstrate the effectiveness of the proposed method for both NLU and NLG, showing its great potential for practical use.
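The abstract describes combining the two directions only at inference time. A minimal sketch of that general idea is shown below: candidates from the primal model (e.g., NLG) are reranked by mixing their forward score with the dual model's (e.g., NLU) backward score. All names and signatures here are illustrative assumptions, not the authors' actual API; see the MiuLab/DuaLUG repository for the official implementation.

```python
# Hypothetical sketch of dual inference as reranking: no retraining is involved,
# only the scores of two pre-trained models are combined at decoding time.

def dual_inference(x, primal, dual, k=10, lam=0.5):
    """Return the candidate y maximizing a weighted combination of
    the primal score log P(y|x) and the dual score log P(x|y).

    `primal` and `dual` are assumed to expose `top_k_candidates` and
    `log_prob`; these names are placeholders for this sketch.
    """
    candidates = primal.top_k_candidates(x, k)      # k hypotheses from the primal model

    def combined_score(y):
        forward = primal.log_prob(y, given=x)       # log P(y|x)
        backward = dual.log_prob(x, given=y)        # log P(x|y)
        return (1 - lam) * forward + lam * backward

    return max(candidates, key=combined_score)

# For NLG, `primal` maps semantic frames to utterances and `dual` is the NLU
# model scoring how well the frame is recoverable; for NLU the roles are swapped.
```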
Anthology ID:
2020.findings-emnlp.443
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4930–4936
URL:
https://aclanthology.org/2020.findings-emnlp.443
DOI:
10.18653/v1/2020.findings-emnlp.443
Cite (ACL):
Shang-Yu Su, Yung-Sung Chuang, and Yun-Nung Chen. 2020. Dual Inference for Improving Language Understanding and Generation. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4930–4936, Online. Association for Computational Linguistics.
Cite (Informal):
Dual Inference for Improving Language Understanding and Generation (Su et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.443.pdf
Code
 MiuLab/DuaLUG
Data
ATIS
SNIPS