Dual Supervised Learning for Natural Language Understanding and Generation

Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen


Abstract
Natural language understanding (NLU) and natural language generation (NLG) are both critical research topics in the NLP and dialogue fields. Natural language understanding extracts the core semantic meaning from given utterances, while natural language generation does the opposite: its goal is to construct corresponding sentences based on given semantics. However, this dual relationship has not been investigated in the literature. This paper proposes a novel learning framework for natural language understanding and generation on top of dual supervised learning, providing a way to exploit the duality. Preliminary experiments show that the proposed approach boosts performance on both tasks, demonstrating the effectiveness of leveraging the dual relationship.
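In dual supervised learning, the duality between the two directions is typically exploited through a probabilistic constraint: for a paired sample, the joint probability factorizes both ways, so log P(x) + log P(y|x) should equal log P(y) + log P(x|y). The sketch below illustrates this idea as a regularizer added to the two supervised losses; all function and parameter names are illustrative assumptions, not the authors' actual implementation (see the MiuLab/DuaLUG repository for that).

```python
def duality_loss(log_p_x, log_p_y_given_x, log_p_y, log_p_x_given_y):
    """Squared gap between the two factorizations of log P(x, y).

    Ideally log P(x) + log P(y|x) == log P(y) + log P(x|y); the dual
    regularizer penalizes any deviation from this identity, where the
    marginals log P(x) and log P(y) come from pretrained language /
    semantic models and the conditionals from the NLU and NLG models.
    """
    gap = (log_p_x + log_p_y_given_x) - (log_p_y + log_p_x_given_y)
    return gap ** 2


def total_loss(nlu_nll, nlg_nll, log_p_x, log_p_y_given_x,
               log_p_y, log_p_x_given_y, lam=0.01):
    # Joint objective: negative log-likelihood for both directions plus
    # the duality constraint, weighted by a hyperparameter lam
    # (the value 0.01 is an illustrative assumption).
    reg = duality_loss(log_p_x, log_p_y_given_x, log_p_y, log_p_x_given_y)
    return nlu_nll + nlg_nll + lam * reg
```

When the two factorizations agree, the regularizer vanishes and the objective reduces to the ordinary supervised losses; otherwise the gap pushes the NLU and NLG models toward mutually consistent probability estimates.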
Anthology ID:
P19-1545
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5472–5477
URL:
https://aclanthology.org/P19-1545
DOI:
10.18653/v1/P19-1545
Cite (ACL):
Shang-Yu Su, Chao-Wei Huang, and Yun-Nung Chen. 2019. Dual Supervised Learning for Natural Language Understanding and Generation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5472–5477, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Dual Supervised Learning for Natural Language Understanding and Generation (Su et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1545.pdf
Code
MiuLab/DuaLUG + additional community code