Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach

Wenyu Du, Zhouhan Lin, Yikang Shen, Timothy J. O’Donnell, Yoshua Bengio, Yue Zhang


Abstract
It is commonly believed that knowledge of syntactic structure should improve language modeling. However, incorporating syntactic structure into neural language models both effectively and computationally efficiently remains an open challenge. In this paper, we employ a multi-task objective: the model simultaneously predicts words and ground-truth parse trees encoded as "syntactic distances", with the two objectives sharing the same intermediate representation. Experimental results on the Penn Treebank and Chinese Treebank datasets show that when ground-truth parse trees are provided as additional training signals, the model achieves lower perplexity and induces trees of better quality.
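To make the multi-task setup described above concrete, the following PyTorch sketch pairs a shared encoder with two heads: one predicting the next word, the other regressing one scalar syntactic distance per position. This is a minimal illustration, not the authors' architecture (their implementation is at wenyudu/SDLM): the class and function names, layer sizes, the MSE distance term, and the weighting alpha are all illustrative assumptions.

import torch
import torch.nn as nn

class JointSyntaxLM(nn.Module):
    """Hypothetical sketch of a multi-task LM: a shared LSTM encoder
    feeds a word-prediction head and a syntactic-distance head."""

    def __init__(self, vocab_size, emb_dim=400, hid_dim=400):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Head 1: language modeling (next-word logits).
        self.word_head = nn.Linear(hid_dim, vocab_size)
        # Head 2: one scalar "syntactic distance" per position,
        # supervised by distances derived from the gold parse tree.
        self.dist_head = nn.Linear(hid_dim, 1)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))    # (B, T, hid_dim)
        word_logits = self.word_head(h)            # (B, T, vocab)
        distances = self.dist_head(h).squeeze(-1)  # (B, T)
        return word_logits, distances

def joint_loss(word_logits, distances, next_words, gold_distances, alpha=1.0):
    # Cross-entropy for word prediction, plus a regression term that
    # pushes predicted distances toward the tree-derived targets.
    # (The paper's actual distance objective may differ; MSE and the
    # fixed weighting are assumptions made for this sketch.)
    lm_loss = nn.functional.cross_entropy(
        word_logits.reshape(-1, word_logits.size(-1)), next_words.reshape(-1))
    dist_loss = nn.functional.mse_loss(distances, gold_distances)
    return lm_loss + alpha * dist_loss

Because both heads read the same encoder states, gradients from the distance loss shape the representation used for word prediction, which is how the parse-tree signal can lower perplexity.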
Anthology ID:
2020.acl-main.591
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6611–6628
URL:
https://aclanthology.org/2020.acl-main.591
DOI:
10.18653/v1/2020.acl-main.591
Cite (ACL):
Wenyu Du, Zhouhan Lin, Yikang Shen, Timothy J. O’Donnell, Yoshua Bengio, and Yue Zhang. 2020. Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6611–6628, Online. Association for Computational Linguistics.
Cite (Informal):
Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach (Du et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.591.pdf
Video:
http://slideslive.com/38929039
Code:
wenyudu/SDLM
Data:
Penn Treebank