TAG Parsing with Neural Networks and Vector Representations of Supertags

Jungo Kasai, Bob Frank, Tom McCoy, Owen Rambow, Alexis Nasr


Abstract
We present supertagging-based models for Tree Adjoining Grammar parsing that use neural network architectures and dense vector representations of supertags (elementary trees) to achieve state-of-the-art performance in unlabeled and labeled attachment scores. The shift-reduce parsing model eschews lexical information entirely and uses only the 1-best supertags to parse a sentence, providing further support for the claim that supertagging is “almost parsing.” We demonstrate that the embedding vector representations the parser induces for supertags possess linguistically interpretable structure, supporting analogies between grammatical structures like those familiar from recent work in distributional semantics. This dense representation of supertags overcomes the drawbacks that statistical models of TAG have faced relative to CCG parsing, raising the possibility that TAG is a viable alternative for NLP tasks that require the assignment of richer structural descriptions to sentences.
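To make the analogy claim concrete, here is a minimal Python sketch (not the authors' code; the supertag names, vector values, and dimensionality are all hypothetical) of how such analogies can be probed with vector arithmetic and cosine similarity, in the style of word2vec analogy evaluation:

import numpy as np

# Hypothetical dense embeddings for four elementary trees (supertags), as a
# parser might induce them. Real embeddings would be higher-dimensional and
# learned from data; these toy values are constructed so the analogy holds.
embeddings = {
    "transitive":         np.array([0.9, 0.1, 0.0, 0.2]),
    "intransitive":       np.array([0.1, 0.9, 0.0, 0.2]),
    "transitive_relcl":   np.array([0.9, 0.1, 0.8, 0.2]),
    "intransitive_relcl": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine(u, v):
    # Cosine similarity between two vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def solve_analogy(a, b, c, emb):
    # Answer "a is to b as c is to ?" by finding the supertag (other than
    # a, b, c) whose embedding is closest to emb[b] - emb[a] + emb[c].
    target = emb[b] - emb[a] + emb[c]
    candidates = {t: v for t, v in emb.items() if t not in (a, b, c)}
    return max(candidates, key=lambda t: cosine(candidates[t], target))

# "transitive : transitive_relcl :: intransitive : ?"
print(solve_analogy("transitive", "transitive_relcl", "intransitive", embeddings))
# -> "intransitive_relcl": the relativization "offset" transfers across
#    transitivity classes, the kind of structure the abstract describes.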
Anthology ID: D17-1180
Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Editors: Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 1712–1722
URL: https://aclanthology.org/D17-1180
DOI: 10.18653/v1/D17-1180
Cite (ACL): Jungo Kasai, Bob Frank, Tom McCoy, Owen Rambow, and Alexis Nasr. 2017. TAG Parsing with Neural Networks and Vector Representations of Supertags. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1712–1722, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal): TAG Parsing with Neural Networks and Vector Representations of Supertags (Kasai et al., EMNLP 2017)
PDF: https://aclanthology.org/D17-1180.pdf