Constructing Taxonomies from Pretrained Language Models

Catherine Chen, Kevin Lin, Dan Klein

Abstract
We present a method for constructing taxonomic trees (e.g., WordNet) using pretrained language models. Our approach is composed of two modules, one that predicts parenthood relations and another that reconciles those pairwise predictions into trees. The parenthood prediction module produces likelihood scores for each potential parent-child pair, creating a graph of parent-child relation scores. The tree reconciliation module treats the task as a graph optimization problem and outputs the maximum spanning tree of this graph. We train our model on subtrees sampled from WordNet, and test on nonoverlapping WordNet subtrees. We show that incorporating web-retrieved glosses can further improve performance. On the task of constructing subtrees of English WordNet, the model achieves 66.7 ancestor F1, a 20.0% relative increase over the previous best published result on this task. In addition, we convert the original English dataset into nine other languages using Open Multilingual WordNet and extend our results across these languages.
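The reconciliation step described above can be illustrated with a small sketch. The paper treats tree construction as finding the maximum spanning tree over a graph of parenthood scores; the toy function below does this by brute force (enumerating every parent assignment and keeping the highest-scoring valid tree), which is only feasible for a handful of nodes. The node names, score values, and the function itself are illustrative assumptions, not the authors' implementation, which uses an efficient spanning-tree algorithm over model-predicted scores.

```python
from itertools import product

def best_taxonomy_tree(nodes, score, root):
    """Brute-force maximum spanning arborescence (illustrative only).

    Each non-root node picks one parent; we keep the assignment with
    the highest total parenthood score that forms a tree rooted at
    `root` (no self-loops, no cycles, every node reaches the root).
    """
    children = [n for n in nodes if n != root]
    best, best_score = None, float("-inf")
    # Enumerate every possible parent choice for every non-root node.
    for parents in product(nodes, repeat=len(children)):
        assign = dict(zip(children, parents))
        # Validity check: walking up from any node must reach the root
        # without revisiting a node (which would indicate a cycle).
        ok = True
        for c in children:
            seen, cur = set(), c
            while cur != root:
                if cur in seen or assign[cur] == cur:
                    ok = False
                    break
                seen.add(cur)
                cur = assign[cur]
            if not ok:
                break
        if not ok:
            continue
        total = sum(score[(p, c)] for c, p in assign.items())
        if total > best_score:
            best, best_score = assign, total
    return best, best_score

# Hypothetical parenthood scores, e.g. from a language model's
# likelihood for "X is a kind of Y" style prompts.
nodes = ["animal", "dog", "cat", "poodle"]
pairs = {(p, c): 0.1 for p in nodes for c in nodes if p != c}
pairs.update({("animal", "dog"): 0.9,
              ("animal", "cat"): 0.8,
              ("dog", "poodle"): 0.95})

tree, total = best_taxonomy_tree(nodes, pairs, "animal")
# Expected: dog and cat attach under animal, poodle under dog.
```

In practice the exact maximum spanning tree of a directed score graph is found with the Chu-Liu/Edmonds algorithm rather than enumeration; the brute-force version here just makes the objective explicit.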
Anthology ID:
2021.naacl-main.373
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4687–4700
URL:
https://aclanthology.org/2021.naacl-main.373
DOI:
10.18653/v1/2021.naacl-main.373
Bibkey:
Cite (ACL):
Catherine Chen, Kevin Lin, and Dan Klein. 2021. Constructing Taxonomies from Pretrained Language Models. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4687–4700, Online. Association for Computational Linguistics.
Cite (Informal):
Constructing Taxonomies from Pretrained Language Models (Chen et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.373.pdf
Video:
https://aclanthology.org/2021.naacl-main.373.mp4