Adaptable and Interpretable Neural Memory Over Symbolic Knowledge

Pat Verga, Haitian Sun, Livio Baldini Soares, William Cohen


Abstract
Past research has demonstrated that large neural language models (LMs) encode surprising amounts of factual information; however, augmenting or modifying this information requires modifying a corpus and retraining, which is computationally expensive. To address this problem, we develop a neural LM that includes an interpretable neuro-symbolic knowledge base (KB) in the form of a “fact memory”. Each element of the fact memory is formed from a triple of vectors, where each vector corresponds to a KB entity or relation. Our LM improves performance on knowledge-intensive question-answering tasks, sometimes dramatically, including a 27-point increase in one setting of WebQuestionsSP over a state-of-the-art open-book model, despite using 5% of the parameters. Most interestingly, we demonstrate that the model can be modified, without any re-training, by updating the fact memory.
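The fact-memory design the abstract describes (each memory element is a triple of entity and relation vectors, editable without retraining) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the names, dimensions, concatenation-based keying, and dot-product attention are one plausible reading of the design, not the authors' implementation.

```python
import numpy as np

# Minimal sketch of a "fact memory" over (subject, relation, object) triples.
# All names and dimensions are hypothetical, chosen for illustration only.

rng = np.random.default_rng(0)
d = 64  # embedding dimension (assumed)

entities = ["Paris", "France", "Berlin", "Germany"]
relations = ["capital_of"]
ent_emb = {e: rng.standard_normal(d) for e in entities}
rel_emb = {r: rng.standard_normal(d) for r in relations}

# Each memory element is a triple of vectors: (subject, relation, object).
triples = [("Paris", "capital_of", "France"),
           ("Berlin", "capital_of", "Germany")]

# Assumed keying scheme: key = [subject ; relation], value = object vector.
keys = np.stack([np.concatenate([ent_emb[s], rel_emb[r]]) for s, r, _ in triples])
values = np.stack([ent_emb[o] for _, _, o in triples])

def query_memory(query: np.ndarray) -> np.ndarray:
    """Soft attention over fact keys; returns a weighted sum of object vectors."""
    scores = keys @ query                    # dot-product similarity per fact
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values

# A query built from "Paris capital_of ?" should retrieve France's vector.
q = np.concatenate([ent_emb["Paris"], rel_emb["capital_of"]])
retrieved = query_memory(q)
best = max(entities, key=lambda e: ent_emb[e] @ retrieved)
print(best)  # expected: "France"
```

Because the facts live in an external table rather than in the model weights, adding or correcting a triple only requires rebuilding `keys` and `values`; this is the adaptability property the abstract highlights, i.e. modifying the model without any re-training.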
Anthology ID:
2021.naacl-main.288
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
3678–3691
URL:
https://aclanthology.org/2021.naacl-main.288
DOI:
10.18653/v1/2021.naacl-main.288
Cite (ACL):
Pat Verga, Haitian Sun, Livio Baldini Soares, and William Cohen. 2021. Adaptable and Interpretable Neural Memory Over Symbolic Knowledge. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 3678–3691, Online. Association for Computational Linguistics.
Cite (Informal):
Adaptable and Interpretable Neural Memory Over Symbolic Knowledge (Verga et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.288.pdf
Video:
https://aclanthology.org/2021.naacl-main.288.mp4
Data:
LAMA, WebQuestions