Fine-grained Fact Verification with Kernel Graph Attention Network

Zhenghao Liu, Chenyan Xiong, Maosong Sun, Zhiyuan Liu


Abstract
Fact Verification requires fine-grained natural language inference capability that finds subtle clues to identify syntactically and semantically correct but not well-supported claims. This paper presents Kernel Graph Attention Network (KGAT), which conducts more fine-grained fact verification with kernel-based attentions. Given a claim and a set of potential evidence sentences that form an evidence graph, KGAT introduces node kernels, which better measure the importance of each evidence node, and edge kernels, which conduct fine-grained evidence propagation in the graph, into Graph Attention Networks for more accurate fact verification. KGAT achieves a 70.38% FEVER score and significantly outperforms existing fact verification models on FEVER, a large-scale benchmark for fact verification. Our analyses illustrate that, compared to dot-product attentions, the kernel-based attention concentrates more on relevant evidence sentences and meaningful clues in the evidence graph, which is the main source of KGAT's effectiveness. All source code of this work is available at https://github.com/thunlp/KernelGAT.
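
For illustration only, below is a minimal sketch of the kernel-based attention idea the abstract describes: token-level claim-evidence similarities are pooled with a bank of Gaussian kernels (in the spirit of K-NRM-style kernel pooling) and mapped to node-level attention weights over evidence sentences. The kernel means, embedding sizes, and helper names here are assumptions made for the example; the authors' actual implementation is in the linked repository.

import torch

def gaussian_kernels(sim, mus, sigma=0.1):
    # Apply a bank of Gaussian (RBF) kernels to a token-to-token
    # similarity matrix (K-NRM-style kernel pooling).
    # sim: [claim_len, evidence_len] cosine similarities
    # mus: kernel means spread over [-1, 1]
    # Returns a kernel feature vector of size len(mus).
    feats = []
    for mu in mus:
        # Kernel response for every similarity entry
        k = torch.exp(-((sim - mu) ** 2) / (2 * sigma ** 2))
        # Sum over evidence tokens, log, then average over claim tokens
        feats.append(torch.log(k.sum(dim=-1) + 1e-10).mean())
    return torch.stack(feats)

# Toy usage: two evidence sentences scored against one claim
# (random vectors stand in for contextual token embeddings).
torch.manual_seed(0)
mus = [-0.9, -0.6, -0.3, 0.0, 0.3, 0.6, 0.9, 1.0]
claim = torch.nn.functional.normalize(torch.randn(5, 16), dim=-1)
evidences = [torch.nn.functional.normalize(torch.randn(n, 16), dim=-1) for n in (7, 9)]

# Kernel features per evidence node; a linear layer plus softmax turns
# them into node-level attention weights for combining evidence.
scores = torch.stack([gaussian_kernels(claim @ e.T, mus) for e in evidences])
weight = torch.nn.Linear(len(mus), 1)
attention = torch.softmax(weight(scores).squeeze(-1), dim=0)
print(attention)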
Anthology ID:
2020.acl-main.655
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7342–7351
URL:
https://aclanthology.org/2020.acl-main.655
DOI:
10.18653/v1/2020.acl-main.655
Cite (ACL):
Zhenghao Liu, Chenyan Xiong, Maosong Sun, and Zhiyuan Liu. 2020. Fine-grained Fact Verification with Kernel Graph Attention Network. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7342–7351, Online. Association for Computational Linguistics.
Cite (Informal):
Fine-grained Fact Verification with Kernel Graph Attention Network (Liu et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.655.pdf
Software:
 2020.acl-main.655.Software.txt
Video:
 http://slideslive.com/38928682
Code:
 thunlp/KernelGAT
Data:
 FEVER