Neural-DINF: A Neural Network based Framework for Measuring Document Influence

Jie Tan, Changlin Yang, Ying Li, Siliang Tang, Chen Huang, Yueting Zhuang


Abstract
Measuring the scholarly impact of a document without citations is an important and challenging problem. Existing approaches such as the Document Influence Model (DIM) are based on dynamic topic models, which consider only changes in word frequency. In this paper, we measure document influence using both word frequency changes and word semantic shifts by developing a neural network framework. Our model has three steps. First, we train word embeddings for each time period. Next, we propose an unsupervised method to align the vectors across time periods. Finally, we compute an influence value for each document. Our experimental results show that our model outperforms DIM.
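
The three steps above can be illustrated with a minimal Python sketch. It assumes gensim's Word2Vec for the per-period embeddings and uses orthogonal Procrustes as one possible unsupervised alignment; the document score at the end is a hypothetical frequency-change-times-semantic-shift heuristic for illustration, not the paper's actual Neural-DINF scoring.

# Illustrative sketch of the three-step pipeline described in the abstract.
# NOTE: the alignment (orthogonal Procrustes) and the influence heuristic
# are stand-ins chosen for clarity, not the exact Neural-DINF formulation.
import numpy as np
from collections import Counter
from gensim.models import Word2Vec

# Toy corpora for two time periods (lists of tokenized "documents").
period_1 = [["topic", "model", "word", "frequency"],
            ["neural", "network", "model", "training"]]
period_2 = [["neural", "network", "embedding", "alignment"],
            ["word", "embedding", "semantic", "shift", "model"]]

# Step 1: train separate word embeddings for each time period.
emb_1 = Word2Vec(period_1, vector_size=50, min_count=1, epochs=50, seed=0).wv
emb_2 = Word2Vec(period_2, vector_size=50, min_count=1, epochs=50, seed=0).wv

# Step 2: align the period-2 vectors to the period-1 space.
# Orthogonal Procrustes over the shared vocabulary is one common unsupervised choice.
shared = sorted(set(emb_1.key_to_index) & set(emb_2.key_to_index))
A = np.stack([emb_2[w] for w in shared])   # source space (period 2)
B = np.stack([emb_1[w] for w in shared])   # target space (period 1)
U, _, Vt = np.linalg.svd(A.T @ B)
R = U @ Vt                                 # rotation mapping space 2 -> space 1

def semantic_shift(word):
    """Cosine distance between a word's aligned period-2 vector and its period-1 vector."""
    v1, v2 = emb_1[word], emb_2[word] @ R
    return 1.0 - float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))

# Step 3: score documents. Here "influence" is a simple proxy that combines
# each word's frequency change with its semantic shift (hypothetical heuristic).
freq_1 = Counter(w for doc in period_1 for w in doc)
freq_2 = Counter(w for doc in period_2 for w in doc)

def influence(doc):
    score = 0.0
    for w in set(doc) & set(shared):
        freq_change = abs(freq_2[w] - freq_1[w]) / max(freq_1[w], 1)
        score += freq_change * semantic_shift(w)
    return score

for i, doc in enumerate(period_1):
    print(f"doc {i} influence (toy score): {influence(doc):.4f}")

The orthogonal constraint preserves distances within the period-2 space while rotating it onto the period-1 space, which is why Procrustes-style alignment is a common baseline for comparing diachronic embeddings.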
Anthology ID:
2020.acl-main.534
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6004–6009
URL:
https://aclanthology.org/2020.acl-main.534
DOI:
10.18653/v1/2020.acl-main.534
Cite (ACL):
Jie Tan, Changlin Yang, Ying Li, Siliang Tang, Chen Huang, and Yueting Zhuang. 2020. Neural-DINF: A Neural Network based Framework for Measuring Document Influence. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6004–6009, Online. Association for Computational Linguistics.
Cite (Informal):
Neural-DINF: A Neural Network based Framework for Measuring Document Influence (Tan et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.534.pdf
Video:
http://slideslive.com/38929332