A Deep Neural Information Fusion Architecture for Textual Network Embeddings

Zenan Xu, Qinliang Su, Xiaojun Quan, Weijia Zhang


Abstract
Textual network embeddings aim to learn a low-dimensional representation for every node in a network such that both the structural and the textual information of the network are well preserved in the representations. Traditionally, the structural and textual embeddings have been learned by models that rarely take the mutual influence between them into account. In this paper, a deep neural architecture is proposed to effectively fuse the two kinds of information into one representation. The novelty of the proposed architecture lies in a newly defined objective function, a complementary fusion method for structural and textual features, and a mutual gate mechanism for textual feature extraction. Experimental results show that the proposed model outperforms the competing methods on all three datasets.
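
The mutual-gating and fusion components described in the abstract can be sketched generically in PyTorch. The class name, layer choices, and dimensions below are illustrative assumptions, not the exact architecture, objective function, or hyperparameters reported in the paper: each modality's embedding produces a sigmoid gate over the other modality's features before the gated features are concatenated and projected into a single node representation.

```python
import torch
import torch.nn as nn

class MutualGateFusion(nn.Module):
    """Illustrative sketch of mutual gating plus fusion (not the paper's exact model)."""

    def __init__(self, struct_dim: int, text_dim: int, out_dim: int):
        super().__init__()
        # The structural embedding decides which textual features to keep, and vice versa.
        self.gate_text = nn.Linear(struct_dim, text_dim)
        self.gate_struct = nn.Linear(text_dim, struct_dim)
        # Fusion layer maps the concatenated gated features to the final node embedding.
        self.fuse = nn.Linear(struct_dim + text_dim, out_dim)

    def forward(self, h_struct: torch.Tensor, h_text: torch.Tensor) -> torch.Tensor:
        gated_text = h_text * torch.sigmoid(self.gate_text(h_struct))
        gated_struct = h_struct * torch.sigmoid(self.gate_struct(h_text))
        return torch.tanh(self.fuse(torch.cat([gated_struct, gated_text], dim=-1)))

# Usage: fuse a 128-d structural embedding and a 300-d textual embedding into a 200-d representation.
fusion = MutualGateFusion(struct_dim=128, text_dim=300, out_dim=200)
node_embedding = fusion(torch.randn(4, 128), torch.randn(4, 300))  # shape: (4, 200)
```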
Anthology ID: D19-1476
Volume: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues: EMNLP | IJCNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 4698–4706
URL: https://aclanthology.org/D19-1476
DOI: 10.18653/v1/D19-1476
Cite (ACL): Zenan Xu, Qinliang Su, Xiaojun Quan, and Weijia Zhang. 2019. A Deep Neural Information Fusion Architecture for Textual Network Embeddings. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 4698–4706, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): A Deep Neural Information Fusion Architecture for Textual Network Embeddings (Xu et al., EMNLP-IJCNLP 2019)
PDF: https://aclanthology.org/D19-1476.pdf
Data: Cora