Hierarchical Bi-Directional Self-Attention Networks for Paper Review Rating Recommendation

Zhongfen Deng, Hao Peng, Congying Xia, Jianxin Li, Lifang He, Philip Yu


Abstract
Review rating prediction from review text is a rapidly growing task in natural language processing with a wide range of applications. However, most existing methods either use hand-crafted features or learn features with deep models that take plain text as input, ignoring the hierarchies within the data. In this paper, we propose a Hierarchical bi-directional self-attention Network framework (HabNet) for paper review rating prediction and recommendation, which can serve as an effective decision-making tool for the academic paper review process. Specifically, we leverage the hierarchical structure of paper reviews with three levels of encoders: a sentence encoder (level one), an intra-review encoder (level two), and an inter-review encoder (level three). Each encoder first derives contextual representations at its level and then generates a higher-level representation. After training, the model identifies useful predictors for the final acceptance decision and helps uncover inconsistencies between numerical review ratings and the sentiment conveyed in the review text. Furthermore, we introduce two new metrics for evaluating models under data imbalance. Extensive experiments on a publicly available dataset (PeerRead) and our own collected dataset (OpenReview) demonstrate the superiority of the proposed approach over state-of-the-art methods.
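The three-level hierarchy described above can be sketched as nested attention pooling: word vectors are pooled into sentence vectors, sentence vectors into a review vector, and review vectors into a paper-level vector fed to the acceptance classifier. The sketch below is a deliberate simplification under stated assumptions: the paper's bi-directional self-attention encoders are replaced by plain attention pooling with random stand-in query vectors, and all names (`attention_pool`, `w_sent`, etc.) are hypothetical, not taken from the HabNet code.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (toy value)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Collapse an (n, d) matrix of vectors into one (d,) vector,
    weighting each row by attention scores from query vector w."""
    a = softmax(H @ w)   # (n,) attention weights, summing to 1
    return a @ H         # (d,) weighted sum

# Stand-in "learned" query vectors, one per level of the hierarchy.
w_sent, w_review, w_paper = (rng.normal(size=d) for _ in range(3))

# A toy paper with 2 reviews; each review is a list of sentences,
# each sentence an array of word vectors (pretrained embeddings in practice).
paper = [
    [rng.normal(size=(4, d)) for _ in range(3)],  # review 1: 3 sentences
    [rng.normal(size=(6, d)) for _ in range(2)],  # review 2: 2 sentences
]

# Level 1 (sentence encoder): word vectors -> sentence vector.
# Level 2 (intra-review encoder): sentence vectors -> review vector.
review_vecs = np.stack([
    attention_pool(np.stack([attention_pool(S, w_sent) for S in sents]),
                   w_review)
    for sents in paper
])

# Level 3 (inter-review encoder): review vectors -> paper vector,
# which a classifier would map to the accept/reject decision.
paper_vec = attention_pool(review_vecs, w_paper)
assert paper_vec.shape == (d,)
```

The attention weights at each level double as interpretable scores, which is how a model of this shape can surface which sentences or reviews drove the final recommendation.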
Anthology ID:
2020.coling-main.555
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6302–6314
URL:
https://aclanthology.org/2020.coling-main.555
DOI:
10.18653/v1/2020.coling-main.555
Cite (ACL):
Zhongfen Deng, Hao Peng, Congying Xia, Jianxin Li, Lifang He, and Philip Yu. 2020. Hierarchical Bi-Directional Self-Attention Networks for Paper Review Rating Recommendation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6302–6314, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Hierarchical Bi-Directional Self-Attention Networks for Paper Review Rating Recommendation (Deng et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.555.pdf
Code
 RingBDStack/HabNet
Data
PeerRead