Red Dragon AI at TextGraphs 2020 Shared Task: LIT: LSTM-Interleaved Transformer for Multi-Hop Explanation Ranking

Yew Ken Chia, Sam Witteveen, Martin Andrews


Abstract
Explainable question answering for science questions is a challenging task that requires multi-hop inference over a large set of fact sentences. To counter the limitations of methods that view each query-document pair in isolation, we propose the LSTM-Interleaved Transformer (LIT), which incorporates cross-document interactions for improved multi-hop ranking. The LIT architecture can leverage prior ranking positions in the re-ranking setting. Our model is competitive on the current leaderboard for the TextGraphs 2020 shared task, achieving a test-set MAP of 0.5607, and would have gained third place had we submitted before the competition deadline. Our code is available at https://github.com/mdda/worldtree_corpus/tree/textgraphs_2020.
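As a rough illustration of the idea in the abstract, the PyTorch sketch below interleaves a cross-candidate BiLSTM between per-candidate transformer layers, so that candidate facts (ordered by a prior first-stage ranking) can exchange information before being re-scored. This is not the authors' implementation: the class name LITRanker, the layer sizes, and the [CLS]-vector mixing scheme are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class LITRanker(nn.Module):
    """Hypothetical sketch of an LSTM-interleaved transformer re-ranker.

    Assumed design (not taken from the paper): each query-candidate pair
    is embedded independently; between transformer layers, a BiLSTM runs
    over the per-candidate [CLS] vectors so candidates interact, with the
    prior ranking encoded by the order of the candidate list; a linear
    head then scores each candidate.
    """

    def __init__(self, d_model: int = 256, n_heads: int = 4, n_blocks: int = 2):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.ModuleDict({
                # Within-candidate self-attention over the pair's tokens.
                "transformer": nn.TransformerEncoderLayer(
                    d_model, n_heads, batch_first=True
                ),
                # Cross-candidate BiLSTM over the ranked candidate list.
                "lstm": nn.LSTM(
                    d_model, d_model // 2, bidirectional=True, batch_first=True
                ),
            })
            for _ in range(n_blocks)
        )
        self.score = nn.Linear(d_model, 1)  # relevance score per candidate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_candidates, seq_len, d_model) token embeddings for each
        # query-candidate pair, ordered by the prior (first-stage) ranking.
        for block in self.blocks:
            x = block["transformer"](x)  # within-candidate interactions
            # Run the BiLSTM across candidates' [CLS] vectors:
            # (num_candidates, 1, d_model) -> (1, num_candidates, d_model).
            cls, _ = block["lstm"](x[:, :1, :].transpose(0, 1))
            # Write the LSTM-mixed [CLS] vectors back into each sequence.
            x = torch.cat([cls.transpose(0, 1), x[:, 1:, :]], dim=1)
        return self.score(x[:, 0, :]).squeeze(-1)  # (num_candidates,)


if __name__ == "__main__":
    model = LITRanker()
    facts = torch.randn(20, 32, 256)       # 20 candidate facts, 32 tokens each
    scores = model(facts)                   # one relevance score per candidate
    reranked = scores.argsort(descending=True)
```

Feeding the candidates in prior-ranking order is one plausible way to let the recurrent layers exploit ranking positions, as the abstract describes for the re-ranking setting.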
Anthology ID:
2020.textgraphs-1.14
Volume:
Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs)
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Dmitry Ustalov, Swapna Somasundaran, Alexander Panchenko, Fragkiskos D. Malliaros, Ioana Hulpuș, Peter Jansen, Abhik Jana
Venue:
TextGraphs
Publisher:
Association for Computational Linguistics
Pages:
115–120
URL:
https://aclanthology.org/2020.textgraphs-1.14
DOI:
10.18653/v1/2020.textgraphs-1.14
Cite (ACL):
Yew Ken Chia, Sam Witteveen, and Martin Andrews. 2020. Red Dragon AI at TextGraphs 2020 Shared Task: LIT: LSTM-Interleaved Transformer for Multi-Hop Explanation Ranking. In Proceedings of the Graph-based Methods for Natural Language Processing (TextGraphs), pages 115–120, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal):
Red Dragon AI at TextGraphs 2020 Shared Task: LIT: LSTM-Interleaved Transformer for Multi-Hop Explanation Ranking (Chia et al., TextGraphs 2020)
PDF:
https://aclanthology.org/2020.textgraphs-1.14.pdf
Code:
mdda/worldtree_corpus
Data:
Worldtree