m_y at SemEval-2019 Task 9: Exploring BERT for Suggestion Mining

Masahiro Yamamoto, Toshiyuki Sekiya


Abstract
This paper presents our system for SemEval-2019 Task 9, Suggestion Mining from Online Reviews and Forums. The goal of this task is to extract suggestions such as expressions of tips, advice, and recommendations. In Subtask A, which provides training and test datasets from the same domain, we explore Bidirectional Encoder Representations from Transformers (BERT) with a focus on target-domain pre-training. In Subtask B, the cross-domain suggestion mining task, we apply the idea of distant supervision. Our system placed third in Subtask A and fifth in Subtask B, demonstrating the efficacy of our approaches.
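As a rough illustration of the task setup described above, the sketch below shows how BERT can be fine-tuned as a binary suggestion classifier using the Hugging Face Transformers library. This is not the authors' code; the model name, hyperparameters, and example sentences are assumptions for illustration only, and the target-domain pre-training and distant-supervision steps from the paper are not shown.

```python
# Minimal sketch (not the authors' implementation) of fine-tuning BERT for
# binary suggestion classification, in the style of SemEval-2019 Task 9.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # 0 = non-suggestion, 1 = suggestion
)

# Hypothetical examples in the style of Subtask A (software developer forums).
texts = [
    "Please add an option to export the report as PDF.",  # suggestion
    "I installed the update and everything works fine.",  # non-suggestion
]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One illustrative training step; real fine-tuning would loop over batches.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()

# Inference: argmax over the two logits gives the predicted class.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())
```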
Anthology ID:
S19-2152
Volume:
Proceedings of the 13th International Workshop on Semantic Evaluation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota, USA
Editors:
Jonathan May, Ekaterina Shutova, Aurelie Herbelot, Xiaodan Zhu, Marianna Apidianaki, Saif M. Mohammad
Venue:
SemEval
SIG:
SIGLEX
Publisher:
Association for Computational Linguistics
Pages:
888–892
URL:
https://aclanthology.org/S19-2152
DOI:
10.18653/v1/S19-2152
Cite (ACL):
Masahiro Yamamoto and Toshiyuki Sekiya. 2019. m_y at SemEval-2019 Task 9: Exploring BERT for Suggestion Mining. In Proceedings of the 13th International Workshop on Semantic Evaluation, pages 888–892, Minneapolis, Minnesota, USA. Association for Computational Linguistics.
Cite (Informal):
m_y at SemEval-2019 Task 9: Exploring BERT for Suggestion Mining (Yamamoto & Sekiya, SemEval 2019)
PDF:
https://aclanthology.org/S19-2152.pdf