OCTIS: Comparing and Optimizing Topic models is Simple!

Silvia Terragni, Elisabetta Fersini, Bruno Giovanni Galuzzi, Pietro Tropeano, Antonio Candelieri


Abstract
In this paper, we present OCTIS, a framework for training, analyzing, and comparing Topic Models, whose optimal hyper-parameters are estimated using a Bayesian Optimization approach. The proposed solution integrates several state-of-the-art topic models and evaluation metrics. These metrics can be targeted as the objective by the underlying optimization procedure to determine the best hyper-parameter configuration. OCTIS allows researchers and practitioners to perform a fair comparison between topic models of interest on several benchmark datasets with well-known evaluation metrics, to integrate novel algorithms, and to interactively visualize the results in order to understand the behavior of each model. The code is available at the following link: https://github.com/MIND-Lab/OCTIS.
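As a rough illustration of the workflow the abstract describes, the sketch below trains a topic model on one of the bundled benchmark datasets, scores it with a coherence metric, and hands that metric to the Bayesian Optimization loop as the objective. It follows the usage shown in the project README; the dataset identifier and keyword arguments (e.g., fetch_dataset("20NewsGroup"), number_of_call, model_runs) are assumptions that may differ across OCTIS versions.

```python
# Minimal OCTIS sketch: dataset -> model -> metric -> Bayesian Optimization.
# Class names and arguments follow the project README; they may vary by version.
from octis.dataset.dataset import Dataset
from octis.models.LDA import LDA
from octis.evaluation_metrics.coherence_metrics import Coherence
from octis.optimization.optimizer import Optimizer
from skopt.space.space import Real

# Fetch one of the preprocessed benchmark datasets shipped with the framework.
dataset = Dataset()
dataset.fetch_dataset("20NewsGroup")

# A topic model and the evaluation metric to be used as the BO objective
# (NPMI coherence computed over the dataset's corpus).
model = LDA(num_topics=20)
npmi = Coherence(texts=dataset.get_corpus(), measure="c_npmi")

# Hyper-parameter search space explored by Bayesian Optimization.
search_space = {
    "alpha": Real(low=0.001, high=5.0),
    "eta": Real(low=0.001, high=5.0),
}

# Each optimization call trains the model with a candidate configuration
# and evaluates it against the target metric.
optimizer = Optimizer()
result = optimizer.optimize(
    model, dataset, npmi, search_space,
    number_of_call=30,   # budget of BO iterations (assumed value)
    model_runs=5,        # re-trainings per configuration to smooth noise
    save_path="results/",
)
result.save_to_csv("results/summary.csv")
```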
Anthology ID:
2021.eacl-demos.31
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Month:
April
Year:
2021
Address:
Online
Editors:
Dimitra Gkatzia, Djamé Seddah
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
263–270
URL:
https://aclanthology.org/2021.eacl-demos.31
DOI:
10.18653/v1/2021.eacl-demos.31
Cite (ACL):
Silvia Terragni, Elisabetta Fersini, Bruno Giovanni Galuzzi, Pietro Tropeano, and Antonio Candelieri. 2021. OCTIS: Comparing and Optimizing Topic models is Simple!. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations, pages 263–270, Online. Association for Computational Linguistics.
Cite (Informal):
OCTIS: Comparing and Optimizing Topic models is Simple! (Terragni et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-demos.31.pdf
Code:
MIND-Lab/OCTIS