Seq2SeqPy: A Lightweight and Customizable Toolkit for Neural Sequence-to-Sequence Modeling

Raheel Qader, François Portet, Cyril Labbe


Abstract
We present Seq2SeqPy, a lightweight toolkit for sequence-to-sequence modeling that prioritizes simplicity and the ability to easily customize standard architectures. The toolkit supports several well-known architectures such as Recurrent Neural Networks, Pointer-Generator Networks, and the Transformer model. We evaluate the toolkit on two datasets and show that it performs similarly to, or even better than, a widely used sequence-to-sequence toolkit.
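For readers unfamiliar with the kind of model such toolkits assemble, the sketch below shows a minimal recurrent encoder-decoder in plain PyTorch. It is an illustration only: the class and argument names here are assumptions for exposition and do not reflect Seq2SeqPy's actual API.

    # Minimal recurrent encoder-decoder sketch (plain PyTorch, not Seq2SeqPy's API).
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

        def forward(self, src):
            # src: (batch, src_len) -> outputs, final hidden state
            return self.rnn(self.embed(src))

    class Decoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.out = nn.Linear(hid_dim, vocab_size)

        def forward(self, tgt, hidden):
            # tgt: (batch, tgt_len); hidden carries the encoder's final state
            output, hidden = self.rnn(self.embed(tgt), hidden)
            return self.out(output), hidden

    class Seq2Seq(nn.Module):
        def __init__(self, src_vocab, tgt_vocab):
            super().__init__()
            self.encoder = Encoder(src_vocab)
            self.decoder = Decoder(tgt_vocab)

        def forward(self, src, tgt):
            _, hidden = self.encoder(src)
            logits, _ = self.decoder(tgt, hidden)
            return logits

    # Toy usage with teacher forcing on a random batch.
    model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
    src = torch.randint(0, 1000, (2, 7))
    tgt = torch.randint(0, 1000, (2, 5))
    logits = model(src, tgt)                                  # (2, 5, 1000)
    loss = nn.CrossEntropyLoss()(logits.reshape(-1, 1000), tgt.reshape(-1))

Pointer-Generator and Transformer variants replace the decoder (and, for the latter, the recurrent encoder) while keeping the same encode-then-decode structure.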
Anthology ID:
2020.lrec-1.882
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
7140–7144
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.882
Cite (ACL):
Raheel Qader, François Portet, and Cyril Labbe. 2020. Seq2SeqPy: A Lightweight and Customizable Toolkit for Neural Sequence-to-Sequence Modeling. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 7140–7144, Marseille, France. European Language Resources Association.
Cite (Informal):
Seq2SeqPy: A Lightweight and Customizable Toolkit for Neural Sequence-to-Sequence Modeling (Qader et al., LREC 2020)
PDF:
https://aclanthology.org/2020.lrec-1.882.pdf