Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models

Daniel Watson, Nasser Zalmout, Nizar Habash

Abstract
Text normalization is an important enabling technology for several NLP tasks. Recently, neural-network-based approaches have outperformed well-established models on this task. However, there has been little exploration in this direction for languages other than English, where both the scarcity of annotated data and the complexity of the language increase the difficulty of the problem. To address these challenges, we use a sequence-to-sequence model with character-based attention that, in addition to its self-learned character embeddings, uses word embeddings pre-trained with an approach that also models subword information. This gives the neural model access to additional linguistic information that is especially suitable for text normalization, without requiring large parallel corpora. We show that providing the model with word-level features bridges the gap, allowing the neural network approach to achieve a state-of-the-art F1 score on a standard Arabic language correction shared task dataset.
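
The core idea of the input representation can be sketched briefly. The following is a minimal, illustrative PyTorch encoder, not the authors' released code: all names (CharWordEncoder, char_vocab_size, etc.) are hypothetical, and fastText stands in for "an approach that also models subword information". Each character position receives a learned character embedding concatenated with a fixed, pre-trained word embedding for the word containing that character; the bidirectional LSTM states over characters are what a sequence-to-sequence decoder with character-based attention would attend to.

import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    def __init__(self, char_vocab_size, char_dim=64, word_dim=300, hidden_dim=256):
        super().__init__()
        # Character embeddings are learned from scratch during training.
        self.char_emb = nn.Embedding(char_vocab_size, char_dim)
        # Bidirectional LSTM over the character sequence; its hidden states
        # serve as the attention memory for a seq2seq decoder.
        self.rnn = nn.LSTM(char_dim + word_dim, hidden_dim,
                           batch_first=True, bidirectional=True)

    def forward(self, char_ids, word_vecs):
        # char_ids:  (batch, seq_len)            character indices
        # word_vecs: (batch, seq_len, word_dim)  pre-trained (frozen) embedding
        #            of the word each character belongs to, e.g. fastText
        chars = self.char_emb(char_ids)                  # (batch, seq_len, char_dim)
        inputs = torch.cat([chars, word_vecs], dim=-1)   # fuse char + word features
        states, _ = self.rnn(inputs)                     # (batch, seq_len, 2*hidden_dim)
        return states

# Toy usage: 4 characters of one sentence, each paired with the 300-dim
# embedding of its enclosing word (random tensors here for illustration).
enc = CharWordEncoder(char_vocab_size=100)
ids = torch.randint(0, 100, (1, 4))
vecs = torch.randn(1, 4, 300)
print(enc(ids, vecs).shape)  # torch.Size([1, 4, 512])

Concatenating the word vector at every character position is one simple way to expose word-level features to a character-level model without changing the decoder; keeping the word embeddings frozen is what lets the approach work without large parallel corpora.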
Anthology ID:
D18-1097
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
837–843
URL:
https://aclanthology.org/D18-1097
DOI:
10.18653/v1/D18-1097
Cite (ACL):
Daniel Watson, Nasser Zalmout, and Nizar Habash. 2018. Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 837–843, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models (Watson et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1097.pdf