Contextual Augmentation: Data Augmentation by Words with Paradigmatic Relations

Sosuke Kobayashi


Abstract
We propose a novel data augmentation method for labeled sentences, called contextual augmentation. We assume an invariance: sentences remain natural even when their words are replaced with other words that hold paradigmatic relations to them. We stochastically replace words with other words that a bi-directional language model predicts at those positions. Words predicted from the context are numerous yet suitable substitutes for the original words. Furthermore, we retrofit the language model with a label-conditional architecture, which allows it to augment sentences without breaking label compatibility. Through experiments on six diverse text classification tasks, we demonstrate that the proposed method improves classifiers based on convolutional or recurrent neural networks.
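The augmentation procedure described above can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: `toy_predict` and its candidate table are hypothetical stand-ins for the paper's label-conditional bi-directional language model, which would instead score the whole vocabulary given both contexts and the label.

```python
import random

# Sketch of contextual augmentation: each word in a sentence is
# stochastically replaced by a word sampled from a context-conditioned
# predictor. `toy_predict` is a hypothetical stand-in for the paper's
# label-conditional bi-directional language model.

# Toy candidate table keyed by (previous word, next word) context.
CANDIDATES = {
    ("the", "was"): ["film", "picture"],     # slot holding "movie"
    ("was", ""): ["good", "great", "fine"],  # sentence-final slot
}

def toy_predict(left, right):
    """Return candidate replacement words for the position between the
    `left` and `right` context word lists (empty list if unknown)."""
    key = (left[-1] if left else "", right[0] if right else "")
    return CANDIDATES.get(key, [])

def contextual_augment(words, predict, p=0.5, rng=None):
    """Replace each word with probability `p` by a word sampled from
    `predict(left_context, right_context)`; keep the original word when
    the predictor offers no candidates."""
    rng = rng or random.Random()
    out = []
    for i, w in enumerate(words):
        if rng.random() < p:
            cands = predict(words[:i], words[i + 1:])
            out.append(rng.choice(cands) if cands else w)
        else:
            out.append(w)
    return out

print(contextual_augment("the movie was good".split(), toy_predict,
                         p=1.0, rng=random.Random(0)))
```

With `p=1.0` every position is offered to the predictor; positions with no candidates keep their original word. Restricting candidates to label-compatible words is what the paper's label-conditional architecture enforces in a learned way.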
Anthology ID:
N18-2072
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
452–457
Language:
URL:
https://aclanthology.org/N18-2072
DOI:
10.18653/v1/N18-2072
Bibkey:
Cite (ACL):
Sosuke Kobayashi. 2018. Contextual Augmentation: Data Augmentation by Words with Paradigmatic Relations. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 452–457, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Contextual Augmentation: Data Augmentation by Words with Paradigmatic Relations (Kobayashi, NAACL 2018)
PDF:
https://aclanthology.org/N18-2072.pdf
Code
pfnet-research/contextual_augmentation + additional community code
Data
MPQA Opinion Corpus, SST, SST-5