Delta Embedding Learning

Xiao Zhang, Ji Wu, Dejing Dou


Abstract
Unsupervised word embeddings have become a popular approach to word representation in NLP tasks. However, there are limits to the semantics that unsupervised embeddings can capture, and inadequate fine-tuning of embeddings can lead to suboptimal performance. We propose a novel learning technique called Delta Embedding Learning, which can be applied to general NLP tasks to improve performance through optimized tuning of the word embeddings. A structured regularization is applied to the embeddings to ensure they are tuned incrementally. As a result, the tuned word embeddings become better word representations, absorbing semantic information from supervision without “forgetting.” We apply the method to various NLP tasks and observe a consistent improvement in performance. Evaluation also confirms that the tuned word embeddings have better semantic properties.
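
The core mechanic, as the abstract describes it, is to keep the pretrained embeddings fixed and train only an additive delta that a structured penalty keeps small. Below is a minimal PyTorch sketch of that idea; the class name, the zero initialization of the delta, and the choice of an l_{2,1} (group-sparse, per-word) norm as the structured regularizer are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn

class DeltaEmbedding(nn.Module):
    # Word representation = frozen pretrained vector + trainable delta.
    # A structured penalty on the delta matrix keeps fine-tuning
    # incremental, so pretrained semantics are not "forgotten".
    # (Sketch only: the l_{2,1} penalty below is an assumption, not
    # necessarily the exact regularizer used in the paper.)

    def __init__(self, pretrained: torch.Tensor):
        super().__init__()
        # Pretrained embeddings stay fixed during training.
        self.base = nn.Embedding.from_pretrained(pretrained, freeze=True)
        # Trainable delta, initialized to zero so training starts
        # exactly from the unsupervised embeddings.
        self.delta = nn.Embedding(*pretrained.shape)
        nn.init.zeros_(self.delta.weight)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.base(token_ids) + self.delta(token_ids)

    def regularizer(self) -> torch.Tensor:
        # Group-sparse penalty: L2 norm of each word's delta row,
        # summed over the vocabulary (an l_{2,1} matrix norm). It
        # drives most deltas to zero and keeps the rest small.
        return self.delta.weight.norm(p=2, dim=1).sum()

In training, the penalty would simply be added to the task objective, e.g. loss = task_loss + lam * embedding.regularizer(), where the weight lam controls how far the tuned embeddings may drift from the pretrained ones.
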
Anthology ID:
P19-1322
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3329–3334
URL:
https://aclanthology.org/P19-1322
DOI:
10.18653/v1/P19-1322
Cite (ACL):
Xiao Zhang, Ji Wu, and Dejing Dou. 2019. Delta Embedding Learning. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3329–3334, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Delta Embedding Learning (Zhang et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1322.pdf
Data:
MultiNLI, SNLI, SQuAD