Publications


Self-guided contrastive learning for BERT sentence representations

Cited 52 times in Web of Science; cited 107 times in Scopus
Authors

Kim, Taeuk; Yoo, Kang Min; Lee, Sang-Goo

Issue Date
2021-08
Publisher
Association for Computational Linguistics (ACL)
Citation
ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference, pp.2528-2540
Abstract
© 2021 Association for Computational Linguistics. Although BERT and its variants have reshaped the NLP landscape, it still remains unclear how best to derive sentence embeddings from such pre-trained Transformers. In this work, we propose a contrastive learning method that utilizes self-guidance for improving the quality of BERT sentence representations. Our method fine-tunes BERT in a self-supervised fashion, does not rely on data augmentation, and enables the usual [CLS] token embeddings to function as sentence vectors. Moreover, we redesign the contrastive learning objective (NT-Xent) and apply it to sentence representation learning. We demonstrate with extensive experiments that our approach is more effective than competitive baselines on diverse sentence-related tasks. We also show it is efficient at inference and robust to domain shifts.
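
The abstract refers to the NT-Xent contrastive objective and to using BERT's [CLS] token embeddings as sentence vectors. The sketch below is only an illustration of that general setup, not the paper's self-guided method: it assumes PyTorch and Hugging Face transformers, uses a simplified in-batch variant of NT-Xent, and builds positive pairs by encoding each sentence twice with dropout active as a stand-in for the paper's internal-view self-guidance. The helper names (cls_embeddings, nt_xent_loss) are hypothetical.

# Illustrative sketch only: standard [CLS] sentence embeddings plus a simplified
# in-batch NT-Xent loss. This is NOT the paper's self-guided objective.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

def cls_embeddings(model, tokenizer, sentences, device="cpu"):
    """Encode sentences and return their [CLS] token embeddings."""
    batch = tokenizer(sentences, padding=True, truncation=True,
                      return_tensors="pt").to(device)
    outputs = model(**batch)
    return outputs.last_hidden_state[:, 0]  # [CLS] is the first token

def nt_xent_loss(z1, z2, temperature=0.05):
    """Simplified in-batch NT-Xent: z1[i] and z2[i] form a positive pair,
    all other pairings in the batch serve as negatives."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature  # scaled cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    sents = ["A man is playing a guitar.", "The weather is nice today."]
    # Keep dropout active so two forward passes yield slightly different views;
    # this stands in for the paper's self-guided positives, which come from the
    # model's own internal representations rather than augmented inputs.
    model.train()
    z1 = cls_embeddings(model, tokenizer, sents)
    z2 = cls_embeddings(model, tokenizer, sents)
    print(float(nt_xent_loss(z1, z2)))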
URI
https://hdl.handle.net/10371/183728
