
Negative Sampling for Contrastive Representation Learning: A Review

Source: arXiv
Abstract

The learn-to-compare paradigm of contrastive representation learning (CRL), which learns representations by contrasting positive samples against negative ones, has achieved great success in a wide range of domains, including natural language processing, computer vision, information retrieval and graph learning. While many research efforts focus on data augmentations, nonlinear transformations or other specific components of CRL, the importance of negative sample selection is usually overlooked in the literature. In this paper, we provide a systematic review of negative sampling (NS) techniques and discuss how they contribute to the success of CRL. As the core of this paper, we summarize existing NS methods into four categories, discussing the pros and cons of each, and conclude with several open research questions as future directions. By generalizing and aligning the fundamental NS ideas across multiple domains, we hope this survey can accelerate cross-domain knowledge sharing and motivate future research toward better CRL.
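To make the learn-to-compare idea concrete, the following is a minimal, self-contained sketch (not taken from the paper; function and variable names are illustrative) of an InfoNCE-style contrastive loss in which one anchor is scored against a positive sample and a set of sampled negatives. The toy usage at the end draws negatives uniformly from a batch, one of the simplest NS strategies.

import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style loss for one anchor embedding.

    anchor:    shape (d,)   query embedding
    positive:  shape (d,)   embedding of the positive sample
    negatives: shape (k, d) embeddings of k sampled negative samples
    """
    def cosine(a, b):
        # Cosine similarity as the comparison function
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

    logits = np.array(
        [cosine(anchor, positive)] + [cosine(anchor, n) for n in negatives]
    ) / temperature

    # Numerically stable log-softmax; index 0 is the positive sample,
    # so minimizing the loss pushes the positive score above all negatives.
    m = logits.max()
    log_sum_exp = m + np.log(np.exp(logits - m).sum())
    return log_sum_exp - logits[0]

# Toy usage with uniformly sampled in-batch negatives.
rng = np.random.default_rng(0)
batch = rng.normal(size=(16, 32))                       # 16 embeddings of dimension 32
anchor = batch[0]
positive = batch[0] + 0.1 * rng.normal(size=32)         # perturbed view as the positive
negatives = batch[1:9]                                  # 8 negatives drawn from the batch
print(info_nce_loss(anchor, positive, negatives))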

Daxin Jiang, Wayne Xin Zhao, Ming Gong, Ji-Rong Wen, Lanling Xu, Linjun Shou, Jianxun Lian, Xing Xie

Subject: Computing technology, computer technology

Daxin Jiang, Wayne Xin Zhao, Ming Gong, Ji-Rong Wen, Lanling Xu, Linjun Shou, Jianxun Lian, Xing Xie. Negative Sampling for Contrastive Representation Learning: A Review [EB/OL]. (2022-05-31) [2025-06-23]. https://arxiv.org/abs/2206.00212.
