Paper Title

Negative Sampling for Contrastive Representation Learning: A Review

Paper Authors

Lanling Xu, Jianxun Lian, Wayne Xin Zhao, Ming Gong, Linjun Shou, Daxin Jiang, Xing Xie, Ji-Rong Wen

Paper Abstract

The learn-to-compare paradigm of contrastive representation learning (CRL), which contrasts positive samples against negative ones for representation learning, has achieved great success in a wide range of domains, including natural language processing, computer vision, information retrieval, and graph learning. While many research works focus on data augmentations, nonlinear transformations, or certain other parts of CRL, the importance of negative sample selection is usually overlooked in the literature. In this paper, we provide a systematic review of negative sampling (NS) techniques and discuss how they contribute to the success of CRL. As the core part of this paper, we summarize the existing NS methods into four categories, discussing the pros and cons of each, and conclude with several open research questions as future directions. By generalizing and aligning the fundamental NS ideas across multiple domains, we hope this survey can accelerate cross-domain knowledge sharing and motivate future research toward better CRL.
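To make the learn-to-compare paradigm concrete, the sketch below implements a widely used contrastive objective, the InfoNCE loss, in plain Python: an anchor's similarity to its positive (e.g. an augmented view) is pushed up in a softmax over the positive and a set of sampled negatives. This is an illustrative toy, not code from the paper; the random-vector setup and the choice of uniformly random negatives are assumptions for demonstration.

```python
import math
import random

def cosine(a, b):
    # Cosine similarity between two vectors of equal length.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    # InfoNCE: -log of the softmax probability of the positive pair
    # against the sampled negatives; lower loss means the anchor is
    # more similar to its positive than to the negatives.
    pos = math.exp(cosine(anchor, positive) / temperature)
    neg = sum(math.exp(cosine(anchor, n) / temperature) for n in negatives)
    return -math.log(pos / (pos + neg))

random.seed(0)
anchor = [random.gauss(0, 1) for _ in range(8)]
positive = [a + 0.05 * random.gauss(0, 1) for a in anchor]  # near-duplicate "view"
negatives = [[random.gauss(0, 1) for _ in range(8)] for _ in range(4)]  # random negatives
loss = info_nce_loss(anchor, positive, negatives)
```

How the negatives are chosen (here, uniformly at random) is exactly the design axis the survey organizes: harder or better-calibrated negatives change `neg` in the denominator and hence the gradient signal the encoder receives.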
