Paper Title

FoPro: Few-Shot Guided Robust Webly-Supervised Prototypical Learning

Authors

Yulei Qin, Xingyu Chen, Chao Chen, Yunhang Shen, Bo Ren, Yun Gu, Jie Yang, Chunhua Shen

Abstract

Recently, webly supervised learning (WSL) has been studied to leverage numerous and accessible data from the Internet. Most existing methods focus on learning noise-robust models from web images while neglecting the performance drop caused by the differences between the web domain and the real-world domain. However, only by tackling this performance gap can we fully exploit the practical value of web datasets. To this end, we propose a Few-shot guided Prototypical (FoPro) representation learning method, which only needs a few labeled examples from reality and can significantly improve performance in the real-world domain. Specifically, we initialize each class center with few-shot real-world data as the "realistic" prototype. Then, the intra-class distance between web instances and "realistic" prototypes is narrowed by contrastive learning. Finally, we measure image-prototype distance with a learnable metric. Prototypes are polished by adjacent high-quality web images and are involved in removing distant out-of-distribution samples. In experiments, FoPro is trained on web datasets guided by a few real-world examples and evaluated on real-world datasets. Our method achieves state-of-the-art performance on three fine-grained datasets and two large-scale datasets. Compared with existing WSL methods under the same few-shot settings, FoPro still excels in real-world generalization. Code is available at https://github.com/yuleiqin/fopro.
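The abstract names two core steps: class prototypes initialized from few-shot real-world embeddings, and contrastive learning that pulls web-image embeddings toward their class prototype. The sketch below illustrates only these two ideas under simplifying assumptions; it is not the authors' released implementation (see the GitHub repository for that), and the feature dimension, temperature, and batch sizes are illustrative placeholders.

```python
# Minimal sketch (assumptions, not the official FoPro code): per-class
# prototype initialization from few-shot real-world features, and an
# InfoNCE-style loss attracting web features to their class prototype.
import torch
import torch.nn.functional as F


def init_prototypes(fewshot_embeddings, fewshot_labels, num_classes):
    """Average few-shot real-world embeddings per class to form one
    'realistic' prototype per class, then L2-normalize."""
    dim = fewshot_embeddings.size(1)
    prototypes = torch.zeros(num_classes, dim)
    for c in range(num_classes):
        prototypes[c] = fewshot_embeddings[fewshot_labels == c].mean(dim=0)
    return F.normalize(prototypes, dim=1)


def prototypical_contrastive_loss(web_embeddings, web_labels, prototypes,
                                  temperature=0.1):
    """Each web embedding is attracted to its own class prototype and
    repelled from the other prototypes (cross-entropy over similarities)."""
    web_embeddings = F.normalize(web_embeddings, dim=1)
    logits = web_embeddings @ prototypes.t() / temperature  # (B, num_classes)
    return F.cross_entropy(logits, web_labels)


# Usage with random stand-in features; a real pipeline would use the
# encoder's output instead of torch.randn.
feats_fs = torch.randn(10, 128)            # few-shot real-world features
labels_fs = torch.arange(5).repeat(2)      # 2 shots per class, 5 classes
protos = init_prototypes(feats_fs, labels_fs, num_classes=5)

feats_web = torch.randn(32, 128)           # noisy web-image features
labels_web = torch.randint(0, 5, (32,))
loss = prototypical_contrastive_loss(feats_web, labels_web, protos)
```

The learnable image-prototype metric, prototype polishing with high-quality web neighbors, and removal of distant out-of-distribution samples mentioned in the abstract are omitted here; they build on the same prototype tensors shown above.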
