Paper Title
Multi-scale Adaptive Task Attention Network for Few-Shot Learning
Paper Authors
Paper Abstract
The goal of few-shot learning is to classify unseen categories from only a few labeled samples. Recently, metric-learning methods based on low-level information have achieved satisfactory performance, since local representations (LRs) are more consistent between seen and unseen classes. However, most of these methods treat each category in the support set independently, which is insufficient to measure the relations between features, especially within a given task. Moreover, such low-level metric-learning methods suffer when dominant objects of different scales appear against complex backgrounds. To address these issues, this paper proposes a novel Multi-scale Adaptive Task Attention Network (MATANet) for few-shot learning. Specifically, we first use a multi-scale feature generator to produce multiple features at different scales. Then, an adaptive task attention module is proposed to select the most important LRs across the entire task. Afterwards, a similarity-to-class module and a fusion layer are used to compute a joint multi-scale similarity between the query image and the support set. Extensive experiments on popular benchmarks clearly demonstrate the effectiveness of the proposed MATANet compared with state-of-the-art methods.
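The pipeline the abstract describes (multi-scale local features → task attention → similarity-to-class → fusion) can be sketched as below. This is a minimal illustrative NumPy sketch, not the paper's exact formulation: the softmax task-attention weighting and the top-k similarity-to-class rule here are assumptions made for demonstration.

```python
import numpy as np

def cosine_sim(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def similarity_to_class(query_lrs, class_lrs, k=3):
    # Similarity-to-class (illustrative): for each query LR, take the
    # mean of its top-k cosine similarities to the LRs of one class.
    sims = cosine_sim(query_lrs, class_lrs)   # (n_query_lr, n_class_lr)
    topk = np.sort(sims, axis=1)[:, -k:]      # keep the k largest per row
    return topk.mean(axis=1)                  # (n_query_lr,)

def matanet_scores(query_scales, support_scales, k=3):
    # query_scales:   list over scales of (n_query_lr, d) arrays
    # support_scales: list over scales of (n_classes, n_class_lr, d) arrays
    fused = None
    for q_lrs, sup in zip(query_scales, support_scales):
        # Per-class response of every query LR at this scale.
        per_class = np.stack(
            [similarity_to_class(q_lrs, c, k) for c in sup]
        )                                     # (n_classes, n_query_lr)
        # Task attention (assumed form): weight query LRs by how strongly
        # they respond anywhere in the task, via a softmax.
        logits = per_class.max(axis=0)
        attn = np.exp(logits) / np.exp(logits).sum()
        scores = per_class @ attn             # (n_classes,)
        fused = scores if fused is None else fused + scores
    # Fusion layer stand-in: average the per-scale class scores.
    return fused / len(query_scales)
```

A query whose local representations lie close to one class's support LRs should receive the highest fused score for that class, even when the other scales contribute noisier evidence.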