Paper Title

From Big to Small: Adaptive Learning to Partial-Set Domains

Paper Authors

Cao, Zhangjie, You, Kaichao, Zhang, Ziyang, Wang, Jianmin, Long, Mingsheng

Paper Abstract

Domain adaptation aims at knowledge acquisition and dissemination from a labeled source domain to an unlabeled target domain under distribution shift. However, the common requirement of an identical class space shared across domains hinders the application of domain adaptation to partial-set domains. Recent advances show that deep pre-trained models of large scale endow rich knowledge to tackle diverse downstream tasks of small scale. Thus, there is a strong incentive to adapt models from large-scale domains to small-scale domains. This paper introduces Partial Domain Adaptation (PDA), a learning paradigm that relaxes the identical class space assumption so that the source class space subsumes the target class space. First, we present a theoretical analysis of partial domain adaptation, which uncovers the importance of estimating the transferable probability of each class and each instance across domains. Then, we propose Selective Adversarial Network (SAN and SAN++) with a bi-level selection strategy and an adversarial adaptation mechanism. The bi-level selection strategy up-weights each class and each instance simultaneously for source supervised training, target self-training, and source-target adversarial adaptation, through the transferable probability estimated alternately by the model. Experiments on standard partial-set datasets and more challenging tasks with superclasses show that SAN++ outperforms several domain adaptation methods.
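The class-level half of the bi-level selection described above can be sketched in a few lines. This is a minimal illustration rather than the paper's implementation: it assumes, as is common in partial domain adaptation work, that the transferable probability of each source class is estimated by averaging the classifier's softmax predictions over unlabeled target data, and that the source supervised loss is then re-weighted by these class weights. The function names and the normalization choice are illustrative assumptions.

```python
import numpy as np

def class_transferability(target_probs):
    """Estimate per-class transferable probability as the average of the
    classifier's softmax predictions on unlabeled target data.

    target_probs: array of shape (num_target_samples, num_source_classes),
    each row a softmax distribution. Source-only ("outlier") classes are
    rarely activated by target samples, so their average weight is small.
    """
    w = target_probs.mean(axis=0)   # shape: (num_source_classes,)
    return w / w.max()              # normalize so the largest weight is 1

def weighted_source_loss(source_probs, source_labels, class_weights, eps=1e-12):
    """Class-weighted cross-entropy on labeled source samples: classes
    likely absent from the target label space are down-weighted, so they
    contribute less to supervised training and adversarial adaptation."""
    picked = source_probs[np.arange(len(source_labels)), source_labels]
    per_sample = -np.log(picked + eps)
    return float(np.mean(class_weights[source_labels] * per_sample))
```

In a full pipeline these weights would be re-estimated alternately with model training, and a second, instance-level weight (e.g. from a domain discriminator) would be combined with the class-level one to realize the bi-level selection.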
