Paper Title

On Robust Learning from Noisy Labels: A Permutation Layer Approach

Authors

Salman Alsubaihi, Mohammed Alkhrashi, Raied Aljadaany, Fahad Albalawi, Bernard Ghanem

Abstract

The existence of label noise imposes significant challenges (e.g., poor generalization) on the training process of deep neural networks (DNNs). As a remedy, this paper introduces a permutation layer learning approach termed PermLL to dynamically calibrate the training process of a DNN subject to instance-dependent and instance-independent label noise. The proposed method augments the architecture of a conventional DNN with an instance-dependent permutation layer. This layer is essentially a convex combination of permutation matrices that is dynamically calibrated for each sample. The primary objective of the permutation layer is to correct the loss of noisy samples, thereby mitigating the effect of label noise. We provide two variants of PermLL in this paper: one applies the permutation layer to the model's prediction, while the other applies it directly to the given noisy label. In addition, we provide a theoretical comparison between the two variants and show that previous methods can be seen as one of the variants. Finally, we validate PermLL experimentally and show that it achieves state-of-the-art performance on both real and synthetic datasets.
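
To make the mechanism concrete, below is a minimal sketch of a permutation layer as described in the abstract: a per-sample convex combination of fixed permutation matrices. This is an illustration under assumptions, not the authors' implementation; the class name `PermutationLayer`, the candidate set `perms`, and the per-sample logit table are all hypothetical choices. The same forward pass covers both variants, depending on whether `probs` is the model's softmax output (variant 1) or the one-hot noisy label (variant 2).

```python
import torch
import torch.nn.functional as F


class PermutationLayer(torch.nn.Module):
    """Convex combination of fixed permutation matrices with per-sample
    mixing weights (illustrative sketch, not the paper's code)."""

    def __init__(self, perms: torch.Tensor, num_samples: int):
        # perms: (m, K, K) stack of candidate permutation matrices
        # (assumption: a small candidate set, e.g. identity plus a few swaps).
        super().__init__()
        self.register_buffer("perms", perms.float())
        # One learnable logit vector per training sample; these play the
        # role of the "dynamically calibrated" mixing coefficients.
        self.logits = torch.nn.Parameter(torch.zeros(num_samples, perms.shape[0]))

    def forward(self, probs: torch.Tensor, idx: torch.Tensor) -> torch.Tensor:
        # probs: (B, K) distribution to be permuted -- the model's prediction
        #        (variant 1) or the one-hot noisy label (variant 2).
        # idx:   (B,) dataset indices selecting each sample's own logits.
        alpha = F.softmax(self.logits[idx], dim=-1)          # (B, m) convex weights
        P = torch.einsum("bm,mkl->bkl", alpha, self.perms)   # (B, K, K) soft permutation
        return torch.einsum("bkl,bl->bk", P, probs)          # permuted distribution


# Toy usage: K = 3 classes, candidate set = {identity, swap classes 0 and 1}.
K = 3
perms = torch.stack([torch.eye(K), torch.eye(K)[[1, 0, 2]]])
layer = PermutationLayer(perms, num_samples=100)
probs = F.softmax(torch.randn(4, K), dim=-1)
corrected = layer(probs, idx=torch.arange(4))  # (4, 3) corrected distribution
```

In this sketch the per-sample logits are free parameters optimized jointly with the network weights; initializing them at zero starts every sample at the uniform convex combination, and in practice one might instead bias the initialization toward the identity permutation so that clean samples are left unchanged by default.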
