Paper Title
Deep Negative Correlation Classification
Paper Authors
Paper Abstract
Ensemble learning is a straightforward way to improve the performance of almost any machine learning algorithm. Existing deep ensemble methods usually naively train many different models and then aggregate their predictions. In our view, this is suboptimal in two respects: i) naively training multiple models adds considerable computational burden, especially in the deep learning era; ii) optimizing each base model in isolation, without considering the interactions between models, limits the diversity of the ensemble and the resulting performance gains. We tackle these issues by proposing deep negative correlation classification (DNCC), in which the accuracy-diversity trade-off is systematically controlled by seamlessly decomposing the loss function into an individual-accuracy term and a term capturing the correlation between each individual model and the ensemble. DNCC yields a deep classification ensemble in which each individual estimator is accurate and the estimators are mutually negatively correlated. Thanks to this optimized diversity, DNCC works well even with a shared network backbone, making it significantly more efficient than most existing ensemble systems. Extensive experiments on multiple benchmark datasets and network structures demonstrate the superiority of the proposed method.
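The loss decomposition described in the abstract can be illustrated with a classic negative correlation learning (NCL) penalty, where each member's loss combines its own cross-entropy with a term that pushes its prediction away from the ensemble mean. The sketch below is a minimal NumPy illustration, not the paper's exact formulation; the function name `dncc_loss` and the trade-off weight `lam` are hypothetical names chosen for this example.

```python
import numpy as np

def dncc_loss(probs, labels, lam=0.5):
    """Sketch of an accuracy + negative-correlation loss (hypothetical API).

    probs:  (M, N, C) class probabilities from M ensemble members
    labels: (N,) integer class labels
    lam:    accuracy/diversity trade-off weight
    """
    M, N, C = probs.shape
    ens = probs.mean(axis=0)  # ensemble prediction, shape (N, C)

    # Individual accuracy term: average cross-entropy over all members.
    eps = 1e-12
    ce = -np.log(probs[:, np.arange(N), labels] + eps).mean()

    # Correlation penalty. In classic NCL, for member i:
    #   p_i = (f_i - f_ens) * sum_{j != i} (f_j - f_ens) = -||f_i - f_ens||^2,
    # so minimizing it rewards deviation from the ensemble mean.
    penalty = -np.mean(
        [np.sum((probs[i] - ens) ** 2, axis=1).mean() for i in range(M)]
    )
    return ce + lam * penalty
```

With identical members the penalty vanishes and the loss reduces to plain cross-entropy; increasing `lam` trades individual accuracy for diversity across members.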