Paper Title

Ranking and Rejecting of Pre-Trained Deep Neural Networks in Transfer Learning based on Separation Index

Authors

Mostafa Kalhor, Ahmad Kalhor, Mehdi Rahmani

Abstract

Automated ranking of pre-trained Deep Neural Networks (DNNs) reduces the time required to select an optimal pre-trained DNN and boosts classification performance in transfer learning. In this paper, we introduce a novel algorithm that ranks pre-trained DNNs by applying a straightforward distance-based complexity measure, the Separation Index (SI), to the target dataset. For this purpose, background on the SI is given first, and then the automated ranking algorithm is explained. In this algorithm, the SI is computed on the target dataset after it is passed through the feature-extraction part of each pre-trained DNN. The pre-trained DNNs are then easily ranked by sorting the computed SIs in descending order. In this ranking method, the best DNN yields the maximum SI on the target dataset, and a few pre-trained DNNs may be rejected when their computed SIs are sufficiently low. The efficiency of the proposed algorithm is evaluated on three challenging datasets: Linnaeus 5, Breast Cancer Images, and COVID-CT. For the first two case studies, the ranking produced by the proposed algorithm exactly matches the ranking of the trained DNNs by accuracy on the target dataset. For the third case study, despite different preprocessing applied to the target data, the algorithm's ranking correlates highly with the ranking obtained from classification accuracy.
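The ranking procedure the abstract describes — compute the SI on each pre-trained DNN's extracted features, sort in descending order, and reject models with sufficiently low SI — can be sketched as follows. The paper's exact SI formulation is not reproduced here, so this sketch assumes a common first-order form: the fraction of samples whose nearest neighbor in feature space shares their label. The function names, the rejection threshold, and the dictionary layout are illustrative, not the authors' implementation.

```python
import numpy as np

def separation_index(features, labels):
    """Assumed first-order SI: fraction of samples whose nearest
    neighbor (Euclidean distance in feature space) has the same label."""
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels)
    # Pairwise squared Euclidean distances between all samples.
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)  # exclude each sample's self-match
    nn = d.argmin(axis=1)        # index of each sample's nearest neighbor
    return float((y[nn] == y).mean())

def rank_models(feature_sets, threshold=0.0):
    """feature_sets: {model_name: (features, labels)}, where features are
    the target dataset passed through that DNN's feature-extraction part.
    Returns (kept, rejected), each sorted by descending SI."""
    sis = {name: separation_index(f, l) for name, (f, l) in feature_sets.items()}
    ranked = sorted(sis.items(), key=lambda kv: kv[1], reverse=True)
    kept = [(n, s) for n, s in ranked if s >= threshold]
    rejected = [(n, s) for n, s in ranked if s < threshold]
    return kept, rejected
```

A model whose feature space cleanly clusters the target classes scores an SI near 1 and ranks first; a model whose features interleave the classes scores near 0 and can be rejected before any fine-tuning is spent on it.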
