Paper Title

Learning to Prune in Training via Dynamic Channel Propagation

Authors

Shibo Shen, Rongpeng Li, Zhifeng Zhao, Honggang Zhang, Yugeng Zhou

Abstract

In this paper, we propose a novel network training mechanism called "dynamic channel propagation" to prune neural networks during training. In particular, we select a specific group of channels in each convolutional layer to participate in forward propagation at training time according to each channel's significance level, which we define as the channel utility. The utility values of all selected channels are updated simultaneously with the error back-propagation process and change adaptively. Furthermore, when training ends, channels with high utility values are retained whereas those with low utility values are discarded. Hence, our proposed scheme trains and prunes neural networks simultaneously. We empirically evaluate our novel training scheme on various representative benchmark datasets and advanced convolutional neural network (CNN) architectures, including VGGNet and ResNet. The experimental results verify the superior performance and robustness of our approach.
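The select-propagate-update loop described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the channel count, the number of channels kept per step, the momentum constant, and the stand-in saliency signal are all assumptions introduced here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

num_channels = 8   # channels in one convolutional layer (assumed)
keep_k = 4         # channels allowed into each forward pass (assumed)
momentum = 0.9     # smoothing factor for utility updates (assumed)

# Per-channel utility values, as described in the abstract.
utility = np.zeros(num_channels)

for step in range(50):
    # Forward pass: only the top-k channels by utility propagate.
    # A small random tie-breaker lets all channels be explored early on.
    noise = rng.normal(scale=1e-3, size=num_channels)
    selected = np.argsort(utility + noise)[-keep_k:]
    mask = np.zeros(num_channels)
    mask[selected] = 1.0

    # Backward pass (stand-in): a per-channel saliency signal replaces the
    # real back-propagated gradients; only selected channels are updated.
    saliency = np.abs(rng.normal(size=(100, num_channels))).mean(axis=0) * mask
    utility[selected] = (momentum * utility[selected]
                         + (1 - momentum) * saliency[selected])

# After training, retain high-utility channels and discard the rest,
# yielding the pruned layer.
kept = np.sort(np.argsort(utility)[-keep_k:])
print("kept channels:", kept)
```

The key property of the scheme survives even in this toy form: selection, training, and utility updates happen in the same loop, so no separate post-training pruning pass is required beyond the final top-k cut.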
