Paper Title
Filter Pruning using Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks
Paper Authors
Paper Abstract
Since convolutional neural networks are often trained with redundant parameters, redundant kernels or filters can be removed to obtain a compact network without degrading the classification accuracy. In this paper, we propose a filter pruning method using hierarchical group sparse regularization. Our previous work showed that hierarchical group sparse regularization is effective in obtaining sparse networks in which filters connected to unnecessary channels are automatically driven close to zero. After training the convolutional neural network with hierarchical group sparse regularization, unnecessary filters are selected based on the increase in the classification loss on randomly selected training samples, yielding a compact network. The proposed method can remove more than 50% of the parameters of ResNet for CIFAR-10 with only a 0.3% decrease in test accuracy. Likewise, 34% of the parameters of ResNet are removed for TinyImageNet-200 while achieving higher accuracy than the baseline network.
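A hierarchical group sparse regularizer groups the weights of each convolutional layer at more than one level, e.g., per output filter and per input channel, so that whole groups are driven toward zero together. Below is a minimal PyTorch sketch of such a penalty; the exact grouping structure used in the paper may differ, and the hyperparameters lam_filter and lam_channel are illustrative assumptions, not values from the paper.

```python
import torch

def hierarchical_group_sparse_penalty(conv_weight, lam_filter=1e-4, lam_channel=1e-4):
    """Sketch of a two-level group sparse penalty for one conv layer.

    conv_weight has shape (out_channels, in_channels, kH, kW).
    The filter-level term groups all weights of one output filter;
    the channel-level term groups all weights connected to one input
    channel, so filters attached to unnecessary channels shrink together.
    """
    # L2 norm of each output filter (group lasso over filters)
    filter_norms = conv_weight.flatten(start_dim=1).norm(p=2, dim=1)
    # L2 norm of each input-channel slice (group lasso over channels)
    channel_norms = conv_weight.transpose(0, 1).flatten(start_dim=1).norm(p=2, dim=1)
    return lam_filter * filter_norms.sum() + lam_channel * channel_norms.sum()
```

During training, such a penalty would be summed over all convolutional layers and added to the classification loss.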
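The pruning criterion ranks each filter by how much the classification loss on a randomly selected training batch increases when that filter is zeroed out. The following is a minimal sketch under that reading of the abstract; the function name, batch handling, and restore-and-score loop are hypothetical, not taken from the paper.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def filter_importance_by_loss_increase(model, conv_layer, inputs, targets):
    """Score each output filter of conv_layer by the loss increase
    observed on one random training batch when the filter is zeroed."""
    base_loss = F.cross_entropy(model(inputs), targets)
    scores = []
    weight = conv_layer.weight
    for i in range(weight.shape[0]):
        saved = weight[i].clone()
        weight[i].zero_()                     # temporarily remove filter i
        loss = F.cross_entropy(model(inputs), targets)
        scores.append((loss - base_loss).item())
        weight[i].copy_(saved)                # restore the filter
    return scores  # smallest increase -> least necessary filter
```

Filters with the smallest scores would then be pruned to obtain the compact network, after which a short fine-tuning stage is typically applied.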