Filter pruning is a widely used model compression technique, with inter-channel methods currently recognized as the most efficient approach. However, existing inter-channel methods have not fully explored the independence between convolutional channels. In this paper, we propose to use the Schatten p-norm to extract rank information across convolutional channels and to measure the importance of a specific channel by the change in rank information after its removal. The principle underlying our pruning approach is that a smaller change in rank information indicates a less important channel. In addition, to reduce the computation time required for calculating channel importance, we propose a prototype-based approach. We verify the effectiveness and efficiency of the proposed method on various datasets and models. For example, when applying our approach to ResNet-56 on CIFAR-10, we achieve an accuracy improvement of 0.91% while reducing the model size and FLOPs by 42.8% and 47.4%, respectively.
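The core importance measure described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it takes a single layer's feature maps as input, flattens each channel into a row of a matrix, computes the Schatten p-norm from that matrix's singular values, and scores each channel by how much the norm changes when that channel's row is deleted. The function names and the leave-one-out formulation are illustrative, and the paper's prototype-based speedup is omitted.

```python
import numpy as np

def schatten_p_norm(mat, p=2.0):
    # Schatten p-norm: (sum of singular values raised to p)^(1/p).
    # For p=2 this equals the Frobenius norm.
    s = np.linalg.svd(mat, compute_uv=False)
    return float(np.sum(s ** p) ** (1.0 / p))

def channel_importance(feature_maps, p=2.0):
    # feature_maps: array of shape (C, H, W) — one sample's activations
    # from a convolutional layer. Each channel becomes one row of a
    # C x (H*W) matrix so that singular values capture inter-channel
    # rank structure.
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)
    full_norm = schatten_p_norm(flat, p)
    scores = np.empty(C)
    for c in range(C):
        # Remove channel c and measure the change in rank information.
        reduced = np.delete(flat, c, axis=0)
        scores[c] = abs(full_norm - schatten_p_norm(reduced, p))
    # Smaller score => smaller change in rank information => the channel
    # is a better candidate for pruning.
    return scores
```

In practice, the scores would be averaged over a batch of inputs and the lowest-scoring channels pruned until the target FLOPs reduction is reached.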