Filter pruning with uniqueness mechanism in the frequency domain for efficient neural networks

Zhang, S. and Gao, M. and Ni, Q. and Han, J. (2023) Filter pruning with uniqueness mechanism in the frequency domain for efficient neural networks. Neurocomputing, 530. pp. 116-124. ISSN 0925-2312

Text (FPUM_v7_clean)
FPUM_v7_clean.pdf - Accepted Version (1 MB)
Available under License Creative Commons Attribution-NonCommercial-NoDerivs.

Abstract

Filter pruning has drawn extensive attention due to its advantage in reducing the computational costs and memory requirements of deep convolutional neural networks. However, most existing methods prune filters based only on their intrinsic properties or spatial feature maps, ignoring the correlation between filters. In this paper, we argue that this correlation is valuable and consider it from a novel view: the frequency domain. Specifically, we first transform features into the frequency domain by the Discrete Cosine Transform (DCT). Then, for each feature map, we compute a uniqueness score, which measures the probability that it can be replaced by other maps. This allows us to prune the filters corresponding to low-uniqueness maps without significant performance degradation. Compared to methods that focus on intrinsic properties, our proposed method introduces a more comprehensive criterion for pruning filters, further improving network compactness while preserving good performance. In addition, our method is more robust to noise than spatial-domain methods, since the critical clues for pruning are more concentrated after the DCT. Experimental results demonstrate the superiority of our method. Specifically, it outperforms the baseline ResNet-56 by 0.38% on CIFAR-10 while reducing floating-point operations (FLOPs) by 47.4%. A consistent improvement is observed when pruning the baseline ResNet-110: a 0.23% accuracy increase and up to a 71% FLOPs reduction. Finally, on ImageNet, our method reduces the FLOPs of the baseline ResNet-50 by 48.7% with only a 0.32% accuracy loss.
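To make the pruning criterion concrete, the sketch below illustrates one plausible way to score feature-map uniqueness in the frequency domain. The DCT step follows the abstract; the specific score (one minus the maximum cosine similarity to any other map's DCT coefficients), the function name uniqueness_scores, the NumPy/SciPy implementation, and the keep ratio are illustrative assumptions, not the paper's exact formulation.

# Minimal sketch of frequency-domain uniqueness scoring for filter pruning.
# Assumed score: 1 - max cosine similarity between a channel's DCT
# coefficients and those of every other channel (not the paper's exact rule).
import numpy as np
from scipy.fft import dctn


def uniqueness_scores(feature_maps: np.ndarray) -> np.ndarray:
    """feature_maps: (C, H, W) activations of one layer for one input.

    Returns a (C,) array; a low score means the map is well approximated
    by another map, so its filter is a pruning candidate.
    """
    C = feature_maps.shape[0]
    # 2-D DCT per channel concentrates the energy in a few coefficients.
    freq = np.stack([dctn(feature_maps[c], norm="ortho") for c in range(C)])
    flat = freq.reshape(C, -1)
    flat = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    sim = np.abs(flat @ flat.T)          # pairwise cosine similarity
    np.fill_diagonal(sim, 0.0)           # ignore self-similarity
    return 1.0 - sim.max(axis=1)         # unique maps score close to 1


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    maps = rng.standard_normal((8, 16, 16))
    maps[3] = maps[0] * 0.9              # make channel 3 redundant
    scores = uniqueness_scores(maps)
    # Note: both members of a redundant pair score low here; a real pruner
    # would keep one of them (e.g. by greedy selection).
    keep_ratio = 0.75                    # hypothetical: prune 25% of filters
    n_keep = int(round(keep_ratio * len(scores)))
    kept = np.argsort(scores)[::-1][:n_keep]
    print("uniqueness:", np.round(scores, 3))
    print("filters kept:", sorted(kept.tolist()))

Because the DCT concentrates energy into a few low-frequency coefficients, comparing maps in the frequency domain is less sensitive to pixel-level noise than comparing raw spatial activations, which is consistent with the robustness argument in the abstract.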

Item Type:
Journal Article
Journal or Publication Title:
Neurocomputing
Additional Information:
This is the author’s version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, 530, 2023. DOI: 10.1016/j.neucom.2023.02.004
Subjects:
computer vision; deep learning; frequency-domain transformation; image classification; model compression; convolutional neural networks; deep neural networks; digital arithmetic; frequency domain analysis; feature map; floating point operations
ID Code:
187546
Deposited By:
Deposited On:
28 Feb 2023 14:05
Refereed?:
Yes
Published?:
Published
Last Modified:
26 Sep 2024 13:05