Pruning convolutional neural networks with an attention mechanism for remote sensing image classification

Zhang, S. and Wu, G. and Gu, Junhua and Han, J. (2020) Pruning convolutional neural networks with an attention mechanism for remote sensing image classification. Electronics (Switzerland), 9 (8). pp. 1-19. ISSN 2079-9292

Full text not available from this repository.


Despite the great success of Convolutional Neural Networks (CNNs) in various visual recognition tasks, the high computational and storage costs of such deep networks impede their deployment in real-time remote sensing tasks. To this end, considerable attention has been given to filter pruning techniques, which slim deep networks with acceptable performance drops and thus make them deployable on remote sensing devices. In this paper, we propose a new scheme, termed Pruning Filter with Attention Mechanism (PFAM), to compress and accelerate traditional CNNs. In particular, a novel correlation-based filter pruning criterion, which explores the long-range dependencies among filters via an attention module, is employed to select the filters to be pruned. Distinct from previous methods, the less-correlated filters are pruned during the pruning stage of the current training epoch, then reconstructed and updated during the next training epoch. This allows the network to process input data with maximal information preserved under the original training strategy, so that a compressed model can be obtained without a pretrained model. The proposed method is evaluated on three public remote sensing image datasets, and the experimental results demonstrate its superiority over state-of-the-art baselines. Specifically, PFAM achieves a 0.67% accuracy improvement with a 40% model-size reduction on the Aerial Image Dataset (AID).
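The core idea of the correlation-based criterion can be illustrated with a minimal sketch. The following toy Python/NumPy snippet is not the authors' implementation; it substitutes a simple cosine-similarity matrix for the paper's attention module, and the function name and parameters are hypothetical. It scores each filter by its total correlation with all other filters and marks the least-correlated ones for pruning:

```python
import numpy as np

def prune_by_correlation(filters, prune_ratio=0.4):
    """Toy stand-in for PFAM's attention-based criterion: score each
    filter by its total cosine correlation with all other filters and
    mark the least-correlated fraction for pruning."""
    n = filters.shape[0]
    flat = filters.reshape(n, -1)
    # Normalize filter vectors so dot products become cosine similarities.
    unit = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    # Pairwise |similarity| matrix (n x n); zero the diagonal so a
    # filter's self-similarity does not inflate its score.
    sim = np.abs(unit @ unit.T)
    np.fill_diagonal(sim, 0.0)
    scores = sim.sum(axis=1)           # total dependency of each filter
    n_prune = int(round(prune_ratio * n))
    order = np.argsort(scores)         # ascending: least correlated first
    prune_idx = order[:n_prune]
    keep_idx = np.sort(order[n_prune:])
    return keep_idx, prune_idx

# Toy example: 8 convolutional filters of shape 3x3x3.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))
keep, prune = prune_by_correlation(w, prune_ratio=0.25)
```

In the actual PFAM training loop, the pruned filters would be zeroed rather than removed, then reconstructed and updated in the next epoch, so the network can be trained from scratch without a pretrained model.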

Item Type:
Journal Article
Journal or Publication Title:
Electronics (Switzerland)
ID Code:
Deposited By:
Deposited On:
29 Sep 2020 09:55
Last Modified:
03 May 2022 03:03