DNR-Pruning: Sparsity-Aware Pruning via Dying Neuron Reactivation in Convolutional Neural Networks

Wang, B. and Jiang, R. (2025) DNR-Pruning: Sparsity-Aware Pruning via Dying Neuron Reactivation in Convolutional Neural Networks. Transactions on Machine Learning Research, 2025-S.

Full text not available from this repository.

Abstract

In this paper, we challenge the conventional view of dead neurons—neurons that cease to activate—during deep neural network training. Traditionally, dead neurons are regarded as a hindrance because of their association with optimization difficulties and reduced model adaptability over training epochs. However, we present a novel perspective, demonstrating that they can be effectively leveraged to enhance network sparsity. Specifically, we propose DNR-Pruning, a sparsity-aware pruning approach for convolutional neural networks (CNNs) based on dying neuron reactivation that exploits the behavior of individual neurons during training. Through a systematic exploration of hyperparameter configurations, we show that dying neurons can be harnessed to improve pruning algorithms. Our method dynamically monitors the occurrence of dying neurons, enabling adaptive sparsification throughout CNN training. Extensive experiments on diverse datasets demonstrate that DNR-Pruning outperforms existing sparsity-aware pruning techniques while achieving competitive results compared to state-of-the-art methods. These findings suggest that dying neurons can serve as an efficient mechanism for network compression and resource optimization in CNNs, opening new avenues for more efficient and high-performance deep learning models.
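Since the full text is not available from this repository, the following is only a minimal, hypothetical sketch of the mechanism the abstract describes: monitoring channels that stop activating during training so they can be treated as pruning candidates. It is not the authors' DNR-Pruning algorithm; the class name, the eps threshold, the inactivity ratio, and the choice to hook Conv2d outputs are all assumptions made for illustration (PyTorch).

```python
# Hypothetical sketch (not the paper's method): track Conv2d output channels
# whose pre-activations stay non-positive (i.e. would be zeroed by ReLU)
# across training batches, and report them as "dying" channels.
import torch
import torch.nn as nn

class DeadChannelMonitor:
    """Counts, per Conv2d layer, how often each output channel is inactive
    (max pre-activation <= eps over the whole batch)."""

    def __init__(self, model, eps=0.0):
        self.eps = eps
        self.inactive_counts = {}   # layer name -> tensor of shape [out_channels]
        self.batches_seen = 0
        self.handles = []
        for name, module in model.named_modules():
            if isinstance(module, nn.Conv2d):
                self.inactive_counts[name] = torch.zeros(module.out_channels)
                self.handles.append(
                    module.register_forward_hook(self._make_hook(name))
                )

    def _make_hook(self, name):
        def hook(module, inputs, output):
            # output has shape [N, C, H, W]; a channel counts as inactive on
            # this batch if its largest response is at or below eps.
            with torch.no_grad():
                per_channel_max = output.amax(dim=(0, 2, 3))
                self.inactive_counts[name] += (per_channel_max <= self.eps).float().cpu()
        return hook

    def step(self):
        # Call once per training batch after the forward pass.
        self.batches_seen += 1

    def dying_channels(self, ratio=0.9):
        """Channel indices inactive in at least `ratio` of observed batches."""
        return {
            name: torch.nonzero(counts >= ratio * self.batches_seen).flatten().tolist()
            for name, counts in self.inactive_counts.items()
        }

    def remove(self):
        for h in self.handles:
            h.remove()
```

In such a sketch, the monitor would be stepped every batch during training, and the channel indices it reports could then be handed to a structured pruning step (for example, zeroing or removing those filters); how DNR-Pruning actually reactivates or prunes such neurons is described in the paper itself.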

Item Type: Journal Article
Journal or Publication Title: Transactions on Machine Learning Research
ID Code: 232763
Deposited On: 08 Oct 2025 13:50
Refereed?: Yes
Published?: Published
Last Modified: 08 Oct 2025 22:20