Statistical Feature-Driven Regularization for Structured Model Pruning

Authors: Jielei Wang, Dongnan Liu, Heng Yin, Kexin Li, Guangchun Luo, and Guoming Lu
Conference: ICIC 2025 Posters, Ningbo, China, July 26-29, 2025
Pages: 1385-1396
Keywords: Structured Pruning · Convolutional Neural Networks · Regularization · Statistical Feature

Abstract

Structured pruning is a highly effective model compression technique that balances accuracy and acceleration, making it widely adopted in the field of convolutional neural networks. Traditional pruning methods relying on magnitude-based criteria exhibit limitations in distinguishing critical channels because of narrow parameter distributions in sparse models. Building on this phenomenon, we propose a statistical feature-driven structured pruning framework that integrates dependency-aware group regularization. By incorporating a dependency graph to model inter-layer relationships and leveraging both the mean and variance of channel parameters, we design a dynamic regularization term to reduce both the norm and variance of channels, encouraging uniform shrinkage. Our approach has been validated through experiments across diverse datasets and model architectures, achieving only a 0.71 accuracy drop on ImageNet compared to the baseline model under similar FLOPs reduction ratios.
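The core idea of the abstract, penalizing both the norm and the variance of each channel's parameters so that channels shrink uniformly, can be illustrated with a minimal sketch. This is an assumed simplification, not the paper's actual formulation: the function name `statistical_group_penalty` and the coefficients `lam_norm` and `lam_var` are hypothetical, and the paper's dynamic, dependency-graph-grouped version would operate over coupled channel groups rather than a single layer.

```python
import numpy as np

def statistical_group_penalty(weight, lam_norm=1e-4, lam_var=1e-4):
    """Hypothetical sketch of a statistical feature-driven regularizer.

    weight: a conv kernel of shape (out_channels, in_channels, kH, kW).
    For each output channel, compute the L2 norm and the variance of its
    parameters, then penalize both so that channels are driven toward
    uniform shrinkage rather than a few dominant weights.
    """
    flat = weight.reshape(weight.shape[0], -1)       # one row per output channel
    channel_norms = np.linalg.norm(flat, axis=1)     # ||w_c||_2 per channel
    channel_vars = flat.var(axis=1)                  # Var(w_c) per channel
    return lam_norm * channel_norms.sum() + lam_var * channel_vars.sum()
```

In practice such a term would be added to the training loss, so that channels with both small norm and low variance can be identified and removed by the structured pruner.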