Automatic Complementary Separation Pruning Toward Lightweight CNNs

📅 2025-05-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address manual pruning-ratio specification and coarse-grained redundancy identification in CNN model compression, this paper proposes ACSP (Automatic Complementary Separation Pruning), a fully automated structured pruning framework. Its core contributions are: (1) constructing a class-pair separation capability graph that combines activation-driven importance scoring with complementary clustering, automatically identifying diverse, discriminative channels/neurons in each layer; and (2) a knee-point analysis algorithm that adaptively determines the number of units to retain per layer, with no pruning-ratio hyperparameter to tune. Evaluated on VGG-16, ResNet-50, and MobileNet-V2 across CIFAR-10, CIFAR-100, and ImageNet, ACSP achieves accuracy competitive with state-of-the-art methods while significantly reducing computational cost (up to 58% fewer FLOPs) and parameter count. The method generalizes well across architectures and datasets, making it practical for edge-device deployment.

📝 Abstract
In this paper, we present Automatic Complementary Separation Pruning (ACSP), a novel and fully automated pruning method for convolutional neural networks. ACSP integrates the strengths of both structured pruning and activation-based pruning, enabling the efficient removal of entire components such as neurons and channels while leveraging activations to identify and retain the most relevant components. Our approach is designed specifically for supervised learning tasks, where we construct a graph space that encodes the separation capabilities of each component with respect to all class pairs. By employing complementary selection principles and utilizing a clustering algorithm, ACSP ensures that the selected components maintain diverse and complementary separation capabilities, reducing redundancy and maintaining high network performance. The method automatically determines the optimal subset of components in each layer, utilizing a knee-finding algorithm to select the minimal subset that preserves performance without requiring user-defined pruning volumes. Extensive experiments on multiple architectures, including VGG-16, ResNet-50, and MobileNet-V2, across datasets like CIFAR-10, CIFAR-100, and ImageNet-1K, demonstrate that ACSP achieves competitive accuracy compared to other methods while significantly reducing computational costs. This fully automated approach not only enhances scalability but also makes ACSP especially practical for real-world deployment by eliminating the need for manually defining the pruning volume.
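The abstract describes scoring each component (channel/neuron) by how well its activations separate every pair of classes. The paper's exact scoring function is not given in this summary; the sketch below uses a hypothetical Fisher-style ratio (mean gap over pooled spread) as one plausible way to build such a per-component, per-class-pair separation profile from recorded activations:

```python
import numpy as np
from itertools import combinations

def separation_scores(acts, labels):
    """Illustrative separation profile: one score per (channel, class pair).

    acts:   (n_samples, n_channels) array of per-sample channel activations
    labels: (n_samples,) integer class labels
    Returns a (n_channels, n_pairs) array; a higher value means the channel
    separates that class pair more strongly. The Fisher-style ratio here is
    an assumption, not the paper's published formula.
    """
    classes = np.unique(labels)
    pairs = list(combinations(classes, 2))
    scores = np.zeros((acts.shape[1], len(pairs)))
    for k, (a, b) in enumerate(pairs):
        xa, xb = acts[labels == a], acts[labels == b]
        mu_gap = np.abs(xa.mean(axis=0) - xb.mean(axis=0))
        spread = xa.std(axis=0) + xb.std(axis=0) + 1e-8  # avoid divide-by-zero
        scores[:, k] = mu_gap / spread
    return scores
```

Rows of this matrix could then be clustered so that retained components cover complementary class pairs rather than duplicating the same discriminative role, in the spirit of the complementary selection the abstract describes.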
Problem

Research questions and friction points this paper is trying to address.

Automates pruning to reduce CNN computational costs
Maintains network performance while removing redundant components
Eliminates need for manual pruning volume definition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Automated pruning combining structured and activation-based methods
Graph space encodes component separation for class pairs
Knee-finding algorithm selects optimal minimal component subset
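The knee-finding step picks the smallest subset of components whose cumulative importance curve flattens out. The paper's exact algorithm is not detailed in this summary; a common knee heuristic, sketched below, takes the point of a sorted-descending importance curve farthest (perpendicularly) from the chord joining its endpoints:

```python
import numpy as np

def knee_index(values):
    """Index of the knee of a descending importance curve.

    Heuristic (an assumption, not necessarily the paper's method): normalize
    both axes, then return the point with the largest perpendicular distance
    to the straight line from the first to the last point.
    """
    y = np.asarray(values, dtype=float)
    x = np.linspace(0.0, 1.0, len(y))
    y = (y - y.min()) / (y.max() - y.min() + 1e-12)
    # distance of each point to the chord from (0, y[0]) to (1, y[-1])
    dy, dx = y[-1] - y[0], 1.0
    dist = np.abs(dy * x - dx * (y - y[0])) / np.hypot(dx, dy)
    return int(np.argmax(dist))
```

On a curve with a sharp drop, e.g. `[10, 9, 8, 1, 0.9, 0.8, 0.7]`, this returns index 3: components up to the knee would be retained and the flat tail pruned, which matches the abstract's goal of selecting a minimal subset without a user-defined pruning volume.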