PrunedCaps: A Case For Primary Capsules Discrimination

📅 2025-12-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
Capsule Networks (CapsNets) offer affine robustness and superior performance on overlapping objects, but their high computational cost and slow inference stem from the large number of Primary Capsules (PCs). Method: This work presents the first systematic study of PC pruning across MNIST, Fashion-MNIST, CIFAR-10, and SVHN. We propose an importance-based selective pruning strategy that jointly optimizes structural sparsity and dynamic routing efficiency—requiring no fine-tuning post-pruning. Contribution/Results: Our approach achieves up to 95.2% reduction in capsule count and 95.36% reduction in dynamic routing operations, yielding up to 9.90× inference speedup with zero accuracy degradation. Empirical analysis uncovers a dataset-complexity–PC-redundancy correlation, establishing a reproducible, cross-dataset pruning paradigm for lightweight CapsNets.

📝 Abstract
Capsule Networks (CapsNets) are a newer generation of image classifiers with proven advantages over Convolutional Neural Networks (CNNs), including better robustness to affine transformations and improved detection of overlapping images. However, CapsNets cannot be considered resource-efficient deep learning architectures due to their high number of Primary Capsules (PCs); both training and inference are slow and resource-hungry. This paper investigates pruning Primary Capsules in CapsNets on the MNIST handwritten digits, Fashion-MNIST, CIFAR-10, and SVHN datasets. We show that a pruned version of CapsNet runs up to 9.90 times faster than the conventional architecture by removing 95 percent of the capsules without a loss of accuracy. Our pruned architecture also saves more than 95.36 percent of the floating-point operations in the dynamic routing stage. Moreover, we provide insight into why some datasets benefit significantly from pruning while others fall behind.
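The page does not reproduce the paper's algorithm. As a rough illustration only, one common way to score primary-capsule importance is by the magnitude of each capsule's pose vector, keeping the top few percent before dynamic routing. The scoring criterion and `keep_ratio` below are assumptions for the sketch, not the paper's exact method:

```python
import numpy as np

def prune_primary_capsules(poses, keep_ratio=0.05):
    """Keep only the highest-importance primary capsules.

    poses: array of shape (num_capsules, capsule_dim).
    Importance here is the L2 norm of each pose vector -- an assumed
    proxy for capsule activity; the paper's actual scoring may differ.
    """
    importance = np.linalg.norm(poses, axis=1)
    k = max(1, int(len(poses) * keep_ratio))
    keep_idx = np.argsort(importance)[-k:]  # indices of the top-k capsules
    return poses[keep_idx], keep_idx

# Toy example: 1152 primary capsules (as in the original CapsNet), dim 8.
rng = np.random.default_rng(0)
poses = rng.standard_normal((1152, 8))
pruned, idx = prune_primary_capsules(poses, keep_ratio=0.05)
print(pruned.shape)  # (57, 8) -- ~95% of capsules removed before routing
```

Since dynamic routing iterates over every (primary capsule, output capsule) pair, cutting the capsule count by ~95% removes a comparable share of the routing FLOPs, which is consistent with the speedups the abstract reports.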
Problem

Research questions and friction points this paper is trying to address.

Reducing Capsule Networks' high computational cost
Pruning Primary Capsules to improve efficiency
Analyzing dataset-specific pruning effectiveness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pruning Primary Capsules for speedup
Reducing floating-point operations in routing
Maintaining accuracy while removing capsules
Ramin Sharifi
Department of Electrical and Computer Engineering, University of Victoria, 3800 Finnerty Rd, Victoria, BC, V8P 5J2 Canada
Pouya Shiri
University of Saskatchewan
Amirali Baniasadi
University of Victoria