Residual Feature-Reutilization Inception Network

📅 2024-08-01
🏛️ Pattern Recognition
📈 Citations: 2
Influential: 0
🤖 AI Summary
To address feature redundancy and gradient degradation in deep convolutional networks, this paper proposes an enhanced Inception architecture that integrates residual connections with multi-scale feature reuse. The core innovation is a dynamic cross-layer feature-reutilization mechanism: within each Inception branch, historical features are iteratively reactivated and reweighted, guided by channel-wise attention and a gradient-gated unit, to improve representational continuity and parameter efficiency. Experiments on ImageNet show that the proposed method achieves a 1.3% absolute improvement in Top-1 accuracy, reduces parameters by 18%, accelerates inference by 22%, and markedly improves generalization in few-shot settings.
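The summary describes reused (historical) features being reweighted by channel-wise attention before a residual merge. A minimal NumPy sketch of that idea, in squeeze-and-excitation style, is below; the function name, shapes, and the linear gating layer (`w`, `b`) are illustrative assumptions, not the paper's exact design, which operates inside convolutional Inception branches.

```python
import numpy as np

def channel_attention_reuse(x, x_prev, w, b):
    """Sketch: reweight reused features channel-wise, then add residually.

    x      : current feature map, shape (C, H, W)
    x_prev : reused (historical) feature map, same shape
    w, b   : illustrative gating parameters, shapes (C, C) and (C,)
    """
    # Squeeze: global average pooling over spatial dims -> (C,)
    s = x_prev.mean(axis=(1, 2))
    # Excite: per-channel gate in (0, 1) via sigmoid of a linear map
    g = 1.0 / (1.0 + np.exp(-(w @ s + b)))
    # Scale each reused channel by its gate and merge residually
    return x + g[:, None, None] * x_prev

# Toy usage on random features
rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
x = rng.standard_normal((C, H, W))
x_prev = rng.standard_normal((C, H, W))
w = rng.standard_normal((C, C)) * 0.1
b = np.zeros(C)
out = channel_attention_reuse(x, x_prev, w, b)
print(out.shape)  # (4, 8, 8)
```

Because each gate lies in (0, 1), the reused features can only be attenuated, never amplified, which is one simple way such a mechanism can suppress redundant historical channels while keeping the residual path intact.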

Technology Category: Deep Learning

Application Category: Image Recognition

Problem (research questions and friction points this paper addresses): Complex Feature Handling

Innovation (methods, ideas, or system contributions that make the work stand out): ResFRI, Split-ResFRI, Multi-perspective Feature Extraction
Authors

Yuanpeng He
Key Laboratory of High Confidence Software Technologies (Peking University), Ministry of Education, Beijing, 100871, China; School of Computer Science, Peking University, Beijing, 100871, China

Wenjie Song
Nanhu Laboratory, Jiaxing, 314000, China

Lijian Li
Macau University
Research interests: computer vision

Tianxiang Zhan
University of Electronic Science and Technology of China
Research interests: Time Series, Information Theory, Complex Systems, Machine Learning, Belief Functions

Wenpin Jiao
Key Laboratory of High Confidence Software Technologies (Peking University), Ministry of Education, Beijing, 100871, China; School of Computer Science, Peking University, Beijing, 100871, China