Feedback Alignment Meets Low-Rank Manifolds: A Structured Recipe for Local Learning

📅 2025-10-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Backpropagation (BP) incurs substantial memory and computational overhead, while Direct Feedback Alignment (DFA) suffers from performance degradation in deep CNNs due to unstructured feedback signals. Method: This paper proposes a structured local learning framework grounded in low-rank manifolds. It integrates DFA with singular value decomposition (SVD)-based low-rank weight representations, establishing a hierarchical training mechanism with consistent forward and feedback pathways. A subspace alignment loss and an orthogonality regularizer are jointly optimized with the cross-entropy objective. Contribution/Results: The method enables fully local, parallel parameter updates without pruning or post-hoc compression. On CIFAR-10/100 and ImageNet, it achieves accuracy comparable to BP while significantly reducing memory footprint and parameter count. Experimental results validate its efficiency, scalability, and structural consistency, demonstrating that structured, low-rank feedback can effectively replace dense, global error propagation.
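To make the low-rank representation concrete, the following is a minimal sketch of truncating a dense weight matrix to its rank-`r` SVD factors, which is the kind of decomposition the summary refers to; the matrix sizes and rank here are illustrative, not taken from the paper.

```python
import numpy as np

def low_rank_factor(W, rank):
    """Factor a dense weight matrix W (m x n) into truncated SVD
    components, so that W is approximated by U @ diag(S) @ Vt."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank], S[:rank], Vt[:rank, :]

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 256))          # illustrative layer size
U, S, Vt = low_rank_factor(W, rank=32)

dense_params = W.size                        # 512 * 256 = 131072
factored_params = U.size + S.size + Vt.size  # 16384 + 32 + 8192 = 24608
print(dense_params, factored_params)
```

Training the three factors directly, as the paper does, keeps the parameter count at the factored size throughout training rather than recovering it by post-hoc compression.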

📝 Abstract
Training deep neural networks (DNNs) with backpropagation (BP) achieves state-of-the-art accuracy but requires global error propagation and full parameterization, leading to substantial memory and computational overhead. Direct Feedback Alignment (DFA) enables local, parallelizable updates with lower memory requirements but is limited by unstructured feedback and poor scalability in deeper architectures, especially convolutional neural networks. To address these limitations, we propose a structured local learning framework that operates directly on low-rank manifolds defined by the Singular Value Decomposition (SVD) of weight matrices. Each layer is trained in its decomposed form, with updates applied to the SVD components using a composite loss that integrates cross-entropy, subspace alignment, and orthogonality regularization. Feedback matrices are constructed to match the SVD structure, ensuring consistent alignment between forward and feedback pathways. Our method reduces the number of trainable parameters relative to the original DFA model, without relying on pruning or post hoc compression. Experiments on CIFAR-10, CIFAR-100, and ImageNet show that our method achieves accuracy comparable to that of BP. Ablation studies confirm the importance of each loss term in the low-rank setting. These results establish local learning on low-rank manifolds as a principled and scalable alternative to full-rank gradient-based training.
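For readers unfamiliar with DFA itself, here is a minimal toy sketch of the local update rule the abstract builds on: the output error is projected to a hidden layer through a fixed feedback matrix `B1` rather than through the transposed forward weights, so each layer can update without a global backward pass. This is plain (unstructured) DFA on a two-layer network with squared error; the paper's contribution is to replace the random `B1` with a matrix matched to the SVD structure of the forward weights, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-layer network: x -> h = tanh(W1 x) -> y = W2 h
n_in, n_hid, n_out = 8, 16, 4
W1 = rng.standard_normal((n_hid, n_in)) * 0.1
W2 = rng.standard_normal((n_out, n_hid)) * 0.1
B1 = rng.standard_normal((n_hid, n_out)) * 0.1  # fixed random feedback

x = rng.standard_normal(n_in)
target = np.eye(n_out)[0]
lr = 0.1

for step in range(200):
    a1 = W1 @ x
    h = np.tanh(a1)
    y = W2 @ h
    e = y - target                 # output error (squared-loss gradient)
    # DFA: project the global error straight to the hidden layer
    # through the fixed matrix B1, instead of W2.T as BP would.
    dh = (B1 @ e) * (1 - h**2)     # tanh'(a1) = 1 - tanh(a1)^2
    W2 -= lr * np.outer(e, h)      # local update, output layer
    W1 -= lr * np.outer(dh, x)     # local update, hidden layer

loss = float(np.sum((y - target) ** 2))
print(loss)
```

Because both weight updates depend only on locally available signals plus the projected error, the layers can in principle be updated in parallel, which is the memory and parallelism advantage the abstract cites.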
Problem

Research questions and friction points this paper is trying to address.

Reducing memory and computational overhead in deep neural network training
Improving scalability of local learning in deep convolutional architectures
Enabling structured feedback alignment through low-rank weight manifolds
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses SVD for low-rank weight matrix decomposition
Applies structured feedback matching SVD components
Trains layers locally with composite loss functions
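The composite-loss idea in the bullets above can be sketched with two of its auxiliary terms. The orthogonality penalty below is the standard `||U^T U - I||_F^2` form; the subspace alignment term is a hypothetical placeholder (a Frobenius distance between the feedback matrix and a forward singular factor), since the paper's exact formulation is not given in this summary.

```python
import numpy as np

def orthogonality_penalty(U):
    """||U^T U - I||_F^2: penalizes drift of a trained SVD factor
    away from orthonormal columns."""
    k = U.shape[1]
    G = U.T @ U
    return float(np.sum((G - np.eye(k)) ** 2))

def subspace_alignment(B, V):
    """Hypothetical alignment term: Frobenius distance between a
    feedback matrix B and a forward singular factor V. The paper's
    actual alignment loss may be defined differently."""
    return float(np.sum((B - V) ** 2))

rng = np.random.default_rng(2)
# QR gives a matrix with exactly orthonormal columns.
U, _ = np.linalg.qr(rng.standard_normal((64, 8)))
ortho = orthogonality_penalty(U)           # ~0 for an orthonormal factor
ortho_bad = orthogonality_penalty(rng.standard_normal((64, 8)))
print(ortho, ortho_bad)
```

In training, terms like these would be added to the cross-entropy objective with scalar weights, so each layer's local update also keeps its SVD factors well-conditioned and its feedback pathway aligned with the forward one.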