Training Neural Networks by Optimizing Neuron Positions

📅 2025-06-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost, large parameter counts, and limited interpretability of deep neural networks in edge and real-time scenarios, this paper proposes spatially embedded neural networks (SENs): neurons are embedded as learnable coordinates in Euclidean space, and synaptic weights are generated dynamically from the pairwise Euclidean distances between neurons, replacing conventional learnable weight matrices. This is the first approach to model neuron positions as differentiable, optimizable parameters while incorporating a biologically plausible wiring prior (connection strength decreases with distance), and it applies to both multilayer perceptrons (MLPs) and spiking neural networks (SNNs). Experiments on MNIST show that SENs match the accuracy of fully connected baselines and maintain performance even at pruning rates exceeding 80% sparsity; the spatial embedding additionally enables structural visualization and topological interpretability. Key contributions: (1) a parameter-efficient architecture driven by spatial coordinates; (2) a distance-coupled dynamic wiring mechanism; and (3) a unified embedding framework compatible with both artificial and spiking neural networks.

📝 Abstract
The high computational complexity and increasing parameter counts of deep neural networks pose significant challenges for deployment in resource-constrained environments, such as edge devices or real-time systems. To address this, we propose a parameter-efficient neural architecture where neurons are embedded in Euclidean space. During training, their positions are optimized and synaptic weights are determined as the inverse of the spatial distance between connected neurons. These distance-dependent wiring rules replace traditional learnable weight matrices and significantly reduce the number of parameters while introducing a biologically inspired inductive bias: connection strength decreases with spatial distance, reflecting the brain's embedding in three-dimensional space where connections tend to minimize wiring length. We validate this approach for both multi-layer perceptrons and spiking neural networks. Through a series of experiments, we demonstrate that these spatially embedded neural networks achieve performance competitive with conventional architectures on the MNIST dataset. Additionally, the models maintain performance even at pruning rates exceeding 80% sparsity, outperforming traditional networks with the same number of parameters under similar conditions. Finally, the spatial embedding framework offers an intuitive visualization of the network structure.
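As a concrete illustration of the wiring rule, here is a minimal PyTorch sketch of a layer whose only learnable parameters are neuron coordinates. The class name, the 3-D embedding dimension, and the epsilon stabilizer are assumptions for illustration, not necessarily the authors' exact parameterization.

```python
import torch
import torch.nn as nn

class SpatiallyEmbeddedLinear(nn.Module):
    """Hypothetical sketch: each weight is the inverse Euclidean distance
    between a learnable input-neuron and output-neuron position."""

    def __init__(self, in_features, out_features, dim=3, eps=1e-6):
        super().__init__()
        # Neuron positions are the layer's only learnable parameters.
        self.in_pos = nn.Parameter(torch.randn(in_features, dim))
        self.out_pos = nn.Parameter(torch.randn(out_features, dim))
        self.eps = eps  # assumed stabilizer; avoids division by zero

    def forward(self, x):
        # Pairwise Euclidean distances, shape (out_features, in_features).
        dist = torch.cdist(self.out_pos, self.in_pos)
        weight = 1.0 / (dist + self.eps)  # strength decays with distance
        return x @ weight.T

# Usage: a two-layer MNIST classifier built from spatially embedded layers.
model = nn.Sequential(
    nn.Flatten(),
    SpatiallyEmbeddedLinear(784, 256),
    nn.ReLU(),
    SpatiallyEmbeddedLinear(256, 10),
)
```

Because gradients flow through torch.cdist, ordinary backpropagation moves the neuron positions themselves, which is what makes the learned layout directly visualizable.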
Problem

Research questions and friction points this paper is trying to address.

Reducing computational complexity in resource-constrained environments
Optimizing neuron positions to minimize parameter counts
Maintaining performance with high sparsity in neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimize neuron positions in Euclidean space
Use inverse spatial distance for synaptic weights
Achieve competitive performance with fewer parameters (see the parameter-count sketch after this list)
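To make the parameter saving concrete, here is a back-of-the-envelope comparison for a single MNIST-sized layer; the 3-D embedding and the absence of biases are assumptions for illustration, and the exact savings depend on how positions are shared across layers.

```python
# Illustrative parameter count for one 784 -> 256 layer.
in_f, out_f, dim = 784, 256, 3       # dim=3 assumed, mirroring the brain's 3-D embedding

dense_params = in_f * out_f          # conventional weight matrix: 200,704
sen_params = (in_f + out_f) * dim    # learnable coordinates only: 3,120

print(dense_params, sen_params)      # 200704 3120
```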
Laura Erb
FZI Research Center for Information Technology, Karlsruhe, Germany; Karlsruhe Institute of Technology, Karlsruhe, Germany
Tommaso Boccato
Department of Biomedicine and Prevention, University of Rome Tor Vergata, Rome, Italy
Alexandru Vasilache
FZI Research Center for Information Technology, Karlsruhe, Germany; Karlsruhe Institute of Technology, Karlsruhe, Germany
Juergen Becker
Karlsruhe Institute of Technology, Karlsruhe, Germany
Nicola Toschi
Department of Biomedicine and Prevention, University of Rome Tor Vergata, Rome, Italy
Medical Physics · Neuroimaging/Neuroscience · Physiological Systems Modeling · Signal Processing · Machine Learning