Beyond Silicon: Materials, Mechanisms, and Methods for Physical Neural Computing

📅 2026-04-10
🤖 AI Summary
Silicon-based AI faces fundamental limitations due to energy consumption and data-movement bottlenecks, while physical neural computing lacks a unified framework for cross-platform comparison. This work systematically surveys diverse physical substrates—including memristive devices, photonic circuits, mechanical metamaterials, microfluidic systems, chemical reaction networks, and biological neural tissues—and maps neural computation primitives onto their underlying physical mechanisms. It further introduces, for the first time, a cross-domain, first-order benchmarking methodology that establishes a standardized architecture, training, and evaluation framework grounded in canonical tasks and physically interpretable metrics. The study demonstrates that no single substrate dominates across all dimensions; however, each physical neural system exhibits distinct advantages in specific applications such as ultrafast signal processing, in-memory inference, embodied control, and biochemical decision-making.

📝 Abstract
Physical implementations of neural computation now extend far beyond silicon hardware, encompassing substrates such as memristive devices, photonic circuits, mechanical metamaterials, microfluidic networks, chemical reaction systems, and living neural tissue. By exploiting intrinsic physical processes such as charge transport, wave interference, elastic deformation, mass transport, and biochemical regulation, these substrates can realize neural inference and adaptation directly in matter. As silicon GPU-centered AI faces growing energy and data-movement constraints, physical neural computation is becoming increasingly relevant as a complementary path beyond conventional digital accelerators. This trend is driven in particular by pervasive intelligence, i.e., the deployment of on-device and edge AI across large numbers of resource-constrained systems. In such settings, co-locating computation with sensing and memory can reduce data shuttling and improve efficiency. Meanwhile, physical neural approaches have emerged across disparate disciplines, yet progress remains fragmented, with limited shared terminology and few principled ways to compare platforms. This survey unifies the field by mapping neural primitives to substrate-specific mechanisms, analyzing architectural and training paradigms, and identifying key engineering constraints including scalability, precision, programmability, and I/O interfacing overhead. To enable cross-domain comparison, we introduce a first-order benchmarking scheme based on standardized static and dynamic tasks and physically interpretable performance dimensions. We show that no single substrate dominates across the considered dimensions; instead, physical neural systems occupy complementary operating regimes, enabling applications ranging from ultrafast signal processing and in-memory inference to embodied control and in-sample biochemical decision making.
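The abstract's claim that substrates can "realize neural inference directly in matter" is easiest to see in the memristive case: a weight matrix stored as device conductances performs a matrix-vector multiply through Ohm's and Kirchhoff's laws, with no data movement between memory and a processor. The sketch below is illustrative only and not taken from the paper; the conductance range, the differential-pair weight encoding, and the optional read-noise term are all assumptions chosen to make the idea concrete.

```python
import numpy as np

# Illustrative sketch (not the paper's method): in-memory inference on a
# memristive crossbar. Weights are stored as conductances G; applying input
# voltages v yields column currents i = G @ v, so the matrix-vector product
# is computed by the physics of the memory array itself.

rng = np.random.default_rng(0)

# Hypothetical device conductance range (siemens) -- an assumption.
g_min, g_max = 1e-6, 1e-4

def weights_to_conductances(W):
    """Map signed weights onto a differential pair of conductances (G+, G-)."""
    scale = (g_max - g_min) / np.abs(W).max()
    g_pos = g_min + scale * np.clip(W, 0, None)   # positive parts
    g_neg = g_min + scale * np.clip(-W, 0, None)  # negative parts
    return g_pos, g_neg, scale

def crossbar_mvm(g_pos, g_neg, v, noise_sigma=0.0):
    """Analog matrix-vector multiply via differential column currents."""
    if noise_sigma > 0:  # optional model of device read noise
        g_pos = g_pos + rng.normal(0.0, noise_sigma, g_pos.shape)
        g_neg = g_neg + rng.normal(0.0, noise_sigma, g_neg.shape)
    # G+ and G- share the g_min offset, so it cancels in the difference.
    return (g_pos - g_neg) @ v

W = rng.standard_normal((4, 8))
v = rng.standard_normal(8)

g_pos, g_neg, scale = weights_to_conductances(W)
i_out = crossbar_mvm(g_pos, g_neg, v)

# Up to the fixed weight-to-conductance scale, the analog currents
# reproduce the digital matrix-vector product W @ v.
assert np.allclose(i_out / scale, W @ v)
```

The differential-pair encoding is one common way to represent signed weights with strictly positive physical conductances; the precision, programmability, and I/O-interfacing constraints the abstract lists show up here as conductance quantization, write overhead, and the DAC/ADC stages this idealized sketch omits.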
Problem

Research questions and friction points this paper is trying to address.

physical neural computing
cross-platform comparison
fragmented research
unified benchmarking
non-silicon substrates
Innovation

Methods, ideas, or system contributions that make the work stand out.

physical neural computing
cross-domain benchmarking
neuromorphic substrates
intrinsic physical mechanisms
edge AI