AI Summary
Large-scale neuromorphic architecture exploration for brain-inspired computing is hindered by the low efficiency of SPICE-level analog circuit simulation, the poor generalizability of existing behavioral models, and insufficient accuracy in energy/performance modeling.
Method: This paper proposes a data-driven machine learning (ML) surrogate modeling approach, the first to integrate ML into rapid system-level modeling of mixed-signal neuromorphic systems. The method trains scalable surrogate models of analog crossbar arrays and spiking neurons using SPICE-generated data, enabling high-fidelity prediction of energy consumption, latency, and functional behavior within digital backend flows.
Contribution/Results: Evaluated on MNIST and spiking MNIST tasks, the surrogate models achieve over 1000x speedup over SPICE simulation while maintaining error below 7% in energy estimation, 8% in latency, and 2% in behavioral accuracy. This overcomes the traditional simulation bottleneck and enables efficient, co-optimized neuromorphic architecture design.
Abstract
Neuromorphic systems using in-memory or event-driven computing are motivated by the need for more energy-efficient processing of artificial intelligence workloads. Emerging neuromorphic architectures aim to combine traditional digital designs with the computational efficiency of analog computing and novel device technologies. A crucial problem in the rapid exploration and co-design of such architectures is the lack of tools for fast and accurate modeling and simulation. Typical mixed-signal design tools integrate a digital simulator with an analog solver like SPICE, which is prohibitively slow for large systems. By contrast, behavioral modeling of analog components is faster, but existing approaches are fixed to specific architectures with limited energy and performance modeling. In this paper, we propose LASANA, a novel approach that leverages machine learning to derive data-driven surrogate models of analog sub-blocks in a digital backend architecture. LASANA uses SPICE-level simulations of a circuit to train ML models that predict circuit energy, performance, and behavior at analog/digital interfaces. Such models can provide energy and performance annotation on top of existing behavioral models or function as replacements to analog simulation. We apply LASANA to an analog crossbar array and a spiking neuron circuit. Running MNIST and spiking MNIST, LASANA surrogates demonstrate up to three orders of magnitude speedup over SPICE, with energy, latency, and behavioral error less than 7%, 8%, and 2%, respectively.
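To make the surrogate-modeling idea concrete, here is a minimal, hedged sketch of the workflow the abstract describes: fit a fast data-driven model to SPICE-generated samples, then use it to predict per-event energy at the analog/digital interface. The data, feature set, and model family below are synthetic stand-ins invented for illustration; LASANA's actual features and ML models are not specified here.

```python
# Hypothetical sketch of a data-driven surrogate, in the spirit of
# training on SPICE traces to predict analog sub-block energy.
# All data below is synthetic; it is NOT the paper's dataset or model.
import numpy as np

rng = np.random.default_rng(0)

# Pretend SPICE gave us, per analog event, two interface features
# (e.g., input magnitude and bias) plus a measured energy value.
# Synthetic ground truth: energy ~ 2.0*x0 + 0.5*x1 + noise.
n = 1000
X = rng.uniform(0.1, 1.0, size=(n, 2))
energy = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0.0, 0.01, n)

# Fit a linear least-squares surrogate on a training split.
X_aug = np.hstack([X, np.ones((n, 1))])        # append intercept column
train, test = slice(0, 800), slice(800, None)
coef, *_ = np.linalg.lstsq(X_aug[train], energy[train], rcond=None)

# Evaluate relative energy-prediction error on held-out events; once
# trained, evaluating the surrogate is a single matrix product, which
# is what makes it orders of magnitude faster than re-running SPICE.
pred = X_aug[test] @ coef
rel_err = np.mean(np.abs(pred - energy[test]) / energy[test])
print(f"mean relative energy error: {rel_err:.3%}")
```

In practice the same recipe extends to latency and behavioral outputs by training additional regression or classification heads on the same SPICE-derived samples.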