🤖 AI Summary
Existing studies of differentiating neurons are confined to small, hand-designed networks (<100 neurons), leaving the synchronization dynamics of large-scale networks unexplored.
Method: We systematically investigate the transient dynamics of large lattices of differentiating-neuron ring oscillators via numerical simulation and nonlinear dynamical systems analysis.
Contribution/Results: We report, for the first time, the spontaneous emergence of Kuramoto-like local synchronization, ordered structures with tunable correlation length, stable phase-inverted domain walls, and coexisting multistable periodic orbits under weak coupling. The correlation length grows over time and can be steered toward a desired steady-state scale by adjusting how neurons are shared between adjacent rings. This work uncovers novel collective behaviors in large-scale ensembles of differentiating neurons and establishes a theoretical foundation, and a new paradigm, for reservoir computing that leverages transiently ordered structures.
📝 Abstract
Recurrent neural networks (RNNs) are machine learning models widely used for learning temporal relationships. Current state-of-the-art RNNs use integrating or spiking neurons -- two classes of computing units whose outputs depend directly on their internal states -- and accordingly there is a wealth of literature characterizing the behavior of large networks built from these neurons. By contrast, past research on differentiating neurons, whose outputs are computed from the derivatives of their internal states, remains limited to small, hand-designed networks with fewer than one hundred neurons. Here we show via numerical simulation that large lattices of differentiating-neuron rings exhibit the local synchronization behavior found in the Kuramoto model of interacting oscillators. We begin by characterizing the periodic orbits of uncoupled rings, herein called ring oscillators. We then show that when these rings are coupled into lattices, local correlations emerge between oscillators and grow over time. As the correlation length grows, transient dynamics arise in which large regions of the lattice settle onto the same periodic orbit, separated by thin domain boundaries between adjacent, out-of-phase regions. The steady-state scale of these correlated regions depends on how neurons are shared between adjacent rings, which suggests that lattices of differentiating ring oscillators might be tuned for use as reservoir computers. Combined with their simple circuit design and potential for low power consumption, differentiating neural networks therefore represent a promising substrate for neuromorphic computing and low-power AI applications.
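The differentiating-neuron circuit studied in the paper is not specified here, but the Kuramoto-style local synchronization the abstract invokes can be illustrated directly. Below is a minimal sketch (my construction, not the authors' model) of a ring of Kuramoto phase oscillators with nearest-neighbor coupling, together with a local-coherence measure: neighbors lock into alignment when coupled, while uncoupled oscillators with heterogeneous natural frequencies stay uncorrelated.

```python
import numpy as np

def kuramoto_ring(n=64, K=2.0, dt=0.01, steps=20000, seed=0):
    """Euler-integrate a ring of Kuramoto phase oscillators with
    nearest-neighbor coupling:
        dtheta_i/dt = omega_i + K*(sin(theta_{i-1}-theta_i) + sin(theta_{i+1}-theta_i))
    (hypothetical parameters; not taken from the paper)."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)   # random initial phases
    omega = rng.normal(0.0, 0.1, n)            # heterogeneous natural frequencies
    for _ in range(steps):
        left, right = np.roll(theta, 1), np.roll(theta, -1)
        theta = theta + dt * (omega + K * (np.sin(left - theta)
                                           + np.sin(right - theta)))
    return theta

def local_coherence(theta):
    """Mean phase alignment between ring neighbors, in [-1, 1];
    values near 1 indicate local synchronization."""
    return float(np.mean(np.cos(theta - np.roll(theta, 1))))
```

Running `kuramoto_ring(K=2.0)` yields a high neighbor coherence, whereas `kuramoto_ring(K=0.0)` leaves the phases uncorrelated, mirroring the coupling-driven emergence of local order described in the abstract.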