🤖 AI Summary
Conventional complex-valued Hopfield neural networks (CvHNNs) suffer from limited state capacity, hindering their effectiveness in associative memory and pattern recognition. Method: This paper proposes two novel CvHNN models featuring a joint phase-and-magnitude quantization mechanism, the first of its kind, implemented via piecewise-constant ceiling-type activation functions designed respectively in rectangular and polar coordinate systems to enable efficient discretization of complex-valued states. Contribution/Results: The proposed quantization significantly increases the number of distinguishable stable states, overcoming the fundamental state-capacity bottleneck of traditional CvHNNs. Experiments demonstrate that the number of attainable stable states grows exponentially with quantization precision, while preserving strong convergence properties and fault tolerance. This work establishes a new paradigm for enhancing the storage capacity of complex-valued neural networks and broadens their applicability in high-dimensional information processing tasks.
📝 Abstract
This research paper introduces two novel complex-valued Hopfield neural networks (CvHNNs) that incorporate phase and magnitude quantization. The first CvHNN employs a ceiling-type activation function that operates on the rectangular coordinate representation of the complex net contribution. The second CvHNN similarly incorporates phase and magnitude quantization but utilizes a ceiling-type activation function based on the polar coordinate representation of the complex net contribution. The proposed CvHNNs, with their phase and magnitude quantization, significantly increase the number of states compared to existing models in the literature, thereby expanding the range of potential applications for CvHNNs.
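The two ceiling-type activation functions can be sketched roughly as follows: one snaps the real and imaginary parts of the net contribution upward to a grid (rectangular coordinates), while the other snaps the magnitude and phase upward to their own grids (polar coordinates). This is a minimal illustrative sketch only; the function names and the step sizes `delta`, `delta_r`, and `delta_theta` are assumptions, and the paper's exact definitions may differ.

```python
import cmath
import math

def quantize_rect(z: complex, delta: float = 0.5) -> complex:
    """Ceiling-type activation in rectangular coordinates:
    real and imaginary parts are each rounded up to a grid of step `delta`."""
    re_q = delta * math.ceil(z.real / delta)
    im_q = delta * math.ceil(z.imag / delta)
    return complex(re_q, im_q)

def quantize_polar(z: complex,
                   delta_r: float = 0.5,
                   delta_theta: float = math.pi / 4) -> complex:
    """Ceiling-type activation in polar coordinates:
    magnitude and phase are each rounded up to separate grids,
    then recombined into a quantized complex state."""
    r_q = delta_r * math.ceil(abs(z) / delta_r)
    theta_q = delta_theta * math.ceil(cmath.phase(z) / delta_theta)
    return r_q * cmath.exp(1j * theta_q)
```

Because both magnitude and phase are quantized, the reachable states form a finite two-dimensional grid of radii and angles, which is what lets the number of distinguishable stable states grow with the quantization precision.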