🤖 AI Summary
Most existing neural networks—including quantum neural networks (QNNs)—employ static architectures, limiting their capacity to model temporal dynamics and long-range dependencies. To address this, we propose two dynamic quantum neural network architectures: the Liquid Quantum Neural Network (LQNet) and the Continuous-Time Recurrent Quantum Neural Network (CTRQNet). These models uniquely integrate continuous-time dynamical systems, recurrent computation, and parameterized quantum circuits, enabling intrinsic time evolution via pulsed quantum state dynamics and differentiable quantum simulation. Evaluated on a binary classification task using CIFAR-10, our models achieve up to a 40% absolute accuracy improvement over state-of-the-art QNNs. Moreover, they exhibit enhanced interpretability and practical performance. This work establishes a novel paradigm for designing quantum machine learning models endowed with dynamic intelligence, bridging continuous-time modeling and quantum computation in a unified, differentiable framework.
📝 Abstract
Neural networks have continued to gain prevalence in the modern era for their ability to model complex data through pattern recognition and behavioral modeling. However, the static construction of traditional neural networks limits their dynamic intelligence, making them inflexible to temporal changes in data and unable to capture complex dependencies. With the advent of quantum technology, significant progress has been made in developing quantum algorithms. In recent years, researchers have developed quantum neural networks that leverage the capabilities of qubits to outperform classical networks. However, their current formulations also exhibit a static construction, limiting the system's dynamic intelligence. To address these weaknesses, we develop the Liquid Quantum Neural Network (LQNet) and the Continuous-Time Recurrent Quantum Neural Network (CTRQNet). Both models demonstrate a significant improvement in accuracy over existing quantum neural networks (QNNs), achieving accuracy gains as high as 40% on CIFAR-10 binary classification. We propose that LQNets and CTRQNets may shine a light on quantum machine learning's black box.
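To make the "continuous-time" idea concrete: a minimal classical sketch of the continuous-time recurrent dynamics that CTRQNet builds on is given below. This is not the paper's quantum model (the parameterized quantum circuits are omitted); it is a plain CTRNN cell integrated with an Euler step, and all names, sizes, and constants here are illustrative assumptions.

```python
import numpy as np

def ctrnn_step(h, x, W_rec, W_in, tau, dt=0.05):
    """One Euler step of the CTRNN ODE: tau * dh/dt = -h + tanh(W_rec h + W_in x).

    The hidden state evolves continuously in time rather than being
    recomputed by a fixed static layer at each step.
    """
    dh = (-h + np.tanh(W_rec @ h + W_in @ x)) / tau
    return h + dt * dh

# Illustrative dimensions and random weights (not from the paper).
rng = np.random.default_rng(0)
n_hidden, n_in = 8, 4
W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))
tau = np.full(n_hidden, 1.0)  # per-neuron time constants

h = np.zeros(n_hidden)
x = rng.normal(size=n_in)
for _ in range(100):  # evolve the hidden state through continuous time
    h = ctrnn_step(h, x, W_rec, W_in, tau)
```

In a quantum variant such as CTRQNet, the role of the `tanh` recurrence would be played by a parameterized quantum circuit whose state evolves under the same kind of differential dynamics; in a liquid network such as LQNet, the time constants `tau` would themselves depend on the input.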