🤖 AI Summary
In quantum machine learning (QML), variational quantum circuits (VQCs) can generate classical neural network parameters, enabling hardware-free inference and parameter compression; however, designing such circuits manually requires specialized quantum expertise, hindering practical deployment. To address this, the authors propose a differentiable quantum architecture search framework that introduces continuous relaxation and automatic differentiation into VQC structural optimization. The method jointly learns circuit topology and gate parameters end to end without human intervention, integrating quantum amplitude encoding with classical-quantum co-training. Evaluated on classification, time-series forecasting, and reinforcement learning tasks, it matches or surpasses manually designed quantum neural networks (QNNs) while achieving significantly higher parameter compression, entirely on classical hardware. The core contribution is the differentiable, automated design of VQC architectures for this parameter-generation setting, enabling cross-task generalization and eliminating reliance on quantum devices during both training and inference.
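The parameter-compression idea above can be illustrated with a minimal NumPy sketch. This is a hypothetical stand-in, not the paper's implementation: a random normalized state vector plays the role of the trained VQC's output state, and a fixed affine map replaces the learned mapping from basis-state probabilities to classical weights. The point is only the scaling: n qubits yield 2^n measurement probabilities, enough to cover M classical weights with roughly poly(n) = polylog(M) quantum parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_qubits = 4        # n qubits give 2**n basis-state amplitudes
n_weights = 10      # target classical network has M <= 2**n weights

# Hypothetical stand-in for a trained VQC: a random normalized state vector.
# In the actual framework this state is produced by a parameterized circuit.
state = rng.normal(size=2**n_qubits) + 1j * rng.normal(size=2**n_qubits)
state /= np.linalg.norm(state)

# Measurement probabilities of the 2**n computational basis states (sum to 1).
probs = np.abs(state) ** 2

# A hypothetical affine map from probabilities to the first M classical
# weights; the framework instead learns this mapping during co-training.
weights = 2.0 * probs[:n_weights] - 1.0 / 2**n_qubits

print(weights.shape)  # (10,)
```

The classical network's M weights are never stored as free parameters; they are regenerated from the (much smaller) set of circuit parameters, which is where the compression comes from.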
📝 Abstract
The rapid advancements in quantum computing (QC) and machine learning (ML) have led to the emergence of quantum machine learning (QML), which integrates the strengths of both fields. Among QML approaches, variational quantum circuits (VQCs), also known as quantum neural networks (QNNs), have shown promise both empirically and theoretically. However, their broader adoption is hindered by reliance on quantum hardware during inference. Hardware imperfections and limited access to quantum devices pose practical challenges. To address this, the Quantum-Train (QT) framework leverages the exponential scaling of quantum amplitudes to generate classical neural network parameters, enabling inference without quantum hardware and achieving significant parameter compression. Yet, designing effective quantum circuit architectures for such quantum-enhanced neural programmers remains non-trivial and often requires expertise in quantum information science. In this paper, we propose an automated solution using differentiable optimization. Our method jointly optimizes both conventional circuit parameters and architectural parameters in an end-to-end manner via automatic differentiation. We evaluate the proposed framework on classification, time-series prediction, and reinforcement learning tasks. Simulation results show that our method matches or outperforms manually designed QNN architectures. This work offers a scalable and automated pathway for designing QNNs that can generate classical neural network parameters across diverse applications.
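The joint optimization of architectural and circuit parameters described above follows the usual continuous-relaxation recipe: a discrete gate choice at each circuit slot is replaced by a softmax-weighted mixture of candidate operations, so both the architecture logits and the gate angles enter the forward pass smoothly. The sketch below shows this for one single-qubit slot with three candidate rotations; the logit and angle values are illustrative, and a real implementation would run this inside an autodiff framework rather than plain NumPy.

```python
import numpy as np

def rx(t):  # candidate single-qubit rotations as 2x2 unitaries
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def ry(t):
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c + 0j, -s], [s, c]])

def rz(t):
    return np.array([[np.exp(-1j * t / 2), 0], [0, np.exp(1j * t / 2)]])

ops = [rx, ry, rz]                  # candidate gates for one circuit slot
alpha = np.array([0.3, 1.2, -0.5])  # architecture logits (illustrative values)
theta = 0.7                         # rotation angle for the slot

# Softmax: the continuous relaxation of the discrete gate choice.
w = np.exp(alpha) / np.exp(alpha).sum()

state = np.array([1.0 + 0j, 0.0])   # |0>
# Mixed operation: weighted sum of each candidate's action on the state.
mixed = sum(wi * op(theta) @ state for wi, op in zip(w, ops))

# Both alpha (architecture) and theta (gate parameter) enter this forward
# pass differentiably, so an autodiff engine can optimize them jointly.
```

After training, the relaxation is typically discretized by keeping the highest-weight candidate at each slot, recovering an ordinary circuit.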