Predictive Performance of Deep Quantum Data Re-uploading Models

📅 2025-05-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper identifies a fundamental limitation of quantum data re-uploading models: for high-dimensional input data, increasing the number of encoding layers severely degrades predictive performance, which eventually collapses to random guessing. Method: Through a generalization-error analysis and experiments on both synthetic and real-world datasets, the authors theoretically prove and empirically validate that, under fixed qubit resources, deeper encoding circuits do not improve generalization and instead induce this "performance collapse." Contribution/Results: They propose a quantum circuit design paradigm of "width over depth," replacing deep, narrow architectures with wide, shallow ones. Experiments across multiple linearly separable tasks show that this paradigm improves prediction accuracy by 37%–62% on average. The work establishes a theoretical foundation and a practical guideline for scalable quantum machine learning model design under hardware constraints.

📝 Abstract
Quantum machine learning models incorporating data re-uploading circuits have garnered significant attention due to their exceptional expressivity and trainability. However, their ability to generate accurate predictions on unseen data, referred to as the predictive performance, remains insufficiently investigated. This study reveals a fundamental limitation in predictive performance when deep encoding layers are employed within the data re-uploading model. Concretely, we theoretically demonstrate that when processing high-dimensional data with limited-qubit data re-uploading models, their predictive performance progressively degenerates to near random-guessing levels as the number of encoding layers increases. In this context, the repeated data uploading cannot mitigate the performance degradation. These findings are validated through experiments on both synthetic linearly separable datasets and real-world datasets. Our results demonstrate that when processing high-dimensional data, the quantum data re-uploading models should be designed with wider circuit architectures rather than deeper and narrower ones.
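The data re-uploading structure discussed in the abstract can be illustrated with a minimal sketch: a single-qubit model that re-encodes the same input in every layer through a trainable rotation. This is a hypothetical numpy illustration of the general circuit family (the layer form `RY(w·x + b)` and the parameter values are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def reupload_model(x, weights):
    """Single-qubit data re-uploading circuit (illustrative sketch):
    each encoding layer re-uploads the scalar input x via a trainable
    rotation, i.e. U = RY(w_L*x + b_L) ... RY(w_1*x + b_1) |0>."""
    state = np.array([1.0, 0.0])  # start in |0>
    for w, b in weights:          # one (w, b) pair per encoding layer
        state = ry(w * x + b) @ state
    # probability of measuring |0> serves as the class score
    return np.abs(state[0]) ** 2

# toy usage: three encoding layers with arbitrary illustrative parameters
weights = [(1.0, 0.1), (0.5, -0.2), (2.0, 0.0)]
score = reupload_model(0.7, weights)
```

Each additional `(w, b)` pair adds one encoding layer; the paper's point is that for high-dimensional inputs on few qubits, stacking more such layers does not help and eventually hurts, motivating wider (more-qubit) rather than deeper designs.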
Problem

Research questions and friction points this paper is trying to address.

Investigates predictive performance of deep quantum data re-uploading models
Reveals performance degradation in high-dimensional data with deep encoding layers
Recommends wider circuit designs over deeper ones for better performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analysis of deep quantum data re-uploading circuits
Processing of high-dimensional data with limited qubits
Wide, shallow circuit architectures instead of deep, narrow ones
Xin Wang
Department of Automation, Tsinghua University, Beijing, China
Han-Xiao Tao
Department of Automation, Tsinghua University, Beijing, China
Re-Bing Wu
Tsinghua University
quantum control · nonlinear control