Shot-Based Quantum Encoding: A Data-Loading Paradigm for Quantum Neural Networks

📅 2026-04-07
🤖 AI Summary
This work addresses the limitations of low data-encoding efficiency and constrained circuit depth in current quantum machine learning by proposing a novel measurement-shot-based data-embedding method. Specifically, the number of shots is treated as a learnable parameter, and initial quantum resources are allocated according to a classical, data-driven probability distribution, thereby constructing a mixed-state representation that eliminates the need for explicit encoding gates. This approach incorporates shots directly into the encoding process for the first time and, when combined with nonlinear activation functions, is structurally equivalent to a multilayer perceptron whose weights are implemented by quantum circuits. Experimental results demonstrate test accuracies of 89.1% ± 0.9% on Semeion and 80.95% ± 0.10% on Fashion MNIST, significantly outperforming conventional amplitude-encoding and linear MLP baselines.
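The shot-allocation step described in the summary can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the softmax mapping from features to probabilities, and the temperature parameter are all assumptions for the sake of the example. The key idea it captures is that a fixed shot budget is split across candidate initial states according to a data-driven distribution, so the classical input is encoded without any encoding gates.

```python
import numpy as np

def allocate_shots(features, total_shots=1024, temperature=1.0):
    """Map a classical feature vector to per-state shot counts.

    Hypothetical sketch: a softmax turns the features into a
    data-dependent distribution p_k over K initial states, and the
    shot budget is divided in proportion to p_k. The resulting mixed
    state is rho = sum_k p_k |psi_k><psi_k|.
    """
    logits = np.asarray(features, dtype=float) / temperature
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                      # data-driven distribution p_k
    shots = np.floor(probs * total_shots).astype(int)
    # Assign the rounding remainder so the budget is spent exactly.
    shots[np.argmax(probs)] += total_shots - shots.sum()
    return probs, shots

probs, shots = allocate_shots([0.2, 1.5, -0.3, 0.8], total_shots=1000)
```

A learnable temperature (or any parameterised map from features to `probs`) would make the shot counts a trainable degree of freedom, as the summary describes.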
📝 Abstract
Efficient data loading remains a bottleneck for near-term quantum machine learning. Existing schemes (angle, amplitude, and basis encoding) either underuse the exponential Hilbert-space capacity or require circuit depths that exceed the coherence budgets of noisy intermediate-scale quantum hardware. We introduce Shot-Based Quantum Encoding (SBQE), a data-embedding strategy that distributes the hardware's native resource, measurement shots, according to a data-dependent classical distribution over multiple initial quantum states. By treating the shot counts as a learnable degree of freedom, SBQE produces a mixed-state representation whose expectation values are linear in the classical probabilities and can therefore be composed with non-linear activation functions. We show that SBQE is structurally equivalent to a multilayer perceptron whose weights are realised by quantum circuits, and we describe a hardware-compatible implementation protocol. Benchmarks on Fashion MNIST and Semeion handwritten digits, with ten independent initialisations per model, show that SBQE achieves 89.1% ± 0.9% test accuracy on Semeion (reducing error by 5.3% relative to amplitude encoding and matching a width-matched classical network) and 80.95% ± 0.10% on Fashion MNIST (exceeding amplitude encoding by +2.0% and a linear multilayer perceptron by +1.3%), all without any data-encoding gates.
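The MLP equivalence claimed in the abstract rests on a linearity argument that can be sketched numerically. The names below are illustrative assumptions, not from the paper: if E[j, k] is the expectation value of the j-th observable on the k-th initial state, the mixed-state expectation is ⟨O_j⟩ = Σ_k p_k E[j, k], which is linear in the shot-distribution probabilities p_k. Stacking observables therefore gives a dense layer whose weights are realised by quantum circuits; a classical nonlinearity on top completes one MLP layer.

```python
import numpy as np

rng = np.random.default_rng(0)
K, J = 8, 4                            # K initial states, J observables

# Per-state expectation values act as the quantum "weights" (assumed
# values here; on hardware these come from circuit measurements).
E = rng.uniform(-1, 1, size=(J, K))

# Data-driven shot distribution over the K initial states.
p = rng.dirichlet(np.ones(K))

pre_activation = E @ p                 # linear in p, like a dense layer
hidden = np.tanh(pre_activation)       # classical nonlinearity completes the layer
```

Because `pre_activation` is exactly linear in `p`, gradients with respect to the shot distribution are straightforward, which is what makes the shot counts usable as a learnable encoding.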
Problem

Research questions and friction points this paper is trying to address.

quantum machine learning
data loading
quantum encoding
NISQ hardware
Hilbert space
Innovation

Methods, ideas, or system contributions that make the work stand out.

Shot-Based Quantum Encoding
quantum data loading
mixed-state representation
quantum neural networks
NISQ hardware
Basil Kyriacou
Terra Quantum AG, 9000 St. Gallen, Switzerland
Viktoria Patapovich
Terra Quantum AG, 9000 St. Gallen, Switzerland
Maniraman Periyasamy
Fraunhofer Institute for Integrated Circuits IIS
Quantum Computation · Quantum Machine Learning · Reinforcement Learning · Deep Learning
Alexey Melnikov
Terra Quantum AG, 9000 St. Gallen, Switzerland