🤖 AI Summary
Quantum neural networks (QNNs) built from high-dimensional tree tensor network (TTN) image classifiers face two critical bottlenecks: the difficulty of implementing the high-order quantum gates required by large bond dimensions, and the exponentially low success probability of mid-circuit postselection. Method: the authors propose the forest tensor network (FTN) classifier, which improves robustness and trainability by ensembling multiple low-bond-dimension TTNs, and its quantum counterpart, the qFTN classifier, which extends the adiabatic encoding framework to this setting, circumventing the exponential decay of the postselection success rate and enabling efficient quantum embedding. A classical-quantum co-training framework supports end-to-end differentiable optimization. Results: on MNIST and CIFAR-10, qFTN classifiers match or slightly exceed the accuracy of their classical FTN counterparts while substantially improving quantum-circuit feasibility and scalability, demonstrating a practical quantum-enhanced pathway for multi-class image classification.
📝 Abstract
Tree tensor networks (TTNs) offer powerful models for image classification. While TTN image classifiers already show excellent performance on classical hardware, embedding them into quantum neural networks (QNNs) may further improve performance by leveraging quantum resources. However, embedding TTN classifiers into QNNs for multiclass classification remains challenging. The key obstacles are the high-order gate operations required for large bond dimensions and the mid-circuit postselection, with exponentially low success rates, needed for exact embedding. In this work, to address these challenges, we propose forest tensor network (FTN) classifiers, which aggregate multiple small-bond-dimension TTNs. This allows us to handle multiclass classification without requiring large gates in the embedded circuits. We then remove the overhead of mid-circuit postselection by extending the adiabatic encoding framework to our setting, smoothly encoding the FTN classifiers into quantum forest tensor network (qFTN) classifiers. Numerical experiments on MNIST and CIFAR-10 demonstrate that we can successfully train FTN classifiers and encode them into qFTN classifiers while maintaining, or even improving, the performance of the pre-trained FTN classifiers. These results suggest that the synergy between TTN classification models and QNNs can provide a robust and scalable framework for multiclass quantum-enhanced image classification.
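To make the forest idea concrete, the following is a minimal, hypothetical NumPy sketch of a forest of small TTN classifiers. The tree layout (a balanced binary tree with rank-3 tensors), the bond dimension, and the logit-averaging aggregation are all illustrative assumptions, not the paper's exact architecture or training scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def ttn_forward(features, layers, root):
    """Contract a balanced binary TTN over the leaf feature vectors.

    features: list of length-d vectors (one per leaf; count is a power of 2)
    layers:   list of layers; each layer holds rank-3 tensors of shape
              (d, d, d) that merge adjacent pairs of vectors into one vector
    root:     (d, n_classes) matrix mapping the root vector to class logits
    """
    vecs = list(features)
    for layer in layers:
        # Merge each adjacent pair of vectors through a rank-3 tensor.
        vecs = [np.einsum('i,j,ijk->k', vecs[2 * m], vecs[2 * m + 1], T)
                for m, T in enumerate(layer)]
    return vecs[0] @ root  # class logits

def forest_predict(features, forest):
    """Average class logits over the ensemble of small TTNs (the 'forest')."""
    logits = np.mean([ttn_forward(features, layers, root)
                      for layers, root in forest], axis=0)
    return int(np.argmax(logits))

# Toy setup: 4 leaves, bond dimension 2, 3 classes, 5 trees (all illustrative).
d, n_leaves, n_classes, n_trees = 2, 4, 3, 5
features = [rng.normal(size=d) for _ in range(n_leaves)]
forest = []
for _ in range(n_trees):
    layers = [[rng.normal(size=(d, d, d)) for _ in range(n_leaves // 2)],
              [rng.normal(size=(d, d, d))]]
    root = rng.normal(size=(d, n_classes))
    forest.append((layers, root))

label = forest_predict(features, forest)
```

Keeping each tree's bond dimension small is what lets every tensor correspond to a low-order quantum gate after embedding; the multiclass capacity then comes from the ensemble rather than from a single wide tree.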