🤖 AI Summary
Current quantum machine learning approaches predominantly adapt classical models, lacking genuine quantum-native design principles.
Method: This paper introduces Quantum Hyperdimensional Computing (QHDC), the first framework to establish direct quantum-native mappings for core hyperdimensional operations—binding, bundling, and similarity retrieval—leveraging Linear Combination of Unitaries (LCU), Oblivious Amplitude Amplification (OAA), quantum phase oracles, Quantum Fourier Transform (QFT), and Hadamard testing.
Contribution/Results: QHDC constitutes a theoretically grounded, scalable, brain-inspired quantum learning paradigm. We experimentally validate it on the 156-qubit IBM Heron r3 processor, successfully executing symbolic analogical reasoning and classification tasks. Both classical simulation and real-device execution confirm QHDC’s effectiveness, robustness, and hardware feasibility—establishing the first deployable quantum hyperdimensional architecture for neuromorphic quantum computing.
📝 Abstract
A significant challenge in quantum computing (QC) is developing learning models that truly align with quantum principles, as many current approaches are complex adaptations of classical frameworks. In this work, we introduce Quantum Hyperdimensional Computing (QHDC), a fundamentally new paradigm. We demonstrate that the core operations of its classical counterpart, Hyperdimensional Computing (HDC), a brain-inspired model, map with remarkable elegance and direct correspondence onto the native operations of a QC. This suggests HDC is exceptionally well-suited for a quantum-native implementation. We establish a direct, resource-efficient mapping: (i) hypervectors are mapped to quantum states, (ii) the bundling operation is implemented as a quantum-native averaging process using a Linear Combination of Unitaries (LCU) and Oblivious Amplitude Amplification (OAA), (iii) the binding operation is realized via quantum phase oracles, (iv) the permutation operation is implemented using the Quantum Fourier Transform (QFT), and (v) vector similarity is calculated using quantum state fidelity measurements based on the Hadamard Test. We present the first implementation of this framework, validated through symbolic analogical reasoning and supervised classification tasks. The viability of QHDC is rigorously assessed via a comparative analysis of results from classical computation, ideal quantum simulation, and execution on a 156-qubit IBM Heron r3 quantum processor. Our results validate the proposed mappings and demonstrate the versatility of the framework, establishing QHDC as a physically realizable technology. This work lays the foundation for a new class of quantum neuromorphic algorithms and opens a promising avenue for tackling complex cognitive and biomedical problems intractable for classical systems.
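To make the mapping concrete, the following is a minimal classical sketch of the HDC primitives the abstract enumerates (binding, bundling, permutation, similarity), together with a statevector simulation of the Hadamard Test used for similarity on the quantum side. This is an illustrative sketch assuming binary hypervectors; the function names and dimensionality are ours, not from the paper, and the quantum part simulates the ancilla circuit with plain linear algebra rather than running on hardware.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # hypervector dimensionality, kept small for the demo

# --- Classical HDC primitives that QHDC maps to quantum operations ---
def bind(a, b):
    """Element-wise XOR binding for binary hypervectors (self-inverse)."""
    return np.bitwise_xor(a, b)

def bundle(vectors):
    """Majority-vote bundling: the averaging role LCU + OAA play in QHDC."""
    return (2 * np.sum(vectors, axis=0) > len(vectors)).astype(np.uint8)

def permute(a, shift=1):
    """Cyclic shift; QHDC realizes permutation via the QFT."""
    return np.roll(a, shift)

def similarity(a, b):
    """Normalized Hamming similarity in [0, 1]."""
    return 1.0 - np.mean(a != b)

# Binding is its own inverse, which enables symbolic unbinding/retrieval.
a, b = rng.integers(0, 2, D, dtype=np.uint8), rng.integers(0, 2, D, dtype=np.uint8)
assert np.array_equal(bind(bind(a, b), b), a)

# --- Hadamard Test: estimate Re<psi|U|psi> from the ancilla statistics ---
def hadamard_test_p0(U, psi):
    """Simulate H - controlled-U - H on an ancilla qubit.

    Returns P(ancilla = 0) = (1 + Re<psi|U|psi>) / 2.
    """
    n = len(psi)
    state = np.kron(np.array([1.0, 0.0]), psi)      # |0>_anc (x) |psi>
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
    HI = np.kron(H, np.eye(n))                      # Hadamard on the ancilla
    cU = np.block([[np.eye(n), np.zeros((n, n))],   # apply U only when anc = 1
                   [np.zeros((n, n)), U]])
    state = HI @ cU @ HI @ state
    return float(np.sum(np.abs(state[:n]) ** 2))    # ancilla-0 amplitudes first

# Sanity check: <+|Z|+> = 0, so the ancilla is unbiased, P(0) = 0.5.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
Z = np.diag([1.0, -1.0])
print(hadamard_test_p0(Z, plus))
```

Shifting P(0) by 1/2 and doubling recovers Re⟨ψ|U|ψ⟩, which is how fidelity-based similarity is read out from repeated ancilla measurements.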