Quantum Graph Transformer for NLP Sentiment Classification

📅 2025-06-09
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
To address the challenges of modeling structured data, excessive parameter counts, and low sample efficiency in natural language processing, this paper proposes the Quantum Graph Transformer (QGT), an architecture that embeds parameterized quantum circuits into a graph message-passing framework to realize a quantum-enhanced self-attention mechanism. By integrating quantum computation with graph neural networks, QGT significantly reduces the number of trainable parameters while improving generalization and few-shot performance. Empirical evaluation on five sentiment classification benchmarks shows that QGT consistently matches or outperforms existing quantum NLP models. Compared to an equivalent classical graph transformer, QGT achieves average accuracy gains of 5.42% on real-world datasets and 4.76% on synthetic datasets. Notably, on the Yelp dataset, QGT reaches comparable performance with nearly 50% fewer labeled samples, demonstrating its sample efficiency and practical applicability.

πŸ“ Abstract
Quantum machine learning is a promising direction for building more efficient and expressive models, particularly in domains where understanding complex, structured data is critical. We present the Quantum Graph Transformer (QGT), a hybrid graph-based architecture that integrates a quantum self-attention mechanism into the message-passing framework for structured language modeling. The attention mechanism is implemented using parameterized quantum circuits (PQCs), which enable the model to capture rich contextual relationships while significantly reducing the number of trainable parameters compared to classical attention mechanisms. We evaluate QGT on five sentiment classification benchmarks. Experimental results show that QGT consistently achieves higher or comparable accuracy than existing quantum natural language processing (QNLP) models, including both attention-based and non-attention-based approaches. When compared with an equivalent classical graph transformer, QGT yields an average accuracy improvement of 5.42% on real-world datasets and 4.76% on synthetic datasets. Additionally, QGT demonstrates improved sample efficiency, requiring nearly 50% fewer labeled samples to reach comparable performance on the Yelp dataset. These results highlight the potential of graph-based QNLP techniques for advancing efficient and scalable language understanding.
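The abstract describes attention coefficients produced by parameterized quantum circuits (PQCs) inside a message-passing step. As a rough illustration only, not the authors' actual circuit, the sketch below simulates a toy 2-qubit PQC with a NumPy statevector: query and key features are encoded as RY rotation angles, a CNOT entangles the two qubits, a single trainable RY acts as the variational layer, and the ⟨Z⟩ expectation on the query qubit serves as the raw attention score, which is then softmaxed over a node's graph neighbors. All function names, the circuit layout, and the single-angle feature encoding are hypothetical simplifications.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, in the basis |00>, |01>, |10>, |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def pqc_attention_score(theta_q, theta_k, weight):
    """Toy 2-qubit PQC: encode query/key angles, entangle, apply one
    trainable RY, and return <Z> on the query qubit (range [-1, 1])."""
    state = np.zeros(4)
    state[0] = 1.0                                    # start in |00>
    state = np.kron(ry(theta_q), ry(theta_k)) @ state  # data encoding
    state = CNOT @ state                               # entangling layer
    state = np.kron(ry(weight), np.eye(2)) @ state     # trainable layer
    probs = state ** 2                                 # real amplitudes
    # <Z0> = P(qubit0 = 0) - P(qubit0 = 1)
    return (probs[0] + probs[1]) - (probs[2] + probs[3])

def quantum_attention(node_feats, i, neighbors, weight=0.3):
    """Softmax-normalized attention of node i over its graph neighbors,
    with each raw score computed by the toy PQC above."""
    scores = np.array([pqc_attention_score(node_feats[i], node_feats[j], weight)
                       for j in neighbors])
    e = np.exp(scores - scores.max())                  # stable softmax
    return e / e.sum()
```

Note the parameter economy this hedged sketch hints at: one trainable angle scores every edge, whereas a classical attention head would need full query/key projection matrices.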
Problem

Research questions and friction points this paper is trying to address.

Develops Quantum Graph Transformer for NLP sentiment classification
Reduces trainable parameters with quantum self-attention mechanism
Improves accuracy and sample efficiency in QNLP models
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid graph-based architecture with quantum self-attention
Parameterized quantum circuits reduce trainable parameters
Improved accuracy and sample efficiency in NLP
Shamminuj Aktar
CCS-3 Information Sciences, Los Alamos National Laboratory, Los Alamos, NM, USA
Andreas Bärtschi
CCS-3 Information Sciences, Los Alamos National Laboratory, Los Alamos, NM, USA
Abdel-Hameed A. Badawy
Associate Professor, Klipsch School of Electrical & Computer Engineering, New Mexico State Univ.
Performance Modeling and Prediction · High Performance Computing · Computer Architecture · Hardware Security · Networks on Chip
S. Eidenbenz
CCS-3 Information Sciences, Los Alamos National Laboratory, Los Alamos, NM, USA