A Hybrid Classical-Quantum Fine Tuned BERT for Text Classification

📅 2025-11-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high computational cost and hyperparameter sensitivity of BERT fine-tuning in text classification, this paper proposes a classical–quantum hybrid model: an *n*-qubit variational quantum circuit is embedded within the pre-trained BERT architecture, enabling end-to-end co-training of feature encoding and classification layers. This work constitutes the first instance of differentiable quantum circuits being jointly fine-tuned with BERT without modifying its backbone—quantum enhancement is achieved solely through lightweight, trainable quantum modules that augment semantic representation capacity. The model demonstrates cross-dataset adaptability and achieves accuracy comparable to or exceeding state-of-the-art classical models on standard benchmarks (e.g., AG News, SST-2), while maintaining controlled increases in parameter count and inference latency. Empirical results validate the feasibility, effectiveness, and generalization potential of quantum-enhanced approaches in NLP tasks.
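The summary describes an *n*-qubit variational quantum circuit acting as a trainable module. As a rough illustration of what such a module computes, the sketch below simulates a small variational circuit with NumPy: RY angle encoding of input features, a CNOT entangling ring, a trainable RY layer, and per-qubit Pauli-Z expectation values as outputs. This is an illustrative stand-in, not the paper's actual circuit; the gate layout and parameter shapes are assumptions.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def full_op(gate, wire, n):
    """Lift a single-qubit gate to the full n-qubit Hilbert space.
    Qubit 0 is the most significant bit of the basis-state index."""
    ops = [gate if i == wire else np.eye(2) for i in range(n)]
    out = ops[0]
    for op in ops[1:]:
        out = np.kron(out, op)
    return out

def cnot(control, target, n):
    """n-qubit CNOT as a permutation matrix over basis states."""
    dim = 2 ** n
    m = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control] == 1:
            bits[target] ^= 1
        j = sum(b << (n - 1 - k) for k, b in enumerate(bits))
        m[j, i] = 1.0
    return m

def quantum_head(features, weights):
    """Hypothetical variational circuit: RY angle encoding of the input
    features, a CNOT entangling ring, a trainable RY layer, then the
    <Z> expectation value of each qubit as the classical output."""
    n = len(features)
    state = np.zeros(2 ** n)
    state[0] = 1.0  # start in |0...0>
    for w in range(n):                       # feature encoding
        state = full_op(ry(features[w]), w, n) @ state
    for w in range(n):                       # entangling ring
        state = cnot(w, (w + 1) % n, n) @ state
    for w in range(n):                       # trainable rotations
        state = full_op(ry(weights[w]), w, n) @ state
    probs = np.abs(state) ** 2
    # <Z_w> = sum over basis states of (+1 if bit w is 0 else -1) * prob
    return np.array([
        float(probs @ np.array([1 - 2 * ((i >> (n - 1 - w)) & 1)
                                for i in range(2 ** n)]))
        for w in range(n)
    ])
```

In a hybrid model, `features` would come from a classical projection of BERT's representation and `weights` would be optimized jointly with the classical parameters via the usual quantum gradient rules (e.g. parameter shift).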

📝 Abstract
Fine-tuning BERT for text classification can be computationally challenging and requires careful hyper-parameter tuning. Recent studies have highlighted the potential of quantum algorithms to outperform conventional methods in machine learning and text classification tasks. In this work, we propose a hybrid approach that integrates an n-qubit quantum circuit with a classical BERT model for text classification. We evaluate the performance of the fine-tuned classical-quantum BERT and demonstrate its feasibility as well as its potential in advancing this research area. Our experimental results show that the proposed hybrid model achieves performance that is competitive with, and in some cases better than, the classical baselines on standard benchmark datasets. Furthermore, our approach demonstrates the adaptability of classical-quantum models for fine-tuning pre-trained models across diverse datasets. Overall, the hybrid model highlights the promise of quantum computing in achieving improved performance for text classification tasks.
Problem

Research questions and friction points this paper is trying to address.

Fine-tuning BERT for text classification is computationally expensive and sensitive to hyperparameter choices
Can variational quantum circuits improve BERT's performance on standard benchmark datasets?
Can pre-trained models be fine-tuned jointly with quantum modules without degrading classification accuracy?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid classical-quantum BERT architecture for text classification
Embeds an n-qubit variational quantum circuit in a classical BERT pipeline, trained end to end
Demonstrates fine-tuning of the hybrid model across diverse benchmark datasets
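To make the hybrid pipeline concrete, here is a minimal forward-pass sketch: a pooled BERT vector is projected down to n rotation angles, passed through a quantum head, and the resulting expectation values are read out into class probabilities. All dimensions, the stub circuit, and the weight initializations are hypothetical; the stub's `cos` map mimics the <Z> expectation of an unentangled RY encoding, standing in for the paper's actual n-qubit circuit.

```python
import numpy as np

rng = np.random.default_rng(0)
N_QUBITS, HIDDEN, N_CLASSES = 4, 768, 2  # hypothetical sizes

# Classical projection: pooled BERT [CLS] vector -> n rotation angles
W_in = rng.normal(scale=0.02, size=(N_QUBITS, HIDDEN))
# Classical readout: n <Z> expectations -> class logits
W_out = rng.normal(scale=0.5, size=(N_CLASSES, N_QUBITS))
theta = np.zeros(N_QUBITS)  # trainable circuit parameters

def quantum_head_stub(angles, weights):
    """Stand-in for the variational circuit: maps angles to per-qubit
    expectation values in [-1, 1] (exact for unentangled RY encoding)."""
    return np.cos(angles + weights)

def classify(cls_vector):
    """Hybrid forward pass: project, run the quantum head, softmax."""
    angles = np.tanh(W_in @ cls_vector) * np.pi  # bound encoding angles
    expz = quantum_head_stub(angles, theta)
    logits = W_out @ expz
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()
```

In the paper's setting, the gradient of the loss would flow through both the classical weights and the circuit parameters, which is what enables end-to-end co-training without modifying the BERT backbone.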