Quantum-Inspired Fidelity-based Divergence

📅 2025-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
In high-dimensional settings, the Kullback–Leibler (KL) divergence becomes numerically unstable when the supports of the two distributions mismatch, which in practice aggravates overfitting. To address this, we propose Quantum-Inspired Fidelity Divergence (QIF), the first classical probability divergence explicitly designed by adapting quantum fidelity, a theoretically rigorous and hardware-friendly measure of distribution similarity. Building on QIF, we further introduce QR-Drop, a differentiable, bounded, and continuous regularization technique. Theoretical analysis shows that QIF assesses distributional discrepancies more robustly than conventional divergences. Empirical evaluation across multiple benchmark datasets shows that QR-Drop significantly mitigates overfitting and consistently outperforms state-of-the-art regularization methods in classification accuracy and generalization.
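The paper's exact QIF formula is not reproduced here, but the stability contrast it describes can be illustrated with the classical (Bhattacharyya-style) fidelity that quantum fidelity reduces to for commuting states. The sketch below compares an epsilon-smoothed KL divergence with a negative-log-fidelity divergence on two near-disjoint distributions; the function names and the epsilon floor are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Epsilon-smoothed KL divergence; the result is dominated by the eps
    floor wherever q has (near-)zero mass but p does not."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def fidelity_divergence(p, q):
    """Negative log of the classical fidelity (squared Bhattacharyya
    coefficient). Coordinates where either distribution vanishes contribute
    zero to sqrt(p * q) instead of diverging, so no smoothing is needed."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    fidelity = np.sum(np.sqrt(p * q)) ** 2  # lies in [0, 1]
    return float(-np.log(max(fidelity, 1e-300)))

# Two distributions on 4 outcomes with almost disjoint supports
p = np.array([0.98, 0.02, 0.0, 0.0])
q = np.array([0.0, 0.02, 0.49, 0.49])

print(kl_divergence(p, q))        # large value driven by the eps floor
print(fidelity_divergence(p, q))  # finite, moderate value
```

The fidelity-based quantity stays finite and well-behaved under support mismatch without any smoothing constant, which is the robustness property the summary attributes to QIF.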

📝 Abstract
Kullback–Leibler (KL) divergence is a fundamental measure of the dissimilarity between two probability distributions, but it can become unstable in high-dimensional settings due to its sensitivity to mismatches in distributional support. To address robustness limitations, we propose a novel Quantum-Inspired Fidelity-based Divergence (QIF), leveraging quantum information principles yet efficiently computable on classical hardware. Compared to KL divergence, QIF demonstrates improved numerical stability under partial or near-disjoint support conditions, thereby reducing the need for extensive regularization in specific scenarios. Moreover, QIF admits well-defined theoretical bounds and continuous similarity measures. Building on this, we introduce a novel regularization method, QR-Drop, which utilizes QIF to improve generalization in machine learning models. Empirical results show that QR-Drop effectively mitigates overfitting and outperforms state-of-the-art methods.
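QR-Drop's precise formulation is not given in this summary. The name suggests a consistency regularizer in the spirit of dropout-consistency methods, where two stochastic forward passes are penalized for disagreeing, with the fidelity-based divergence in place of KL. The sketch below is a toy illustration of that idea under those assumptions; the model, the dropout placement, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward_with_dropout(x, W, drop_p=0.3):
    """Toy linear classifier with inverted dropout on the input features."""
    mask = rng.random(x.shape) >= drop_p
    return softmax((x * mask / (1.0 - drop_p)) @ W)

def fidelity_penalty(p, q):
    """Per-sample negative log classical fidelity between two predictive
    distributions; bounded below by 0 and free of support-mismatch blowups."""
    fid = np.sum(np.sqrt(p * q), axis=-1) ** 2
    return -np.log(np.clip(fid, 1e-300, 1.0))

# Hypothetical batch: 4 samples, 8 features, 3 classes
x = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 3))

p1 = forward_with_dropout(x, W)  # first stochastic pass
p2 = forward_with_dropout(x, W)  # second stochastic pass
consistency_loss = fidelity_penalty(p1, p2).mean()
print(consistency_loss)  # non-negative; added to the task loss during training
```

In a real training loop this penalty would be added to the cross-entropy loss and backpropagated; because the fidelity term is bounded and continuous, it avoids the exploding gradients that KL-based consistency terms can produce when the two passes disagree sharply.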
Problem

Research questions and friction points this paper is trying to address.

High-Dimensional Spaces
Kullback-Leibler Divergence
Machine Learning Overfitting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantum-Inspired Fidelity Divergence
QR-Drop Method
Overfitting Mitigation