🤖 AI Summary
In high-dimensional settings, the Kullback–Leibler (KL) divergence suffers from numerical instability due to mismatched supports, leading to overfitting. To address this, we propose Quantum-Inspired Fidelity Divergence (QIF), the first classical probability divergence explicitly designed by adapting quantum fidelity—a theoretically rigorous, hardware-friendly measure of distribution similarity. Building upon QIF, we further introduce QR-Drop, a differentiable, bounded, and continuous regularization technique. Theoretical analysis demonstrates that QIF provides more robust assessment of distributional discrepancies compared to conventional divergences. Empirical evaluation across multiple benchmark datasets shows that QR-Drop significantly mitigates overfitting and consistently outperforms state-of-the-art regularization methods in classification accuracy and generalization.
📝 Abstract
Kullback--Leibler (KL) divergence is a fundamental measure of the dissimilarity between two probability distributions, but it can become unstable in high-dimensional settings due to its sensitivity to mismatches in distributional support. To address robustness limitations, we propose a novel Quantum-Inspired Fidelity-based Divergence (QIF), leveraging quantum information principles yet efficiently computable on classical hardware. Compared to KL divergence, QIF demonstrates improved numerical stability under partial or near-disjoint support conditions, thereby reducing the need for extensive regularization in specific scenarios. Moreover, QIF admits well-defined theoretical bounds and continuous similarity measures. Building on this, we introduce a novel regularization method, QR-Drop, which utilizes QIF to improve generalization in machine learning models. Empirical results show that QR-Drop effectively mitigates overfitting and outperforms state-of-the-art methods.