🤖 AI Summary
Conventional backpropagation (BP) training of convolutional neural networks (CNNs) suffers from high computational overhead and is prone to converging to local optima. Method: This paper proposes a quantum-inspired hybrid optimization framework that integrates unconstrained binary quadratic programming (UBQP) modeling with stochastic gradient descent (SGD). It is the first work to formulate CNN parameter optimization as a UBQP problem, enabling global search capability while preserving the efficiency of gradient-based updates. Contribution/Results: Evaluated on MNIST, the method achieves a 10–15% improvement in classification accuracy over a standard BP-CNN with a negligible increase in training time, markedly improving convergence quality and generalization. Experiments further demonstrate its efficiency and scalability in high-performance computing (HPC) and large-scale data scenarios. The approach provides a structured, principled pathway for optimizing classical deep learning models, bridging combinatorial optimization and gradient-based learning.
📝 Abstract
Convolutional Neural Networks (CNNs) are pivotal in computer vision and Big Data analytics but demand significant computational resources when trained on large-scale datasets. Conventional training via back-propagation (BP) with losses such as Mean Squared Error or Cross-Entropy often requires extensive iterations and may converge sub-optimally. Quantum computing offers a promising alternative by leveraging superposition, tunneling, and entanglement to search complex optimization landscapes more efficiently. In this work, we propose a hybrid optimization method that combines an Unconstrained Binary Quadratic Programming (UBQP) formulation with Stochastic Gradient Descent (SGD) to accelerate CNN training. Evaluated on the MNIST dataset, our approach achieves a 10–15% accuracy improvement over a standard BP-CNN baseline while maintaining similar execution times. These results illustrate the potential of hybrid quantum-classical techniques in High-Performance Computing (HPC) environments for Big Data and Deep Learning. Fully realizing these benefits, however, requires careful alignment of algorithmic structures with the underlying quantum mechanisms.
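The abstract does not detail how the UBQP stage is solved, so as a rough illustration of the combinatorial half of such a hybrid, the sketch below minimizes a UBQP/QUBO objective xᵀQx over binary vectors with single-bit-flip simulated annealing (a classical stand-in for quantum-inspired search). The function names (`qubo_energy`, `anneal_qubo`) and the toy matrix `Q` are illustrative assumptions, not the paper's implementation; in the proposed hybrid, a search of this kind would propose binarized parameter decisions that SGD then refines.

```python
import math
import random

def qubo_energy(Q, x):
    """Objective x^T Q x for a binary vector x and a QUBO matrix Q."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def anneal_qubo(Q, steps=2000, t0=2.0, seed=0):
    """Minimize a UBQP objective with single-bit-flip simulated annealing.

    This is a classical sketch of the global-search stage only; the paper's
    actual encoding of CNN parameters into Q is not shown in the abstract.
    """
    rng = random.Random(seed)
    n = len(Q)
    x = [rng.randint(0, 1) for _ in range(n)]   # random binary start
    e = qubo_energy(Q, x)
    best_x, best_e = x[:], e
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9       # linear cooling schedule
        i = rng.randrange(n)
        x[i] ^= 1                               # propose one bit flip
        e_new = qubo_energy(Q, x)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best_x, best_e = x[:], e
        else:
            x[i] ^= 1                           # revert rejected flip
    return best_x, best_e

# Toy 2-variable QUBO with unique minimum x = (1, 0), energy -2:
Q = [[-2.0, 3.0],
     [0.0, -1.0]]
x_best, e_best = anneal_qubo(Q)
```

In the hybrid scheme described, the annealed binary solution would serve as a coarse global proposal, after which continuous SGD steps fine-tune the real-valued CNN weights.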