Quantum feature encoding optimization

📅 2025-12-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
In quantum machine learning (QML), input encoding significantly influences model performance; however, existing work predominantly focuses on optimizing quantum circuit architectures while neglecting the role of classical input preprocessing. This paper introduces the “pre-encoding modulation” paradigm: rather than modifying the quantum encoder itself, it employs classical preprocessing—specifically feature ranking, selection, and weighting—to optimize the mapping from classical data to the quantum encoder. This approach enhances model expressivity and generalization without increasing quantum resource overhead. We validate its effectiveness across multiple benchmark datasets, diverse ansätze, and variable circuit sizes. Notably, it achieves substantial performance gains on a real 100-qubit quantum processor. The method provides a lightweight, hardware-friendly, and broadly applicable optimization pathway for QML, bridging classical data engineering and quantum encoding in a principled manner.
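The ranking, selection, and weighting steps described above are purely classical and can be sketched independently of any quantum SDK. A minimal illustration follows, assuming variance as a stand-in importance score and angle encoding (each kept feature rescaled into [0, π] to serve as a rotation angle in a fixed ansatz); the function name and choices here are hypothetical, not the paper's exact procedure:

```python
import numpy as np

def pre_encode(X, n_qubits, weights=None):
    """Classical 'pre-encoding modulation' sketch: rank, select, and weight
    features, then rescale them into rotation angles for a fixed
    angle-encoding ansatz (variance used as a stand-in importance score)."""
    order = np.argsort(X.var(axis=0))[::-1]   # rank: most informative first
    keep = order[:n_qubits]                   # select: one feature per qubit
    Xw = X[:, keep] * (np.ones(n_qubits) if weights is None else weights)
    lo, hi = Xw.min(axis=0), Xw.max(axis=0)   # rescale each column to [0, pi]
    angles = np.pi * (Xw - lo) / np.where(hi > lo, hi - lo, 1.0)
    return angles, keep

# Toy data: 4 features with very different spreads; a 2-qubit encoder
# should keep the two highest-variance columns (0 and 2).
rng = np.random.default_rng(0)
X = rng.normal(scale=[6.0, 1.0, 3.0, 0.1], size=(50, 4))
angles, kept = pre_encode(X, n_qubits=2)
```

The returned angles can then be fed to any fixed quantum encoder, which is the point of the paradigm: the circuit is untouched, so no extra quantum resources are consumed.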

📝 Abstract
Quantum Machine Learning (QML) holds the promise of enhancing machine learning in terms of both model complexity and accuracy. A key challenge in this domain is the encoding of input data, which plays a pivotal role in determining the performance of QML models. In this work, we tackle a largely unaddressed aspect of encoding that is unique to QML modeling: rather than adjusting the ansatz used for encoding, we consider adjusting how data is conveyed to the ansatz. Specifically, we implement QML pipelines that leverage classical data manipulation (i.e., ordering, selecting, and weighting features) as a preprocessing step, and evaluate whether these aspects of encoding have a significant impact on QML model performance and whether they can be effectively optimized to improve it. Our experimental results, spanning a wide variety of data sets, ansätze, and circuit sizes with a representative QML approach, demonstrate that optimizing how features are encoded in an ansatz substantially and consistently improves the performance of QML models, making a compelling case for integrating these techniques into future QML applications. Finally, we demonstrate the practical feasibility of this approach by running it on real quantum hardware with 100-qubit circuits, successfully achieving improved QML modeling performance in this setting as well.
Problem

Research questions and friction points this paper is trying to address.

Optimizing quantum feature encoding for QML models
Improving QML performance via classical data preprocessing
Enhancing encoding strategies on real quantum hardware
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimizing feature encoding via classical data preprocessing
Adjusting data conveyance to ansatz rather than ansatz itself
Demonstrating performance improvement on real quantum hardware
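Since the ansatz itself is held fixed, the object being optimized is the mapping of features to encoding positions. A toy random-search sketch over feature orderings follows; `score_fn` stands in for any model-quality estimate (e.g., validation accuracy of the full QML pipeline), and all names are hypothetical rather than the paper's actual optimizer:

```python
import numpy as np

def optimize_feature_order(X, y, score_fn, n_trials=500, seed=0):
    """Random search over feature-to-encoding-slot orderings.
    score_fn(X_perm, y) is a stand-in for any model-quality estimate,
    e.g. validation accuracy of the downstream QML model."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    best_order = np.arange(n)          # start from the identity ordering
    best_score = score_fn(X, y)
    for _ in range(n_trials):
        perm = rng.permutation(n)      # candidate feature ordering
        s = score_fn(X[:, perm], y)
        if s > best_score:
            best_order, best_score = perm, s
    return best_order, best_score

# Toy check: column j of X holds the constant j, and the score rewards
# matching a target ordering, so the search should recover it exactly.
X = np.tile(np.arange(4.0), (10, 1))
target = np.array([3.0, 1.0, 0.0, 2.0])
score = lambda Xp, y: -float(np.sum((Xp[0] - target) ** 2))
best_order, best_score = optimize_feature_order(X, None, score, n_trials=1000)
```

Random search is used here only for brevity; any black-box optimizer over permutations (and, analogously, over feature weights) fits the same interface.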
Authors

Tommaso Fioravanti (IBM Italy)
Brian Quanz (IBM): machine learning, data mining, artificial intelligence
Gabriele Agliardi (IBM Quantum)
Edgar Andres Ruiz Guzman (IBM Quantum)
Ginés Carrascal (IBM Quantum)
Jae-Eun Park (IBM Quantum)