Quantum Variational Activation Functions Empower Kolmogorov-Arnold Networks

📅 2025-09-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the low parameter efficiency of quantum machine learning models and the trade-off between their expressivity and interpretability. The authors propose quantum variational activation functions (QVAFs) and build on them to introduce quantum-inspired Kolmogorov–Arnold networks (QKANs). Methodologically, QKANs use single-qubit DatA Re-Uploading ActivatioNs (DARUANs) as learnable activations, combining variational quantum circuits, trainable data pre-processing, and layer extension to realize both fully quantum (QKAN) and hybrid (HQKAN) architectures. Key contributions: (i) the first integration of Kolmogorov–Arnold network interpretability with quantum variational optimization; (ii) exponential growth of the accessible frequency spectrum alongside substantial parameter reduction; and (iii) empirical validation on function regression, image classification, and autoregressive language modeling, demonstrating strong expressivity, generalization, and compatibility with NISQ hardware. Together, these results establish a scalable, interpretable paradigm for quantum machine learning.

📝 Abstract
Variational quantum circuits (VQCs) are central to quantum machine learning, while recent progress in Kolmogorov-Arnold networks (KANs) highlights the power of learnable activation functions. We unify these directions by introducing quantum variational activation functions (QVAFs), realized through single-qubit data re-uploading circuits called DatA Re-Uploading ActivatioNs (DARUANs). We show that DARUAN, with trainable weights in data pre-processing, possesses a frequency spectrum that grows exponentially with data repetitions, enabling an exponential reduction in parameter count compared with Fourier-based activations without loss of expressivity. Embedding DARUAN into KANs yields quantum-inspired KANs (QKANs), which retain the interpretability of KANs while improving their parameter efficiency, expressivity, and generalization. We further introduce two novel techniques to enhance scalability, feasibility, and computational efficiency: layer extension, and hybrid QKANs (HQKANs) as drop-in replacements for multi-layer perceptrons (MLPs) in the feed-forward networks of large-scale models. We provide theoretical analysis and extensive experiments on function regression, image classification, and autoregressive generative language modeling, demonstrating the efficiency and scalability of QKANs. DARUANs and QKANs offer a promising direction for advancing quantum machine learning on both noisy intermediate-scale quantum (NISQ) hardware and classical quantum simulators.
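As a concrete illustration of the mechanism described in the abstract (a minimal sketch under our own assumptions, not the paper's implementation), the NumPy snippet below simulates a single-qubit data re-uploading activation: trainable rotations alternate with encodings of the input x, each scaled by a trainable pre-processing weight. Fixing the weights to powers of two, as done here for illustration, lets the accessible Fourier spectrum grow exponentially with the number of re-uploads.

```python
import numpy as np

def ry(t):
    # single-qubit rotation about the Y axis
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]])

def rz(t):
    # single-qubit rotation about the Z axis
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

def daruan(x, thetas, weights):
    """Sketch of a single-qubit data re-uploading activation:
    alternate trainable rotations with re-uploads of x, each
    scaled by a trainable pre-processing weight."""
    state = np.array([1.0 + 0j, 0.0])           # start in |0>
    for w, (a, b) in zip(weights, thetas):
        state = ry(a) @ state                   # trainable unitary
        state = rz(w * x) @ state               # encode weighted input
        state = ry(b) @ state                   # trainable unitary
    pauli_z = np.array([[1, 0], [0, -1]])
    return float(np.real(state.conj() @ pauli_z @ state))  # <Z> in [-1, 1]

L = 4                                           # number of re-uploads
rng = np.random.default_rng(0)
thetas = rng.uniform(0, 2 * np.pi, size=(L, 2))
weights = 2.0 ** np.arange(L)   # geometric weights: frequencies up to 2^L - 1
print(daruan(0.3, thetas, weights))
```

Sweeping x over a period and Fourier-transforming the output would reveal frequency content up to roughly 2^L - 1 while using only O(L) parameters, whereas a Fourier-series activation would need a coefficient per frequency; this is how the abstract motivates the exponential parameter reduction relative to Fourier-based activations.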
Problem

Research questions and friction points this paper is trying to address.

Develop quantum variational activation functions to enhance neural networks
Improve parameter efficiency and expressivity in Kolmogorov-Arnold networks
Enable scalable quantum machine learning on NISQ hardware and simulators
Innovation

Methods, ideas, or system contributions that make the work stand out.

DARUAN circuits enable exponential parameter reduction
QKANs combine quantum activation with KAN interpretability
Hybrid QKANs (HQKANs) replace MLPs in large-scale models (see the sketch after this list)
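To make the drop-in-replacement idea concrete, here is a hedged PyTorch sketch; the class name QKANLayer and all hyperparameters are our illustration, not the authors' code. Each edge of a KAN-style layer carries its own classically simulated single-qubit re-uploading activation, and node outputs are sums over incoming edges, so two stacked layers can stand in for a feed-forward MLP block.

```python
import torch
import torch.nn as nn

def _apply_ry(state, t):
    # state: (..., 2) complex amplitudes; t: broadcastable real angles
    c = torch.cos(t / 2).to(state.dtype)
    s = torch.sin(t / 2).to(state.dtype)
    a, b = state[..., 0], state[..., 1]
    return torch.stack([c * a - s * b, s * a + c * b], dim=-1)

def _apply_rz(state, t):
    # diagonal phase rotation; real angles promote to complex phases
    phase = torch.exp(-0.5j * t)
    return torch.stack([phase * state[..., 0],
                        phase.conj() * state[..., 1]], dim=-1)

class QKANLayer(nn.Module):
    """KAN-style layer: every edge (i -> j) carries its own learnable
    activation, here a classically simulated single-qubit data
    re-uploading circuit (a hypothetical DARUAN stand-in)."""
    def __init__(self, d_in, d_out, n_reups=3):
        super().__init__()
        self.d_out = d_out
        self.theta = nn.Parameter(0.1 * torch.randn(n_reups, d_out, d_in, 2))
        # geometric initialisation of encoding weights: wide spectrum
        w0 = (2.0 ** torch.arange(n_reups, dtype=torch.float32)).view(-1, 1, 1)
        self.w = nn.Parameter(w0.expand(n_reups, d_out, d_in).clone())

    def forward(self, x):                               # x: (batch, d_in)
        xe = x[:, None, :].expand(-1, self.d_out, -1)   # one copy per edge
        state = torch.zeros(*xe.shape, 2, dtype=torch.cfloat, device=x.device)
        state[..., 0] = 1.0                             # every edge qubit in |0>
        for l in range(self.theta.shape[0]):
            state = _apply_ry(state, self.theta[l, ..., 0])
            state = _apply_rz(state, self.w[l] * xe)    # re-upload weighted input
            state = _apply_ry(state, self.theta[l, ..., 1])
        z = state[..., 0].abs() ** 2 - state[..., 1].abs() ** 2  # <Z> per edge
        return z.sum(dim=-1)                            # node: sum over edges

# drop-in for a two-layer MLP block, e.g. a transformer feed-forward
ffn = nn.Sequential(QKANLayer(64, 128), QKANLayer(128, 64))
out = ffn(torch.randn(8, 64))                           # -> (8, 64)
```

Because each scalar activation is simulated with a 2-dimensional state vector, the layer runs on ordinary hardware; the hybrid (HQKAN) variant in the paper similarly keeps the surrounding network classical.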
Authors

Jiun-Cheng Jiang
Department of Physics and Center for Theoretical Physics, National Taiwan University, Taipei 106319, Taiwan

Morris Yu-Chao Huang
UNC Chapel Hill

Tianlong Chen
Assistant Professor, CS@UNC Chapel Hill; Chief AI Scientist, hireEZ

Hsi-Sheng Goan
Department of Physics and Center for Theoretical Physics, National Taiwan University, Taipei 106319, Taiwan

(* denotes equal contribution)