Generalized Reference Kernel With Negative Samples For Support Vector One-class Classification

📅 2025-06-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the limited discriminative capability of One-Class SVM (OC-SVM) under few-shot one-class classification—caused by severe scarcity of negative samples—this paper proposes the Generalized Reference Kernel (GRKneg). Without altering OC-SVM’s optimization objective or requiring negative sample labels, GRKneg embeds structural information from a small number of unlabeled negatives into the kernel space. It introduces a learnable reference vector generation mechanism and a negative-aware generalized kernel function, ensuring seamless compatibility with the standard RBF kernel and existing OC-SVM solvers. Crucially, this work presents the first unsupervised, negative-sample-guided kernel reconstruction method that avoids full model redesign. Experiments demonstrate that GRKneg significantly outperforms standard OC-SVM and binary SVM under extremely sparse negative supervision (e.g., only 1–5 negatives), achieving up to a 12.3% improvement in anomaly detection accuracy, while maintaining strong robustness across varying negative sample sizes.

Technology Category

Application Category

📝 Abstract
This paper focuses on small-scale one-class classification with some negative samples available. We propose Generalized Reference Kernel with Negative Samples (GRKneg) for One-class Support Vector Machine (OC-SVM). We study different ways to select/generate the reference vectors and recommend an approach for the problem at hand. It is worth noting that the proposed method does not use any labels in the model optimization but uses the original OC-SVM implementation. Only the kernel used in the process is improved using the negative data. We compare our method with the standard OC-SVM and with the binary Support Vector Machine (SVM) using different amounts of negative samples. Our approach consistently outperforms the standard OC-SVM using Radial Basis Function kernel. When there are plenty of negative samples, the binary SVM outperforms the one-class approaches as expected, but we show that for the lowest numbers of negative samples the proposed approach clearly outperforms the binary SVM.
Problem

Research questions and friction points this paper is trying to address.

Improving one-class SVM with limited negative samples
Enhancing kernel using negative data without labels
Outperforming standard methods with few negative samples
Innovation

Methods, ideas, or system contributions that make the work stand out.

GRKneg kernel improves OC-SVM with negative samples
No labels used in model optimization process
Outperforms standard OC-SVM with RBF kernel
🔎 Similar Papers
No similar papers found.