GK-SMOTE: A Hyperparameter-free Noise-Resilient Gaussian KDE-Based Oversampling Approach

📅 2025-09-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the degradation of model performance in imbalanced classification caused by label noise and complex class distributions, this paper proposes a hyperparameter-free, noise-robust, density-aware oversampling method. The approach employs Gaussian kernel density estimation (KDE) to adaptively identify high-density "safe" regions and low-density "noisy" or ambiguous regions; synthetic samples are generated exclusively within safe regions. It further integrates a boundary-aware identification strategy into an enhanced SMOTE framework. Its core innovation is a density-driven regional discrimination mechanism that inherently avoids noise contamination, thereby significantly improving class separability and model robustness. Extensive experiments on multiple binary classification benchmark datasets demonstrate that the proposed method consistently outperforms state-of-the-art oversampling techniques on key metrics, including Matthews Correlation Coefficient (MCC), balanced accuracy, and Area Under the Precision-Recall Curve (AUPRC), particularly under realistic noisy conditions.
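The density-driven region discrimination described above can be illustrated with a minimal sketch. The bandwidth rule (Scott's rule) and the median-density threshold are assumptions chosen to keep the sketch free of user-set hyperparameters; they are not taken from the paper's implementation:

```python
import numpy as np

def gaussian_kde_scores(X, bandwidth=None):
    """Evaluate a Gaussian KDE at each sample point.
    Bandwidth defaults to Scott's rule, so no hyperparameter is
    required from the user (an assumption mirroring the paper's goal)."""
    n, d = X.shape
    if bandwidth is None:
        bandwidth = n ** (-1.0 / (d + 4))  # Scott's rule
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d) pairwise diffs
    sq = (diffs ** 2).sum(-1) / (2 * bandwidth ** 2)
    return np.exp(-sq).sum(1) / (n * (np.sqrt(2 * np.pi) * bandwidth) ** d)

def safe_mask(X_min):
    """Mark minority samples in high-density ('safe') regions.
    Threshold at the median density -- an assumed, illustrative choice."""
    dens = gaussian_kde_scores(X_min)
    return dens >= np.median(dens)

# Toy example: a dense minority cluster plus one isolated (likely noisy) point.
rng = np.random.default_rng(0)
X_min = np.vstack([rng.normal(0.0, 0.1, (20, 2)), [[5.0, 5.0]]])
safe = safe_mask(X_min)  # the outlier falls below the density threshold
```

Synthetic samples would then be generated only among the points flagged `True`, which is how the density estimate keeps noisy or ambiguous samples out of the oversampling step.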

📝 Abstract
Imbalanced classification is a significant challenge in machine learning, especially in critical applications like medical diagnosis, fraud detection, and cybersecurity. Traditional oversampling techniques, such as SMOTE, often fail to handle label noise and complex data distributions, leading to reduced classification accuracy. In this paper, we propose GK-SMOTE, a hyperparameter-free, noise-resilient extension of SMOTE, built on Gaussian Kernel Density Estimation (KDE). GK-SMOTE enhances class separability by generating synthetic samples in high-density minority regions, while effectively avoiding noisy or ambiguous areas. This self-adaptive approach uses Gaussian KDE to differentiate between safe and noisy regions, ensuring more accurate sample generation without requiring extensive parameter tuning. Our extensive experiments on diverse binary classification datasets demonstrate that GK-SMOTE outperforms existing state-of-the-art oversampling techniques across key evaluation metrics, including MCC, Balanced Accuracy, and AUPRC. The proposed method offers a robust, efficient solution for imbalanced classification tasks, especially in noisy data environments, making it an attractive choice for real-world applications.
Problem

Research questions and friction points this paper is trying to address.

Addresses imbalanced classification challenges in machine learning
Handles label noise and complex data distributions effectively
Improves classification accuracy without requiring hyperparameter tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hyperparameter-free Gaussian KDE oversampling
Noise-resilient synthetic sample generation
Self-adaptive density-based minority class enhancement
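Once safe minority samples are identified, the generation step follows standard SMOTE interpolation restricted to those samples. The sketch below is a generic illustration of that step; the neighbor count `k` and the uniform interpolation weight are ordinary SMOTE defaults, not values reported by the paper:

```python
import numpy as np

def smote_within_safe(X_safe, n_new, k=5, rng=None):
    """SMOTE-style interpolation restricted to 'safe' minority samples.
    Each synthetic point lies on the segment between a safe sample and
    one of its k nearest safe neighbors."""
    rng = np.random.default_rng(rng)
    n = len(X_safe)
    k = min(k, n - 1)
    # Pairwise squared distances for neighbor lookup (brute force).
    d2 = ((X_safe[:, None] - X_safe[None, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-matches
    nbrs = np.argsort(d2, axis=1)[:, :k]  # k nearest neighbors per sample
    out = []
    for _ in range(n_new):
        i = rng.integers(n)               # random safe seed sample
        j = nbrs[i, rng.integers(k)]      # one of its safe neighbors
        lam = rng.random()                # uniform interpolation weight
        out.append(X_safe[i] + lam * (X_safe[j] - X_safe[i]))
    return np.array(out)

# Usage: oversample a small safe cluster with 10 synthetic points.
rng = np.random.default_rng(1)
X_safe = rng.normal(0.0, 1.0, (8, 2))
X_new = smote_within_safe(X_safe, n_new=10, rng=1)
```

Because every synthetic point is a convex combination of two safe samples, the generated data stays inside the safe region's bounding box, which is what keeps the oversampling noise-resilient.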
Mahabubur Rahman Miraj
College of Computer Science, Chongqing University, Chongqing 400044, China
Hongyu Huang
Associate Professor of Computer Science, Chongqing University
Vehicular Ad Hoc Networks · Wireless Sensor Networks
Ting Yang
College of Computer Science, Chongqing University, Chongqing 400044, China
Jinxue Zhao
College of Computer Science, Chongqing University, Chongqing 400044, China
Nankun Mu
College of Computer Science, Chongqing University
Xinyu Lei
Michigan Technological University
Mobile Computing · Data Science · Privacy-Preserving Protocols