Jo-SNC: Combating Noisy Labels Through Fostering Self- and Neighbor-Consistency.

๐Ÿ“… 2025-12-22
๐Ÿ›๏ธ IEEE Transactions on Pattern Analysis and Machine Intelligence
๐Ÿ“ˆ Citations: 1
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the performance degradation in supervised deep learning caused by label noise, encompassing both in-distribution and out-of-distribution noise. To tackle this challenge, the authors propose Jo-SNC, a method that integrates sample selection with model regularization. Jo-SNC identifies clean samples by evaluating prediction consistency between each sample and its nearest neighbors, and employs partial label learning and negative learning strategies to handle different noise types. The approach innovatively introduces an adaptive threshold mechanism based on Jensenโ€“Shannon divergence and designs a triplet consistency regularization to enhance both prediction and feature stability. Extensive experiments on multiple benchmark datasets demonstrate that Jo-SNC significantly outperforms current state-of-the-art methods, confirming its effectiveness and robustness.
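The summary above describes scoring each sample's "cleanliness" via Jensen–Shannon divergence, reinforced by its nearest neighbors, with a data-driven per-class threshold. As a rough illustration only (not the authors' implementation; the function names and the simple mean-based threshold are assumptions), the idea can be sketched as:

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two probability vectors (base 2, in [0, 1])."""
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def clean_scores(probs, labels, neighbors, num_classes):
    """Score each sample as 1 - JS(prediction, one-hot label), averaged with
    the agreement between its neighbors' predictions and its own label."""
    n = len(labels)
    one_hot = np.eye(num_classes)[labels]
    self_score = np.array([1.0 - js_divergence(probs[i], one_hot[i]) for i in range(n)])
    nbr_score = np.array([
        np.mean([1.0 - js_divergence(probs[j], one_hot[i]) for j in neighbors[i]])
        for i in range(n)
    ])
    return 0.5 * (self_score + nbr_score)

def per_class_thresholds(scores, labels, num_classes, default=0.5):
    """Data-driven per-class selection threshold: mean score within each class
    (a simplification of the paper's adaptive thresholding scheme)."""
    return np.array([
        scores[labels == c].mean() if np.any(labels == c) else default
        for c in range(num_classes)
    ])
```

A sample would then be treated as clean when `clean_scores(...)[i] >= per_class_thresholds(...)[labels[i]]`; low-scoring samples are routed to the noisy-label branches instead.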

๐Ÿ“ Abstract
Label noise is pervasive in various real-world scenarios, posing challenges in supervised deep learning. Deep networks are vulnerable to such label-corrupted samples due to the memorization effect. One major stream of previous methods concentrates on identifying clean data for training. However, these methods often neglect imbalances in label noise across different mini-batches and devote insufficient attention to out-of-distribution noisy data. To this end, we propose a noise-robust method named Jo-SNC (Joint sample selection and model regularization based on Self- and Neighbor-Consistency). Specifically, we propose to employ the Jensen-Shannon divergence to measure the "likelihood" of a sample being clean or out-of-distribution. This process factors in the nearest neighbors of each sample to reinforce the reliability of clean sample identification. We design a self-adaptive, data-driven thresholding scheme to adjust per-class selection thresholds. While clean samples undergo conventional training, detected in-distribution and out-of-distribution noisy samples are trained following partial label learning and negative learning, respectively. Finally, we advance the model performance further by proposing a triplet consistency regularization that promotes self-prediction consistency, neighbor-prediction consistency, and feature consistency. Extensive experiments on various benchmark datasets and comprehensive ablation studies demonstrate the effectiveness and superiority of our approach over existing state-of-the-art methods. Our code and models have been made publicly available at https://github.com/NUST-Machine-Intelligence-Laboratory/Jo-SNC.
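The abstract's triplet consistency regularization combines self-prediction, neighbor-prediction, and feature consistency. The following is a minimal sketch of one plausible form of such a loss (function names, loss choices such as symmetric KL and cosine distance, and the weighting are assumptions, not the paper's exact formulation):

```python
import numpy as np

def _kl(p, q, eps=1e-12):
    """Row-wise KL divergence between batches of probability vectors."""
    p, q = np.clip(p, eps, 1.0), np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def triplet_consistency_loss(p_weak, p_strong, f_weak, f_strong, neighbor_probs,
                             w_self=1.0, w_nbr=1.0, w_feat=1.0):
    """Hypothetical triplet consistency regularizer:
    (1) self-consistency: predictions on two augmented views of a sample agree;
    (2) neighbor-consistency: a prediction agrees with its neighbors' mean prediction;
    (3) feature consistency: the two views' embeddings are aligned (cosine similarity)."""
    # (1) symmetric KL between the two views' predicted distributions
    self_loss = np.mean(0.5 * (_kl(p_weak, p_strong) + _kl(p_strong, p_weak)))
    # (2) KL between each prediction and the mean prediction of its neighbors
    nbr_mean = neighbor_probs.mean(axis=1)  # (N, k, C) -> (N, C)
    nbr_loss = np.mean(_kl(p_weak, nbr_mean))
    # (3) 1 - cosine similarity between the two views' feature embeddings
    cos = np.sum(f_weak * f_strong, axis=-1) / (
        np.linalg.norm(f_weak, axis=-1) * np.linalg.norm(f_strong, axis=-1) + 1e-12)
    feat_loss = np.mean(1.0 - cos)
    return w_self * self_loss + w_nbr * nbr_loss + w_feat * feat_loss
```

When the two views' predictions and features coincide and neighbors agree, all three terms vanish; any disagreement increases the penalty, which is the stabilizing effect the abstract describes.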
Problem

Research questions and friction points this paper is trying to address.

label noise
noisy labels
supervised deep learning
out-of-distribution
class imbalance
Innovation

Methods, ideas, or system contributions that make the work stand out.

label noise
self-consistency
neighbor-consistency
Jensen-Shannon divergence
triplet consistency regularization
๐Ÿ”Ž Similar Papers
No similar papers found.
Zeren Sun
Associate Professor, Nanjing University of Science and Technology
computer vision, deep learning, fine-grained visual recognition, learning from label noise
Yazhou Yao
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China; State Key Laboratory of Intelligent Manufacturing of Advanced Construction Machinery, Nanjing 210094, China
Tongliang Liu
Director, Sydney AI Centre, University of Sydney & Mohamed bin Zayed University of AI
Machine Learning, Learning with Noisy Labels, Trustworthy Machine Learning
Zechao Li
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
Fumin Shen
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China
Jinhui Tang
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China