Combined Image Data Augmentations diminish the benefits of Adaptive Label Smoothing

📅 2025-07-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Adaptive label smoothing (ALS) is widely adopted for regularization, yet its efficacy under diverse image augmentations—such as random erasing, noise injection, and TrivialAugment—remains poorly understood, particularly regarding robustness to common corruptions. Method: We extend ALS to operate across multiple heterogeneous augmentation operators and systematically evaluate its regularization behavior and robustness under both single-strong and composite-augmentation regimes. Results: ALS significantly improves generalization under strong single augmentations (e.g., aggressive random erasing), but its benefits vanish—or even become detrimental—under combined augmentations. Moreover, excessive smoothing degrades model robustness to noise, blur, and other common corruptions. This work identifies, for the first time, the failure mechanism of ALS in heterogeneous augmentation settings and establishes a critical balance condition between augmentation strength and diversity for effective ALS deployment. Our findings provide both theoretical insight and practical guidelines for applying ALS in realistic training scenarios.

📝 Abstract
Soft augmentation regularizes the supervised learning process of image classifiers by reducing label confidence of a training sample based on the magnitude of random-crop augmentation applied to it. This paper extends this adaptive label smoothing framework to other types of aggressive augmentations beyond random-crop. Specifically, we demonstrate the effectiveness of the method for random erasing and noise injection data augmentation. Adaptive label smoothing permits stronger regularization via higher-intensity Random Erasing. However, its benefits vanish when applied with a diverse range of image transformations as in the state-of-the-art TrivialAugment method, and excessive label smoothing harms robustness to common corruptions. Our findings suggest that adaptive label smoothing should only be applied when the training data distribution is dominated by a limited, homogeneous set of image transformation types.
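The mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: it assumes the smoothing amount grows linearly with a normalized augmentation magnitude (e.g., the erased-area fraction for random erasing, or a relative noise level for noise injection), and the function name, cap parameter, and linear schedule are all illustrative.

```python
import numpy as np

def adaptive_smooth_target(label: int, num_classes: int, aug_magnitude: float,
                           max_smoothing: float = 0.3) -> np.ndarray:
    """Build a soft target whose confidence drops with augmentation magnitude.

    aug_magnitude is assumed normalized to [0, 1], e.g. the fraction of the
    image erased by random erasing or the relative injected-noise level.
    The linear schedule and max_smoothing cap are illustrative choices.
    """
    # Smoothing amount scales with how aggressively the sample was augmented.
    eps = max_smoothing * min(max(aug_magnitude, 0.0), 1.0)
    # Spread eps uniformly over all classes, keep the rest on the true label.
    target = np.full(num_classes, eps / num_classes)
    target[label] += 1.0 - eps
    return target

# Mildly augmented sample keeps a near one-hot target;
# heavily augmented sample gets a softer target.
mild = adaptive_smooth_target(2, 5, aug_magnitude=0.1)
heavy = adaptive_smooth_target(2, 5, aug_magnitude=0.9)
```

Under this sketch, the paper's finding corresponds to the schedule breaking down when `aug_magnitude` must summarize many heterogeneous operators at once, as in TrivialAugment, rather than a single transformation family.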
Problem

Research questions and friction points this paper is trying to address.

Does adaptive label smoothing extend to aggressive augmentations beyond random crop?
Why do its benefits diminish under diverse transformations such as TrivialAugment?
Does excessive label smoothing harm robustness to common image corruptions?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends adaptive label smoothing to aggressive augmentations
Demonstrates effectiveness with random erasing and noise injection
Recommends limited homogeneous transformations for best results