🤖 AI Summary
To address severe pseudo-label noise and unreliable graph structures, which distort supervision signals in multi-label learning, this paper proposes the first feature selection method to integrate binary hashing learning with dynamic graph constraints. Methodologically, it (1) replaces continuous pseudo-labels with compact binary codes to suppress label noise; (2) constructs a graph-constrained dynamic projection space, jointly improving pseudo-label quality via label-graph regularization; and (3) achieves sparse, interpretable feature selection through coupled minimization of inner products and the ℓ₂,₁-norm. The resulting optimization, which unifies binary hashing, dynamic graph modeling, and multiple coupled regularizers, is solved with the augmented Lagrangian multiplier method. Extensive experiments on 10 benchmark datasets across 6 evaluation metrics demonstrate consistent superiority over 10 state-of-the-art methods, with the proposed method leading the second-best approach by an average of at least 2.7 ranks per metric, validating both its robustness and effectiveness.
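The noise-suppression intuition behind binary pseudo-labels can be illustrated with a minimal NumPy sketch. Note this is only a generic sign-thresholding binarization for illustration; the paper learns its binary hashing codes jointly within the optimization, and the function name `binary_codes` is an assumption, not from the paper.

```python
import numpy as np

def binary_codes(Y_cont):
    """Map continuous pseudo-labels to {-1, +1} codes by sign
    thresholding (illustrative only; BHDG learns its codes jointly).
    Perturbations too small to flip a sign leave the code unchanged,
    which is why binary codes are more robust to label noise."""
    return np.where(Y_cont >= 0, 1, -1)

# The second row is a noisy copy of the first: the continuous
# pseudo-labels differ, but the binary codes coincide.
Y = np.array([[0.80, -0.10, 0.30],
              [0.75, -0.15, 0.35]])
B = binary_codes(Y)
print((B[0] == B[1]).all())  # prints True: the noise is absorbed
```

After binarization, downstream structures built from the pseudo-labels (such as a similarity graph) no longer react to small continuous fluctuations, which is the sense in which the graph becomes more reliable.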
📝 Abstract
Multi-label learning poses significant challenges in extracting reliable supervisory signals from the label space. Existing approaches often employ continuous pseudo-labels to replace binary labels, improving supervisory information representation. However, these methods can introduce noise from irrelevant labels and lead to unreliable graph structures. To overcome these limitations, this study introduces a novel multi-label feature selection method called Binary Hashing and Dynamic Graph Constraint (BHDG), the first method to integrate binary hashing into multi-label learning. BHDG utilizes low-dimensional binary hashing codes as pseudo-labels to reduce noise and improve representation robustness. A dynamically constrained sample projection space is constructed based on the graph structure of these binary pseudo-labels, enhancing the reliability of the dynamic graph. To further enhance pseudo-label quality, BHDG incorporates label graph constraints and inner product minimization within the sample space. Additionally, an $l_{2,1}$-norm regularization term is added to the objective function to facilitate the feature selection process. The augmented Lagrangian multiplier (ALM) method is employed to optimize binary variables effectively. Comprehensive experiments on 10 benchmark datasets demonstrate that BHDG outperforms ten state-of-the-art methods across six evaluation metrics. BHDG achieves the highest overall performance ranking, surpassing the next-best method by an average of at least 2.7 ranks per metric, underscoring its effectiveness and robustness in multi-label feature selection.
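The role of the $l_{2,1}$-norm term in the objective can be sketched in a few lines of NumPy: penalizing the sum of row norms of the projection matrix drives whole rows toward zero, and surviving rows identify the selected features. The helper names (`l21_norm`, `rank_features`) and the toy matrix are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def l21_norm(W):
    """l2,1-norm: sum of the l2 norms of the rows of W. Minimizing
    it induces row sparsity, i.e. entire features are zeroed out."""
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def rank_features(W, k):
    """Score each of the d features by the l2 norm of its row in the
    learned d x c projection W, and return the top-k feature indices."""
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy learned projection: rows 0 and 2 carry nearly all the weight,
# so features 0 and 2 are the ones a row-sparse W would keep.
W = np.array([[0.90, 0.40],
              [0.01, 0.02],
              [0.70, 0.60],
              [0.05, 0.00]])
top = rank_features(W, 2)  # selects feature indices 0 and 2
```

This post-hoc ranking step is standard for embedded $l_{2,1}$-regularized methods; in BHDG the matrix being ranked is the projection learned jointly with the binary codes and the dynamic graph.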