Structural-Entropy-Based Sample Selection for Efficient and Effective Learning

📅 2024-10-03
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing sample selection methods rely predominantly on local training difficulty and neglect the global connectivity of the sample graph, which limits the representativeness of the selected samples. To address this, the paper proposes a selection framework that combines global and local information: (i) a k-nearest-neighbor (kNN) graph is constructed over the samples; (ii) structural entropy, a measure of global connectivity, is losslessly decomposed from the whole graph to individual nodes via the Shapley value; and (iii) the resulting global score is combined with training difficulty, and importance-biased blue-noise sampling selects a diverse, representative subset. Across three learning paradigms (supervised, active, and continual learning), the method yields average accuracy gains of 3.2%–5.7% under identical labeling budgets and accelerates convergence.
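The first two steps of the pipeline can be sketched concretely. Below is a minimal illustration of building a symmetric kNN graph and computing per-node terms of one-dimensional structural entropy, H1(G) = -Σ_i d_i/(2m)·log2(d_i/(2m)). Note this degree-based per-node split is an illustrative simplification: the paper attributes entropy to nodes via Shapley values, whose exact formula is not reproduced here.

```python
# Hedged sketch: kNN graph construction and a per-node view of
# one-dimensional structural entropy. The Shapley-value decomposition
# used in the paper is approximated here by the natural degree-based
# split of H1(G); this is an assumption for illustration only.
import numpy as np

def knn_graph(X, k=3):
    """Boolean adjacency of a symmetric kNN graph over row vectors X."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # no self-neighbors
    nbrs = np.argsort(d, axis=1)[:, :k]    # k nearest per node
    A = np.zeros((len(X), len(X)), dtype=bool)
    rows = np.repeat(np.arange(len(X)), k)
    A[rows, nbrs.ravel()] = True
    return A | A.T                         # symmetrize

def node_structural_entropy(A):
    """Per-node terms of H1(G) = -sum_i d_i/(2m) * log2(d_i/(2m)).
    Every node of a kNN graph has degree >= 1, so p > 0 throughout."""
    deg = A.sum(axis=1).astype(float)
    p = deg / deg.sum()
    return -p * np.log2(p)
```

Summing the returned vector recovers the graph-level one-dimensional structural entropy, which is what makes a lossless node-level decomposition meaningful.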

📝 Abstract
Sample selection improves the efficiency and effectiveness of machine learning models by providing informative and representative samples. Typically, samples can be modeled as a sample graph, where nodes are samples and edges represent their similarities. Most existing methods are based on local information, such as the training difficulty of samples, thereby overlooking global information, such as connectivity patterns. This oversight can result in suboptimal selection because global information is crucial for ensuring that the selected samples well represent the structural properties of the graph. To address this issue, we employ structural entropy to quantify global information and losslessly decompose it from the whole graph to individual nodes using the Shapley value. Based on the decomposition, we present $\textbf{S}$tructural-$\textbf{E}$ntropy-based sample $\textbf{S}$election ($\textbf{SES}$), a method that integrates both global and local information to select informative and representative samples. SES begins by constructing a $k$NN-graph among samples based on their similarities. It then measures sample importance by combining structural entropy (global metric) with training difficulty (local metric). Finally, SES applies importance-biased blue noise sampling to select a set of diverse and representative samples. Comprehensive experiments on three learning scenarios -- supervised learning, active learning, and continual learning -- clearly demonstrate the effectiveness of our method.
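The final step described above, importance-biased blue noise sampling, can be sketched as a greedy Poisson-disk-style procedure: visit candidates in order of decreasing importance and accept one only if no already-selected sample lies within a rejection radius. This dart-throwing variant is an illustration of the idea, not necessarily the exact procedure used in SES; the radius `r` and the greedy ordering are assumptions.

```python
# Hedged sketch of importance-biased blue-noise sampling:
# high-importance candidates are tried first, and spatial spacing
# (no two selections closer than r) enforces diversity.
import numpy as np

def importance_biased_blue_noise(X, importance, r, budget):
    """Greedily pick up to `budget` rows of X, most important first,
    rejecting any candidate within distance r of a prior pick."""
    order = np.argsort(-np.asarray(importance))  # descending importance
    selected = []
    for i in order:
        if len(selected) == budget:
            break
        if all(np.linalg.norm(X[i] - X[j]) >= r for j in selected):
            selected.append(int(i))
    return selected
```

For example, with two near-duplicate high-importance points, only one of the pair is kept, so the budget is spent on samples that are both important and spread out.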
Problem

Research questions and friction points this paper is trying to address.

Most existing sample selection methods use only local information, such as training difficulty
Global information, such as connectivity patterns in the sample graph, is overlooked
This oversight yields suboptimal selections that fail to represent the graph's structural properties
Innovation

Methods, ideas, or system contributions that make the work stand out.

Quantifies global information with structural entropy on a kNN sample graph
Losslessly decomposes graph-level structural entropy to individual nodes via the Shapley value
Combines the global metric with training difficulty and applies importance-biased blue noise sampling for diverse, representative selection
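The combination of global and local metrics named above can be sketched as a simple fusion of the two scores. The min-max normalization and multiplicative fusion below are illustrative assumptions; the paper may weight or combine the two metrics differently.

```python
# Hedged sketch: fuse the global metric (structural entropy) with the
# local metric (training difficulty) into one importance score.
# Normalization and the product rule are assumptions for illustration.
import numpy as np

def normalize(v):
    """Min-max normalize to [0, 1]; constant inputs map to zeros."""
    v = np.asarray(v, dtype=float)
    rng = v.max() - v.min()
    return (v - v.min()) / rng if rng > 0 else np.zeros_like(v)

def sample_importance(entropy, difficulty):
    """High importance requires both global connectivity relevance
    (structural entropy) and local training difficulty."""
    return normalize(entropy) * normalize(difficulty)
```

The product form means a sample scores highly only when both signals agree, which matches the paper's goal of selecting samples that are informative locally and representative globally.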
Authors
Tianchi Xie, Tsinghua University
Jiangning Zhu, Tsinghua University
Guozu Ma, China Telecom Wanwei Information Technology Co., Ltd
Minzhi Lin, Tsinghua University
Wei Chen, Microsoft Research
Weikai Yang, Assistant Professor, Hong Kong University of Science and Technology (Guangzhou)
Research interests: Visual Analytics, Interactive Machine Learning, Data-Centric AI, Text Visualization
Shixia Liu, Professor, Tsinghua University, IEEE Fellow
Research interests: Interactive Machine Learning, Data-Centric AI, Visual Analytics