Safe Screening Rules for Group SLOPE

📅 2025-06-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional sparse learning with group-structured variables, the block non-separable penalty of Group SLOPE invalidates existing safe screening rules, leaving solvers with prohibitive computational and memory overhead. Method: This paper introduces the first safe group-level screening rule for Group SLOPE, integrating dual optimization with geometric boundary analysis to derive a theoretically guaranteed, plug-and-play screening criterion compatible with both batch and stochastic optimizers. Contribution/Results: The proposed rule provably identifies and discards entire groups of zero-coefficient features without sacrificing solution accuracy. Empirical evaluation demonstrates speedups of 3–8× and over 50% reduction in memory consumption, significantly enhancing scalability and practicality for large-scale group-sparse learning.
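
For context, a sketch of the Group SLOPE objective (following Brzyski et al.'s standard formulation; this paper's exact weighting conventions may differ, and group-size weights are omitted here):

```latex
\min_{\beta \in \mathbb{R}^p}\;
\tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2
\;+\; \sum_{i=1}^{G} \lambda_i\, \lVert \beta_{(i)} \rVert_2,
\qquad
\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_G \ge 0,
```

where the group norms are taken in decreasing order, so that the largest weight is paired with the largest group norm. Because the weight a group receives depends on its rank among all group norms, the penalty does not decompose across groups; this is the block non-separability that invalidates classical groupwise screening tests designed for separable penalties such as the Group Lasso.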

📝 Abstract
Variable selection is a challenging problem in high-dimensional sparse learning, especially when group structures exist. Group SLOPE performs well for the adaptive selection of groups of predictors. However, the block non-separable group effects in Group SLOPE make existing methods either invalid or inefficient. Consequently, Group SLOPE tends to incur significant computational costs and memory usage in practical high-dimensional scenarios. To overcome this issue, we introduce a safe screening rule tailored for the Group SLOPE model, which efficiently identifies inactive groups with zero coefficients by addressing the block non-separable group effects. By excluding these inactive groups during training, we achieve considerable gains in computational efficiency and memory usage. Importantly, the proposed screening rule can be seamlessly integrated into existing solvers for both batch and stochastic algorithms. Theoretically, we establish that our screening rule can be safely employed with existing optimization algorithms, ensuring the same results as the original approaches. Experimental results confirm that our method effectively detects inactive feature groups and significantly boosts computational efficiency without compromising accuracy.
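
To make the screening idea concrete, below is a minimal sketch of a generic ball-based group test, assuming a dual-feasible point `theta` and a radius `r` that bounds the distance from `theta` to the dual optimum (e.g. derived from a duality gap). All names are illustrative, and using the smallest penalty weight `lam_min` as a uniform threshold is a conservative simplification; the paper's actual rule additionally handles the sorted weights.

```python
import numpy as np

def ball_test_inactive_groups(X, theta, r, groups, lam_min):
    """Flag groups that a ball-based safe test certifies as inactive.

    Illustrative sketch: `theta` is assumed dual-feasible and `r` a
    valid bound on the distance to the dual optimum. Testing every
    group against the smallest SLOPE weight `lam_min` is a
    conservative simplification of the sorted-weight geometry.
    """
    inactive = []
    for g, idx in enumerate(groups):
        Xg = X[:, idx]
        corr = np.linalg.norm(Xg.T @ theta)   # group correlation at the dual point
        slack = r * np.linalg.norm(Xg, 2)     # worst case over the dual ball
        if corr + slack < lam_min:            # the group cannot become active
            inactive.append(g)
    return inactive
```

Inside a solver, `theta` would typically be built from the current residuals and `r` from the duality gap, so the test tightens and discards more groups as optimization proceeds.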
Problem

Research questions and friction points this paper is trying to address.

Addresses block non-separable group effects in Group SLOPE
Reduces computational costs and memory usage in high-dimensional scenarios
Ensures safe integration with existing solvers without accuracy loss
Innovation

Methods, ideas, or system contributions that make the work stand out.

Safe screening rule for Group SLOPE
Efficiently identifies inactive groups
Boosts computational efficiency significantly (see the solver sketch below)
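
As a sketch of the plug-and-play integration, the loop below runs proximal gradient descent and periodically applies a gap-safe group test to shrink the working set. To stay self-contained and verifiably correct it uses a uniform-weight Group Lasso penalty as a stand-in for Group SLOPE's sorted weights; the paper's rule would replace both the prox and the screening test.

```python
import numpy as np

def group_prox(beta, groups, t):
    """Blockwise soft-thresholding, the prox of t * sum_g ||beta_g||_2.
    A Group Lasso stand-in: the true Group SLOPE prox couples groups
    through sorting, which is exactly the non-separability issue."""
    for idx in groups:
        nrm = np.linalg.norm(beta[idx])
        beta[idx] *= max(0.0, 1.0 - t / nrm) if nrm > 0 else 0.0
    return beta

def fit_screened(X, y, groups, lam, n_iter=500, screen_every=25):
    """Proximal gradient with periodic gap-safe group screening."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1/L for the quadratic loss
    active = list(range(len(groups)))
    for it in range(n_iter):
        resid = y - X @ beta
        for g in active:                        # update active groups only
            idx = groups[g]
            beta[idx] += step * (X[:, idx].T @ resid)
        beta = group_prox(beta, [groups[g] for g in active], step * lam)
        if it % screen_every == 0:
            resid = y - X @ beta
            dual_norm = max(np.linalg.norm(X[:, idx].T @ resid)
                            for idx in groups)
            theta = resid / max(1.0, dual_norm / lam)   # dual-feasible point
            primal = 0.5 * resid @ resid + lam * sum(
                np.linalg.norm(beta[idx]) for idx in groups)
            dual = 0.5 * (y @ y - (y - theta) @ (y - theta))
            r = np.sqrt(2.0 * max(primal - dual, 0.0))  # gap-safe radius
            keep = []
            for g in active:
                idx = groups[g]
                if (np.linalg.norm(X[:, idx].T @ theta)
                        + r * np.linalg.norm(X[:, idx], 2) < lam):
                    beta[idx] = 0.0             # certified zero group
                else:
                    keep.append(g)
            active = keep
    return beta
```

Once a group is certified inactive its columns are never touched again, which is where the time and memory savings reported in the summary come from.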
Runxue Bao
AI Scientist, GE Healthcare
LLMs · NLP · Deep Learning · Optimization · Transfer Learning
Quanchao Lu
Georgia Institute of Technology, Atlanta, GA 30332, United States
Yanfu Zhang
William & Mary