Class-Balance Bias in Regularized Regression

📅 2025-01-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work identifies and formalizes a systematic coefficient bias in regularized regression (Lasso, Ridge, Elastic Net) arising from class imbalance in binary features—a bias amplified by the coupling of feature normalization and regularization strategies, and exacerbated in mixed-type data and interaction-term settings. To address it, we propose two novel correction paradigms: (i) variance- or standard-deviation-based scaling of binary features for Lasso and Ridge; and (ii) class-aware reweighting of penalty terms for Elastic Net. Through theoretical analysis, extensive simulations, and bias–variance trade-off quantification, we demonstrate the ubiquity and correctability of the bias. We further provide practical, feature-type-aware normalization guidelines for mixed-data settings and explicitly quantify the variance increase induced by correction. Our framework establishes foundational methodological support for fairness and stability in high-dimensional sparse modeling.

📝 Abstract
Regularized models are often sensitive to the scales of the features in the data, and it has therefore become standard practice to normalize (center and scale) the features before fitting the model. But there are many different ways to normalize the features, and the choice may have dramatic effects on the resulting model. In spite of this, there has so far been no research on this topic. In this paper, we begin to bridge this knowledge gap by studying normalization in the context of lasso, ridge, and elastic net regression. We focus on normal and binary features and show that the class balance of a binary feature directly influences the regression coefficients and that this effect depends on the combination of normalization and regularization methods used. We demonstrate that this effect can be mitigated by scaling binary features with their variance in the case of the lasso and their standard deviation in the case of ridge regression, but that this comes at the cost of increased variance. For the elastic net, we show that scaling the penalty weights, rather than the features, can achieve the same effect. Finally, we also tackle mixes of binary and normal features, as well as interactions, and provide some initial results on how to normalize features in these cases.
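To make the two scaling options concrete, here is a minimal sketch (not the paper's implementation) of normalizing a binary feature by its standard deviation versus its variance; the helper `normalize_binary` and the class balance `q = 0.1` are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_binary(x, mode="sd"):
    """Center a binary feature and scale it.

    mode="sd"  : classic standardization, divide by sqrt(q(1-q))
    mode="var" : variance scaling, divide by q(1-q) (suggested for the lasso)
    where q is the empirical class balance of the feature.
    """
    q = x.mean()          # class balance of the binary feature
    v = q * (1.0 - q)     # variance of a Bernoulli(q) feature
    scale = np.sqrt(v) if mode == "sd" else v
    return (x - q) / scale

# Imbalanced binary feature: roughly 10% ones, 90% zeros.
x = rng.binomial(1, 0.1, size=1000)
z_sd = normalize_binary(x, "sd")    # unit variance by construction
z_var = normalize_binary(x, "var")  # variance 1/(q(1-q)), larger under imbalance
```

The point of the comparison: under standardization both balanced and imbalanced features end up with unit variance, whereas variance scaling inflates an imbalanced feature's scale by a factor of 1/sqrt(q(1-q)), which counteracts the shrinkage bias at the cost of extra variance, as the abstract notes.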
Problem

Research questions and friction points this paper is trying to address.

Regularized Regression
Class Imbalance
Data Adjustment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Class Balance Bias Reduction
Regularized Regression Models
Data Scaling Techniques
Johan Larsson
Department of Mathematical Sciences, University of Copenhagen
Jonas Wallin
Lund University
statistics