🤖 AI Summary
This paper addresses parameter estimation and statistical inference for generalised linear models (GLMs) with linear constraints on the regression coefficients. We propose the constrained iteratively reweighted least squares (CIRLS) algorithm, which enforces feasibility at each iteration via an embedded quadratic program, and obtain inference for constrained coefficients through Monte Carlo simulation from a truncated multivariate normal distribution. To support covariance estimation and empirical confidence interval construction, we introduce an expected degrees of freedom measure that accounts for the stringency of the constraints. Compared to unconstrained estimation, constraining the coefficients introduces some bias but reduces the estimator's variance, improving accuracy when the constraints are chosen appropriately, particularly in high-dimensional settings with strong collinearity. Simulation and empirical studies characterise this bias-variance trade-off and assess the accuracy of the proposed variance estimates and confidence intervals. The result is a principled and computationally tractable framework for constrained GLM inference.
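For intuition, here is a minimal sketch of the kind of update CIRLS performs: a standard IRLS loop in which each weighted least-squares step is replaced by a quadratic program that keeps the coefficients feasible. The logistic link, the SLSQP solver, and all function and variable names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def cirls_logistic(X, y, C, d, tol=1e-8, max_iter=100):
    """Illustrative CIRLS loop for a logistic GLM subject to C @ beta >= d.

    Each IRLS step solves a quadratic program instead of an unconstrained
    weighted least-squares problem, so every iterate stays feasible.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))         # mean under the logit link
        w = np.maximum(mu * (1.0 - mu), 1e-10)  # IRLS weights (guarded)
        z = eta + (y - mu) / w                  # working response
        H = X.T @ (X * w[:, None])              # quadratic term X'WX
        g = X.T @ (w * z)                       # linear term X'Wz
        # QP step: minimise 0.5 v'Hv - g'v subject to C v >= d
        res = minimize(
            fun=lambda v: 0.5 * v @ H @ v - g @ v,
            x0=beta,
            jac=lambda v: H @ v - g,
            constraints=[{"type": "ineq",
                          "fun": lambda v: C @ v - d,
                          "jac": lambda v: C}],
            method="SLSQP",
        )
        if np.max(np.abs(res.x - beta)) < tol:
            return res.x
        beta = res.x
    return beta
```

With a different link function, only the weights and working response change; with equality constraints, only the QP constraint set does.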
📝 Abstract
We propose a simple and flexible framework for generalised linear models (GLMs) with linear constraints on the coefficients. Linear constraints are useful in a wide range of applications, allowing the fitting of models with high-dimensional or highly collinear predictors, as well as encoding assumptions about the association between some or all predictors and the response. We propose the constrained iteratively reweighted least squares (CIRLS) algorithm to fit the model, iterating quadratic programs to ensure the coefficient vector remains feasible under the constraints. Inference for constrained coefficients can be obtained by simulating from a truncated multivariate normal distribution and computing empirical confidence intervals or a variance-covariance matrix from the simulated coefficient vectors. We additionally discuss the complexity of a constrained GLM, proposing a measure of expected degrees of freedom that accounts for how the stringency of the constraints reduces the model's degrees of freedom. An extensive simulation study shows that constraining the coefficients introduces some bias into the estimation but also decreases the estimator's variance. This trade-off results in an improved estimator when the constraints are chosen appropriately. The simulations also show that our proposed inference procedure entails some error in variance estimation and coverage. The proposed framework is illustrated in two case studies, showing its usefulness as well as some of its weaknesses.
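To make the inference step concrete, the sketch below draws from the estimator's normal approximation truncated to the feasible region and computes empirical confidence intervals and a variance-covariance matrix. Plain rejection sampling is used here purely for simplicity; it is an assumption rather than the paper's sampler, and a dedicated truncated-multivariate-normal sampler would be preferable when the constraints are tight.

```python
import numpy as np

def constrained_inference(beta_hat, Sigma, C, d,
                          n_draws=10_000, level=0.95, seed=0):
    """Empirical CIs and vcov from N(beta_hat, Sigma) truncated to C v >= d.

    Uses naive rejection sampling for illustration; acceptance can be very
    low (and the loop slow) when the constraints are stringent.
    """
    rng = np.random.default_rng(seed)
    kept = []
    while sum(len(k) for k in kept) < n_draws:
        v = rng.multivariate_normal(beta_hat, Sigma, size=n_draws)
        feasible = np.all(v @ C.T >= d, axis=1)  # check all constraints
        kept.append(v[feasible])
    draws = np.concatenate(kept)[:n_draws]
    alpha = 1.0 - level
    ci_lower = np.quantile(draws, alpha / 2, axis=0)
    ci_upper = np.quantile(draws, 1.0 - alpha / 2, axis=0)
    vcov = np.cov(draws, rowvar=False)           # empirical covariance
    return ci_lower, ci_upper, vcov
```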