🤖 AI Summary
This work proposes a novel regression framework based on the m-product of tensors to address the curse of dimensionality and overfitting arising from data scarcity in high-dimensional nonlinear regression. By integrating kernel methods with tensor algebra, the approach constructs an efficient regularization mechanism through structured variable separation, circumventing conventional fixed-point iterations while preserving favorable properties of matrix computations. Experimental results on standard benchmarks and dynamical system tasks demonstrate that the method robustly and efficiently handles regression with hundreds of parameters using only small sample sizes, exhibiting strong scalability and practical utility for real-world engineering applications.
📝 Abstract
We present a nonlinear regression framework based on tensor algebra, tailored to high-dimensional settings where data are scarce. We exploit algebraic properties of a partial tensor product, namely the m-product, to derive structured equations with separated variables. The proposed method combines kernel methods with tensor algebra to mitigate the curse of dimensionality and handle approximations with up to hundreds of parameters while avoiding fixed-point iteration strategies. This formalism allows us to provide regularization techniques suited to small amounts of data with a high number of parameters, while preserving well-known matrix-based properties. We demonstrate complexity scaling on a general benchmark and on dynamical systems, showing robustness for engineering problems and ease of implementation.
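To make the central algebraic object concrete, here is a minimal NumPy sketch of an m-product between third-order tensors, assuming the standard construction from the tensor-tensor product literature: transform both tensors along the third mode with an invertible matrix `M`, multiply the frontal slices facewise, and transform back. The function name `m_product` and the choice of `M` are illustrative, not the paper's implementation.

```python
import numpy as np

def m_product(A, B, M):
    """m-product of A (m x n x d) and B (n x p x d) under invertible M (d x d)."""
    # Transform along the third mode: A_hat[:, :, k] = sum_j M[k, j] * A[:, :, j]
    A_hat = np.einsum('kj,mnj->mnk', M, A)
    B_hat = np.einsum('kj,npj->npk', M, B)
    # Facewise matrix products in the transformed domain
    C_hat = np.einsum('mnk,npk->mpk', A_hat, B_hat)
    # Map back with the inverse transform
    return np.einsum('kj,mpj->mpk', np.linalg.inv(M), C_hat)

# With M = identity, the m-product reduces to slice-by-slice matrix products
A = np.arange(12.0).reshape(2, 3, 2)
B = np.ones((3, 4, 2))
C = m_product(A, B, np.eye(2))
```

Because the transform reduces the tensor product to independent matrix products per slice, matrix-based tools (pseudoinverses, regularized least squares) carry over slice by slice, which is the kind of structure the framework exploits.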