🤖 AI Summary
This work addresses the efficient randomized estimation of the matrix $\ell_2 \to \ell_\infty$ and $\ell_1 \to \ell_2$ operator norms in the matrix-free setting, where only matrix-vector products are accessible. We propose stochastic algorithms built on Hutchinson's diagonal estimator and its Hutch++ variant, combined with random projection techniques, and we establish oracle complexity bounds for both estimators. Theoretically, this provides rigorous guarantees on estimation accuracy and query complexity; practically, it enables scalable deployment without explicit matrix storage. Experiments demonstrate that norm-based Jacobian regularization improves model robustness in image classification and mitigates gradient-based adversarial attacks in recommender systems, reducing AUC degradation by up to 42%. To our knowledge, this is the first systematic treatment of matrix-free estimation of these asymmetric operator norms, bridging theoretical soundness with real-world applicability.
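To make the matrix-free setup concrete, here is a minimal sketch (not the paper's algorithm; the Hutch++ variance-reduction step is omitted) of a plain Hutchinson-style estimator. It uses the identities $\|A\|_{2\to\infty} = \max_i \|A_{i,:}\|_2$ and $\|A\|_{1\to 2} = \max_j \|A_{:,j}\|_2$: for a standard Gaussian vector $g$, $\mathbb{E}[(Ag)_i^2] = \|A_{i,:}\|_2^2$, so averaging $(Ag)^2$ over $k$ draws estimates $\mathrm{diag}(AA^\top)$ using only products with $A$ (and, for the second norm, with $A^\top$). The function names and the `matvec`/`rmatvec` callables are illustrative, not from the paper.

```python
import numpy as np

def two_to_inf_norm_estimate(matvec, n, k=64, seed=None):
    """Estimate ||A||_{2->inf} (max row l2-norm of A) from k products A @ g.

    matvec: callable computing A @ v for v in R^n.
    (A g)_i = <A_{i,:}, g>, so the mean of (A g)_i^2 over k Gaussian
    draws is an unbiased estimate of ||A_{i,:}||^2 = (A A^T)_{ii}.
    """
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n, k))
    AG = np.column_stack([matvec(G[:, j]) for j in range(k)])
    row_sq = (AG ** 2).mean(axis=1)  # Hutchinson estimate of diag(A A^T)
    return float(np.sqrt(row_sq.max()))

def one_to_two_norm_estimate(rmatvec, m, k=64, seed=None):
    """Estimate ||A||_{1->2} (max column l2-norm of A) from products A^T @ v.

    Columns of A are rows of A^T, so this is the same estimator
    applied to A^T; rmatvec computes A^T @ v for v in R^m.
    """
    return two_to_inf_norm_estimate(rmatvec, m, k=k, seed=seed)
```

Only matrix-vector products enter the estimate, so the same code applies when `matvec` is, e.g., a Jacobian-vector product of a neural network rather than a stored matrix.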
📝 Abstract
In this paper, we propose new randomized algorithms for estimating the two-to-infinity ($\ell_2 \to \ell_\infty$) and one-to-two ($\ell_1 \to \ell_2$) norms in a matrix-free setting, using only matrix-vector multiplications. Our methods are based on appropriate modifications of Hutchinson's diagonal estimator and its Hutch++ variant, and we provide oracle complexity bounds for both modifications. We further illustrate the practical utility of our algorithms for Jacobian-based regularization in deep neural network training on image classification tasks, and demonstrate that our methodology can mitigate the effect of adversarial attacks in the domain of recommender systems.