Estimating condition number with Graph Neural Networks

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
Accurate estimation of the condition number of sparse matrices is computationally expensive. This work proposes an efficient prediction method based on graph neural networks (GNNs) that enables rapid estimation of both the 1-norm and 2-norm condition numbers. A feature engineering pipeline with linear complexity, O(nnz + n) where nnz denotes the number of nonzeros and n the matrix dimension, keeps training and inference efficient, and the approach supports two distinct prediction strategies. It achieves high accuracy while significantly outperforming classical methods such as the Hager–Higham and Lanczos algorithms: extensive evaluation on multiple benchmark datasets demonstrates speedups of up to an order of magnitude, offering a scalable solution for condition number estimation in large-scale sparse systems.
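The paper's GNN models are not available here, but the Lanczos-style 2-norm baseline it compares against can be sketched with SciPy. `cond2_estimate` is a hypothetical helper name (an assumption, not the paper's API); it estimates κ₂(A) = σ_max(A) · σ_max(A⁻¹) using iterative SVD, applying A⁻¹ through a sparse LU factorization rather than forming it explicitly:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds, splu, LinearOperator

def cond2_estimate(A):
    """Classical 2-norm condition number estimate:
    kappa_2(A) = sigma_max(A) * sigma_max(A^{-1}),
    with both extreme singular values obtained by Lanczos-style
    iteration (scipy.sparse.linalg.svds)."""
    A = sp.csc_matrix(A)
    n = A.shape[0]
    lu = splu(A)  # sparse LU factorization; applies A^{-1} x without forming A^{-1}
    Ainv = LinearOperator((n, n), matvec=lu.solve,
                          rmatvec=lambda x: lu.solve(x, trans='T'),
                          dtype=A.dtype)
    smax = svds(A, k=1, return_singular_vectors=False)[0]        # sigma_max(A)
    inv_smin = svds(Ainv, k=1, return_singular_vectors=False)[0] # 1 / sigma_min(A)
    return smax * inv_smin
```

Each `svds` call needs only matrix-vector products, which is what makes this baseline practical for large sparse systems, and also what makes the cost the paper tries to avoid: every estimate requires a fresh factorization and iteration.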

📝 Abstract
In this paper, we propose a fast method for estimating the condition number of sparse matrices using graph neural networks (GNNs). To enable efficient training and inference of GNNs, our proposed feature engineering for GNNs achieves $\mathrm{O}(\mathrm{nnz} + n)$ complexity, where $\mathrm{nnz}$ is the number of non-zero elements in the matrix and $n$ denotes the matrix dimension. We propose two prediction schemes for estimating the matrix condition number using GNNs. Extensive experiments for the two schemes are conducted for 1-norm and 2-norm condition number estimation, which show that our method achieves a significant speedup over the Hager–Higham and Lanczos methods.
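For the 1-norm, the Hager–Higham baseline cited in the abstract is available directly in SciPy as `onenormest`, a block 1-norm estimator that needs only solves with $A$. The sketch below (with `cond1_estimate` a hypothetical helper name, not from the paper) combines the exact $\|A\|_1$ with an estimated $\|A^{-1}\|_1$:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import onenormest, splu, LinearOperator
from scipy.sparse.linalg import norm as spnorm

def cond1_estimate(A):
    """Classical 1-norm condition number estimate:
    kappa_1(A) = ||A||_1 * ||A^{-1}||_1.
    ||A||_1 is exact (max absolute column sum); ||A^{-1}||_1 is
    estimated with SciPy's onenormest, a Hager-Higham style block
    1-norm estimator driven by solves with A and A^T."""
    A = sp.csc_matrix(A)
    n = A.shape[0]
    lu = splu(A)  # one sparse LU factorization, reused for every solve
    Ainv = LinearOperator((n, n), matvec=lu.solve,
                          rmatvec=lambda x: lu.solve(x, trans='T'),
                          dtype=A.dtype)
    return spnorm(A, 1) * onenormest(Ainv)
```

The estimator returns a lower bound on $\|A^{-1}\|_1$ that is usually tight; the dominant cost is the LU factorization, which the GNN approach in the paper sidesteps entirely by predicting the condition number from matrix features.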
Problem

Research questions and friction points this paper is trying to address.

condition number
sparse matrices
graph neural networks
matrix estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Neural Networks
Condition Number Estimation
Sparse Matrices
Feature Engineering
Computational Efficiency