🤖 AI Summary
This work extends generative compressed sensing theory from finite-dimensional vector spaces to infinite-dimensional Hilbert spaces, where physical signals are naturally modeled as functions rather than vectors. By introducing a generalized restricted isometry property (RIP) and an infinite-dimensional local coherence analysis, the authors establish a resolution-independent theoretical framework and derive an optimal sampling distribution. The theory shows that the required number of measurements scales with the intrinsic dimensionality of the generative prior (up to logarithmic factors), independent of the discretization resolution. Numerical experiments on the Darcy flow equation confirm that the approach achieves stable, highly accurate reconstructions even under severe undersampling, and reveal an implicit regularization effect induced by low-resolution generators.
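For orientation, a schematic version of such a sample-complexity bound (illustrative notation only, not the paper's exact theorem statement) would read

$$
m \;\gtrsim\; C \cdot k \cdot \mathrm{polylog}(k),
$$

where $m$ is the number of measurements, $k$ the latent (intrinsic) dimension of the generative prior, and $C$ an absolute constant; notably, the ambient discretization resolution does not appear on the right-hand side.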
📝 Abstract
Deep generative models have become a standard tool for modeling priors in inverse problems, going beyond classical sparsity-based methods. However, existing theoretical guarantees are mostly confined to finite-dimensional vector spaces, creating a gap when physical signals are modeled as functions in Hilbert spaces. This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. We extend the notion of local coherence to the infinite-dimensional setting to derive optimal, resolution-independent sampling distributions. Through a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension (up to logarithmic factors), independent of the ambient dimension. Finally, numerical experiments on the Darcy flow equation validate our theoretical findings and demonstrate that, in severely undersampled regimes, employing lower-resolution generators acts as an implicit regularizer, improving reconstruction stability.
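To make the pipeline concrete, here is a minimal sketch of coherence-weighted subsampling followed by latent-space (CSGM-style) reconstruction, assuming a pretrained generator `G` that maps a `k`-dimensional latent vector to an `n`-point discretization of the signal. All names (`G`, `coherence_weights`, the density-compensated loss) are illustrative assumptions for exposition, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): subsample measurements
# according to a local-coherence-weighted distribution, then recover
# the signal by optimizing over the generator's latent space.
import torch

def sample_indices(coherence_weights: torch.Tensor, m: int):
    """Draw m measurement indices i.i.d. from a distribution
    proportional to the (assumed given) local coherence weights."""
    p = coherence_weights / coherence_weights.sum()
    idx = torch.multinomial(p, m, replacement=True)
    return idx, p[idx]  # keep sampling probabilities for density compensation

def recover(G, y: torch.Tensor, idx: torch.Tensor, probs: torch.Tensor,
            k: int, steps: int = 2000, lr: float = 1e-2):
    """Minimize the density-compensated misfit ||(G(z)[idx] - y) / sqrt(p)||^2
    over the latent z (CSGM-style reconstruction)."""
    z = torch.zeros(k, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    w = probs.rsqrt()  # importance weights 1 / sqrt(p_i)
    for _ in range(steps):
        opt.zero_grad()
        loss = (w * (G(z)[idx] - y)).pow(2).sum()
        loss.backward()
        opt.step()
    return G(z).detach()
```

Under these assumptions, given samples `y = x_true[idx] + noise` of a discretized field, `recover(G, y, idx, probs, k)` returns the generator's best fit; the resolution-independence result suggests `m` can be chosen relative to the latent dimension `k` rather than the grid size `n`.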