🤖 AI Summary
This work addresses the structure–texture image decomposition problem. We propose LPR-NET, a parameter-free and computationally efficient learnable method grounded in the low-patch-rank (LPR) prior. LPR-NET unrolls an iterative optimization scheme into a sequence of trainable neural network layers, enabling end-to-end supervised learning and removing the need for handcrafted priors and manually tuned hyperparameters. Although trained solely on synthetic data, LPR-NET generalizes well to natural images without adaptation. Experiments show decomposition quality competitive with classical model-based methods at significantly faster inference than iterative solvers, with no human intervention required for parameter tuning.
📝 Abstract
In this work, we propose a parameter-free and efficient method for the structure–texture image decomposition problem. In particular, we present LPR-NET, a neural network based on unrolling the Low Patch Rank model. On the one hand, this allows us to learn parameters automatically from data; on the other, it is computationally faster than traditional iterative model-based methods while producing qualitatively similar results. Moreover, despite being trained on synthetic images, numerical experiments show that our network generalizes well to natural images.
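To illustrate the unrolling idea the abstract refers to, here is a minimal sketch of a K-stage unrolled splitting for `f = structure + texture`. This is not the paper's actual architecture: the function names, the initialization, and the use of plain soft-thresholding (standing in for the low-patch-rank proximal step, which in the real model involves singular-value thresholding of patch matrices) are all illustrative assumptions; in a trained network the per-stage parameters `steps` and `thetas` would be learned end-to-end rather than hand-tuned.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink values toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def unrolled_decompose(f, thetas, steps):
    """Hypothetical K-stage unrolled scheme splitting f into
    structure u and texture v. Each stage mimics one
    proximal-gradient iteration; (steps[k], thetas[k]) play the
    role of the learnable per-layer parameters."""
    u = np.zeros_like(f)  # structure estimate
    v = np.zeros_like(f)  # texture estimate
    for step, theta in zip(steps, thetas):
        # Gradient step on the data-fidelity term ||u + v - f||^2 ...
        r = u + v - f
        u = u - step * r
        v = v - step * r
        # ... followed by a proximal step standing in for the
        # low-patch-rank prior on the texture component.
        v = soft_threshold(v, theta)
    return u, v
```

With the thresholds set to zero, the stages reduce to gradient descent on the data-fidelity term, so `u + v` converges geometrically to `f`; nonzero learned thresholds are what push oscillatory content into `v`.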