A Gradient Flow Approach to Solving Inverse Problems with Latent Diffusion Models

📅 2025-09-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Solving ill-posed inverse problems faces key bottlenecks: difficulty in prior modeling and reliance on fine-tuning generative models. To address these, this paper proposes a training-free method—Latent-Space Wasserstein Gradient Flow (DWGF)—that leverages pre-trained latent diffusion models (e.g., Stable Diffusion) as fixed, expressive priors. DWGF performs posterior sampling by minimizing the KL divergence between the posterior and the prior via Wasserstein gradient flow directly in the latent space, ensuring efficiency and stability without model adaptation. This work is the first to embed latent diffusion models into a gradient flow framework, eliminating the need for fine-tuning or auxiliary training while preserving strong prior expressivity and computational efficiency. Extensive experiments on multiple image inverse problem benchmarks demonstrate that DWGF significantly outperforms existing unsupervised and fine-tuned methods in both reconstruction quality (PSNR/SSIM) and convergence speed, validating its generalizability and practical utility.

Technology Category

Application Category

📝 Abstract
Solving ill-posed inverse problems requires powerful and flexible priors. We propose leveraging pretrained latent diffusion models for this task through a new training-free approach, termed Diffusion-regularized Wasserstein Gradient Flow (DWGF). Specifically, we formulate the posterior sampling problem as a regularized Wasserstein gradient flow of the Kullback-Leibler divergence in the latent space. We demonstrate the performance of our method on standard benchmarks using StableDiffusion (Rombach et al., 2022) as the prior.
Problem

Research questions and friction points this paper is trying to address.

Solving ill-posed inverse problems using latent diffusion models
Developing training-free posterior sampling via Wasserstein gradient flow
Regularizing KL divergence in latent space for better priors
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free approach using latent diffusion models
Regularized Wasserstein gradient flow formulation
Posterior sampling in latent space divergence
🔎 Similar Papers
No similar papers found.
Tim Y. J. Wang
Tim Y. J. Wang
Imperial College London
machine learning
O
O. D. Akyildiz
Department of Mathematics, Imperial College London