FAST-DIPS: Adjoint-Free Analytic Steps and Hard-Constrained Likelihood Correction for Diffusion-Prior Inverse Problems

📅 2026-03-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the computational inefficiency of existing methods for solving inverse problems with diffusion priors under nonlinear forward operators, which often rely on expensive repeated derivative evaluations or inner-loop optimization/MCMC sampling. The authors propose a training-free solver that replaces these inner loops with hard-constraint projections in measurement space and analytically derived optimal step sizes, yielding a fixed, low computational cost per noise level. The key innovation is an adjoint-free joint optimization framework, the first to combine analytic step sizes with hard constraints. The approach integrates ADMM-style splitting, re-annealing, and a hybrid latent/pixel-space scheduling strategy to guarantee local optimality and descent properties, and admits a derived KL error bound. Experiments demonstrate competitive PSNR, SSIM, and LPIPS performance in image reconstruction, with up to 19.5× acceleration and no hand-coded adjoints or inner MCMC sampling.
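The hard constraint replaces soft data-consistency penalties with a closed-form projection onto the measurement set. A minimal sketch of such a projection for a hypothetical linear, full-row-rank forward operator (illustration only; the paper's solver targets general nonlinear operators, and the function name is ours):

```python
import jax.numpy as jnp

def project_to_measurements(x, A, y):
    """Euclidean projection of x onto the feasible set {z : A @ z = y}.

    Linear special case for illustration; the paper applies analogous
    closed-form projections in measurement space for nonlinear operators.
    """
    r = y - A @ x                       # measurement residual
    return x + jnp.linalg.pinv(A) @ r   # minimum-norm closed-form correction
```

After the projection, the output satisfies the measurements exactly, which is what makes the per-noise-level compute budget fixed: no inner loop is needed to drive the residual down.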

📝 Abstract
Training-free diffusion priors enable inverse-problem solvers without retraining, but for nonlinear forward operators data consistency often relies on repeated derivatives or inner optimization/MCMC loops with conservative step sizes, incurring many iterations and denoiser/score evaluations. We propose a training-free solver that replaces these inner loops with a hard measurement-space feasibility constraint (closed-form projection) and an analytic, model-optimal step size, enabling a small, fixed compute budget per noise level. Anchored at the denoiser prediction, the correction is approximated via an adjoint-free, ADMM-style splitting with projection and a few steepest-descent updates, using one VJP and either one JVP or a forward-difference probe, followed by backtracking and decoupled re-annealing. We prove local model optimality and descent under backtracking for the step-size rule, and derive an explicit KL bound for mode-substitution re-annealing under a local Gaussian conditional surrogate. We also develop a latent variant and a one-parameter pixel$\rightarrow$latent hybrid schedule. Experiments achieve competitive PSNR/SSIM/LPIPS with up to 19.5$\times$ speedup, without hand-coded adjoints or inner MCMC.
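As a hedged illustration of the adjoint-free step the abstract describes, a steepest-descent update on the data term $\tfrac{1}{2}\|A(x)-y\|^2$ can obtain a model-optimal step size from one VJP (for the gradient) and one JVP (for curvature along the descent direction), with no hand-coded adjoint. A sketch in JAX, under our own naming (the authors additionally pair this with projection, backtracking, and re-annealing):

```python
import jax
import jax.numpy as jnp

def analytic_descent_step(A, x, y):
    """One adjoint-free steepest-descent step on f(x) = 0.5*||A(x) - y||^2.

    One VJP gives the gradient g = J^T r; one JVP gives J g, from which the
    step size minimizing the local quadratic model along -g is closed-form.
    """
    Ax, vjp_fn = jax.vjp(A, x)        # forward pass + VJP closure
    r = Ax - y                        # measurement residual
    (g,) = vjp_fn(r)                  # gradient g = J^T r (one VJP)
    _, Jg = jax.jvp(A, (x,), (g,))    # directional derivative J g (one JVP);
                                      # a forward-difference probe
                                      # (A(x + eps*g) - A(x)) / eps also works
    alpha = jnp.vdot(g, g) / jnp.vdot(Jg, Jg)  # model-optimal step size
    return x - alpha * g
```

For linear $A$ this is exact line search, so each step strictly decreases the residual unless the gradient vanishes; for nonlinear $A$ it is only locally optimal, which is why the paper backs it with backtracking.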
Problem

Research questions and friction points this paper is trying to address.

diffusion-prior inverse problems
nonlinear forward operators
data consistency
inner optimization loops
computational efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

adjoint-free
hard-constrained projection
analytic step size
diffusion prior
training-free inverse solver
Minwoo Kim
Department of Statistics, Seoul National University
differential privacy, multivariate statistics
Seunghyeok Shin
Department of Electrical and Computer Engineering, Inha University
Hongki Lim
Department of Electrical and Computer Engineering, Inha University