Dequantization and Color Transfer with Diffusion Models

📅 2023-07-06
📈 Citations: 3
Influential: 0
🤖 AI Summary
Existing image recoloring and editing methods struggle to simultaneously achieve precise local color control, faithful texture preservation, and consistent color reproduction across luminance-varying regions. To address this, we propose a diffusion-based, quantized palette-driven editing framework. Our method uniquely employs quantized images as direct inputs to the diffusion model, enhancing interpretability and controllability. We design a weighted bipartite graph matching algorithm to enable semantically coherent, extreme palette transfer. Furthermore, we introduce multi-scale texture conditioning—optimized via thresholded gradient guidance—and JPEG-noise-robust training, overcoming the brightness-invariance limitation inherent in prior approaches. Extensive experiments demonstrate state-of-the-art performance: high-fidelity reconstruction, strict adherence to target palettes, and superior texture consistency. The framework supports both localized recoloring with fine-grained control and end-to-end palette migration, establishing a new benchmark for controllable, photorealistic image editing.
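The summary's key idea is that quantized images, reduced to a small color palette, serve as the diffusion model's input. The paper does not specify the quantization procedure, but a standard way to obtain such a palette is k-means clustering in RGB space; a minimal NumPy sketch (function name, `k`, and iteration count are illustrative assumptions, not the paper's method):

```python
import numpy as np

def quantize_to_palette(image, k=8, iters=20, seed=0):
    """Quantize an RGB image to a k-color palette via plain k-means.

    image: (H, W, 3) float array in [0, 1].
    Returns (quantized_image, palette) where palette is (k, 3).
    """
    pixels = image.reshape(-1, 3)
    rng = np.random.default_rng(seed)
    # Initialize centroids from k distinct random pixels.
    palette = pixels[rng.choice(len(pixels), size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest palette color.
        dists = np.linalg.norm(pixels[:, None, :] - palette[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each palette color to the mean of its assigned pixels.
        for c in range(k):
            mask = labels == c
            if mask.any():
                palette[c] = pixels[mask].mean(axis=0)
    quantized = palette[labels].reshape(image.shape)
    return quantized, palette
```

The resulting palette is what makes the diffusion output "easier to control and interpret": edits are expressed as changes to a handful of palette entries rather than per-pixel colors.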
📝 Abstract
We demonstrate an image dequantizing diffusion model that enables novel edits on natural images. We propose operating on quantized images because they offer easy abstraction for patch-based edits and palette transfer. In particular, we show that color palettes can make the output of the diffusion model easier to control and interpret. We first establish that existing image restoration methods, such as JPEG noise reduction models, are not sufficient. We then demonstrate that our model can generate natural images that respect the color palette the user asked for. For palette transfer, we propose a method based on weighted bipartite matching. We then show that our model generates plausible images even after extreme palette transfers, respecting the user's query. Our method can optionally condition on the source texture in part or all of the image. In doing so, we overcome a common problem in existing image colorization methods, which are unable to produce colors with a different luminance than the input. We evaluate several possibilities for texture conditioning and their trade-offs, including luminance, image gradients, and thresholded gradients, the latter of which performed best in maintaining texture and color control simultaneously. Our method can be usefully extended to another practical edit: recoloring patches of an image while respecting the source texture. Our procedure is supported by several qualitative and quantitative evaluations.
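The palette-transfer step in the abstract relies on a weighted bipartite matching between source and target palette colors. The paper's edge weights are not given here; a reasonable stand-in is squared RGB distance, and for the small palettes typical of palette transfer a brute-force minimum-cost matching over permutations suffices (everything below is an illustrative sketch, not the authors' implementation):

```python
from itertools import permutations

def match_palettes(source, target):
    """Minimum-cost perfect matching between two equal-size palettes.

    source, target: lists of (r, g, b) tuples of the same length.
    Cost of pairing two colors is their squared RGB distance (an
    assumed weight; the paper's exact weighting is not specified).
    Returns a dict mapping each source index to a target index.
    """
    def cost(perm):
        # Total cost of assigning source[i] -> target[perm[i]] for all i.
        return sum(
            sum((s - t) ** 2 for s, t in zip(source[i], target[j]))
            for i, j in enumerate(perm)
        )
    # Exhaustive search is fine for palettes of ~8 colors or fewer.
    best = min(permutations(range(len(target))), key=cost)
    return dict(enumerate(best))
```

For larger palettes one would swap the exhaustive search for the Hungarian algorithm; the matched pairs then drive the recoloring, with the diffusion model filling in plausible texture under the new colors.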
Problem

Research questions and friction points this paper is trying to address.

Image Colorization
Local Color Modification
Texture-aware Editing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffusion Model
Color Manipulation
Texture-aware Image Editing
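The texture-aware editing contribution hinges on the thresholded-gradient conditioning signal described in the abstract: keeping only strong luminance gradients preserves texture structure while leaving flat regions free to take on colors with a different luminance than the input. A minimal NumPy sketch (the threshold value and function name are assumptions; the paper's exact formulation is not reproduced here):

```python
import numpy as np

def thresholded_gradient(luma, tau=0.1):
    """Texture-conditioning map: gradient magnitude, zeroed below tau.

    luma: (H, W) luminance array in [0, 1].
    tau: hypothetical threshold separating texture edges from flat
    regions; below it the conditioning signal is suppressed so the
    model is free to change brightness there.
    """
    gy, gx = np.gradient(luma)
    mag = np.hypot(gx, gy)          # per-pixel gradient magnitude
    return np.where(mag >= tau, mag, 0.0)
```

Compared with conditioning on raw luminance, this keeps edges (texture) pinned while decoupling the output's brightness from the input's, which is the brightness-invariance limitation the summary says prior colorization methods suffer from.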