DiverseFlow: Sample-Efficient Diverse Mode Coverage in Flows

📅 2025-04-10
🤖 AI Summary
To address the low sampling diversity and inefficient multimodal coverage of flow-based generative models, this paper proposes DiverseFlow, a training-free, plug-and-play framework for enhancing sample diversity in flow models. The core idea is to apply a determinantal point process (DPP) to the sampling procedure: the DPP induces a coupling between samples that explicitly models and promotes inter-sample diversity. Crucially, DiverseFlow requires no model retraining and operates within a fixed sampling budget. Experiments demonstrate substantial improvements in mode coverage across diverse tasks, including text-to-image generation with polysemous prompts, large-region image inpainting, and class-conditional synthesis, achieving competitive or superior FID and diversity scores using only ≤30% of the baseline sampling budget.

📝 Abstract
Many real-world applications of flow-based generative models desire a diverse set of samples that cover multiple modes of the target distribution. However, the predominant approach for obtaining diverse sets is not sample-efficient, as it involves independently obtaining many samples from the source distribution and mapping them through the flow until the desired mode coverage is achieved. As an alternative to repeated sampling, we introduce DiverseFlow: a training-free approach to improve the diversity of flow models. Our key idea is to employ a determinantal point process to induce a coupling between the samples that drives diversity under a fixed sampling budget. In essence, DiverseFlow allows exploration of more variations in a learned flow model with fewer samples. We demonstrate the efficacy of our method for tasks where sample-efficient diversity is desirable, such as text-guided image generation with polysemous words, inverse problems like large-hole inpainting, and class-conditional image synthesis.
Problem

Research questions and friction points this paper is trying to address.

Enhance sample diversity in flow models efficiently
Reduce sampling cost for diverse mode coverage
Improve flow model exploration with fewer samples
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free approach using determinantal point process
Improves diversity under fixed sampling budget
Enhances flow model exploration with fewer samples