DiVeQ: Differentiable Vector Quantization Using the Reparameterization Trick

📅 2025-09-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Vector quantization (VQ) suffers from non-differentiability due to hard assignment, which blocks gradients and hinders end-to-end training. This paper proposes DiVeQ, a differentiable VQ method that keeps discrete hard quantization in the forward pass while enabling gradient flow by reparameterizing the quantization error in the backward pass. To further improve codebook utilization and reduce quantization error, the authors introduce SF-DiVeQ, a space-filling variant that assigns inputs to a curve formed by the line segments connecting codewords. Unlike prior approaches, DiVeQ requires no auxiliary losses, temperature annealing, or Gumbel-Softmax relaxation for stable training. Extensive experiments on VQ-VAE-based image compression and VQGAN-based image generation show consistent improvements: higher PSNR and MS-SSIM for reconstruction quality, and lower FID and LPIPS for generative fidelity, validating DiVeQ's effectiveness and generalizability across diverse VQ-based architectures.

📝 Abstract
Vector quantization is common in deep models, yet its hard assignments block gradients and hinder end-to-end training. We propose DiVeQ, which treats quantization as adding an error vector that mimics the quantization distortion, keeping the forward pass hard while letting gradients flow. We also present a space-filling variant (SF-DiVeQ) that assigns to a curve constructed by the lines connecting codewords, resulting in less quantization error and full codebook usage. Both methods train end-to-end without requiring auxiliary losses or temperature schedules. On VQ-VAE compression and VQGAN generation across various data sets, they improve reconstruction and sample quality over alternative quantization approaches.
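The abstract's core idea (quantization written as the input plus an error vector that is held constant during backpropagation, so the forward pass stays hard while gradients flow) can be sketched as follows. This is an illustrative NumPy sketch of that formulation only, not the paper's implementation: the function names are our own, and the paper's actual reparameterization of the error vector (and the SF-DiVeQ curve assignment) is more involved.

```python
import numpy as np

def nearest_codeword(z, codebook):
    """Return the codebook row closest to z in Euclidean distance."""
    dists = np.linalg.norm(codebook - z, axis=1)
    return codebook[np.argmin(dists)]

def additive_error_quantize(z, codebook):
    """Hard quantization expressed as z plus a quantization-error vector.

    In an autodiff framework the error term (q - z) would be detached
    (stop-gradient), so gradients w.r.t. z pass through the identity
    term while the forward output is exactly the hard assignment q.
    """
    q = nearest_codeword(z, codebook)
    err = q - z          # treated as a constant during backprop
    return z + err       # numerically equal to q
```

A quick usage example with a hypothetical two-entry codebook: an input near the origin snaps to the codeword at the origin.

```python
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
z = np.array([0.2, 0.1])
additive_error_quantize(z, codebook)  # → array([0., 0.])
```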
Problem

Research questions and friction points this paper is trying to address.

Enabling differentiable vector quantization for end-to-end training
Reducing quantization error through space-filling codeword assignments
Improving reconstruction and generation quality across diverse datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentiable vector quantization using reparameterization trick
Space-filling variant reduces quantization error
End-to-end training without auxiliary losses