HotSpot: Signed Distance Function Optimization with an Asymptotically Sufficient Condition

📅 2024-11-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing eikonal losses only provide a necessary condition for signed distance functions (SDFs), failing to guarantee solution validity and suffering from optimization instability; conventional surface-area regularization often distorts SDF geometry. To address these issues, we propose a novel loss function derived from the analytical solution of the screened Poisson equation, which establishes an asymptotically sufficient condition for SDF convergence while intrinsically suppressing surface distortion. Our method integrates neural implicit modeling with gradient-aware regularization. Evaluated on 2D and 3D shape reconstruction tasks, it achieves significantly improved distance field approximation accuracy and enhanced surface reconstruction robustness. Quantitative and qualitative comparisons demonstrate consistent superiority over the eikonal loss and multiple baseline approaches, including curvature- and Laplacian-based regularizers. The proposed formulation ensures stable optimization, preserves geometric fidelity, and generalizes effectively across diverse shape topologies and sampling regimes.

📝 Abstract
We propose a method, HotSpot, for optimizing neural signed distance functions. Existing losses, such as the eikonal loss, act as necessary but insufficient constraints and cannot guarantee that the recovered implicit function represents a true distance function, even if the output minimizes these losses almost everywhere. Furthermore, the eikonal loss suffers from stability issues in optimization. Finally, in conventional methods, regularization losses that penalize surface area distort the reconstructed signed distance function. We address these challenges by designing a loss function using the solution of a screened Poisson equation. Our loss, when minimized, provides an asymptotically sufficient condition to ensure the output converges to a true distance function. Our loss also leads to stable optimization and naturally penalizes large surface areas. We present theoretical analysis and experiments on both challenging 2D and 3D datasets and show that our method provides better surface reconstruction and a more accurate distance approximation.
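To make the abstract's contrast concrete, here is a minimal numerical sketch of the standard eikonal penalty next to an illustrative distance-aware variant that weights the residual by the decay profile exp(-d/t) of the screened Poisson solution. This is an assumption-laden sketch for intuition only, not the paper's actual HotSpot loss; the function names, the weighting exp(-2|f|/t), and the finite-difference gradient are all choices made here for illustration.

```python
import numpy as np

def circle_sdf(p, r=1.0):
    # True signed distance to a circle of radius r centered at the origin.
    return np.linalg.norm(p, axis=-1) - r

def grad_fd(f, p, eps=1e-4):
    # Central finite-difference gradient of a scalar field f at points p (N, 2).
    g = np.zeros_like(p)
    for i in range(p.shape[-1]):
        dp = np.zeros_like(p)
        dp[..., i] = eps
        g[..., i] = (f(p + dp) - f(p - dp)) / (2.0 * eps)
    return g

def eikonal_loss(f, p):
    # Standard eikonal penalty: (|grad f| - 1)^2. This is only a necessary
    # condition for an SDF -- non-distance fields can also satisfy it a.e.
    gn = np.linalg.norm(grad_fd(f, p), axis=-1)
    return np.mean((gn - 1.0) ** 2)

def screened_poisson_style_loss(f, p, t=0.1):
    # Illustrative variant (not the paper's exact formulation): weight the
    # eikonal residual by exp(-2|f|/t), mimicking the decay exp(-d/t) of the
    # screened Poisson solution for a true distance d. The weight emphasizes
    # points near the surface and damps contributions far away.
    gn = np.linalg.norm(grad_fd(f, p), axis=-1)
    w = np.exp(-2.0 * np.abs(f(p)) / t)
    return np.mean(w * (gn - 1.0) ** 2)

rng = np.random.default_rng(0)
pts = rng.uniform(-2.0, 2.0, size=(1024, 2))
print(eikonal_loss(circle_sdf, pts))                 # near 0 for a true SDF
print(screened_poisson_style_loss(circle_sdf, pts))  # also near 0
```

For a true SDF both residuals vanish; the point of the weighted form is how it behaves during optimization on an imperfect candidate field, where the exponential weight suppresses far-field instability.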
Problem

Research questions and friction points this paper is trying to address.

Existing losses, such as the eikonal loss, are only necessary conditions and cannot guarantee the output is a true signed distance function.
Instability in optimizing the eikonal loss.
Distortion of the reconstructed signed distance function by conventional surface-area regularization.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Derives a loss function from the solution of a screened Poisson equation.
Minimizing the loss gives an asymptotically sufficient condition for convergence to a true distance function.
Yields stable optimization and naturally penalizes large surface areas.
Zimo Wang
UC San Diego
Cheng Wang
UC San Diego
Taiki Yoshino
UC San Diego
Sirui Tao
UC San Diego
Ziyang Fu
UC San Diego
Tzu-Mao Li
UC San Diego
Computer Graphics · Computer Vision · Programming Systems · Statistical Learning