🤖 AI Summary
Existing NeRF and 3D Gaussian Splatting (3DGS) methods struggle to efficiently model specular objects, leading to reflection distortions, slow rendering, and inconsistent material appearance in novel-view synthesis. This paper introduces the first pixel-level material-aware and multi-order inter-reflection modeling framework within the 3DGS paradigm. The authors propose a physically grounded deferred rendering pipeline: jointly optimizing geometry and material via material-driven normal propagation and per-Gaussian initial shading, and approximating inter-reflections with Gaussian kernels while accelerating computation using 2D Gaussian primitives. The method achieves significant improvements on standard synthetic and real-world benchmarks, boosting PSNR (+1.8 dB) and SSIM (+0.025) and reducing LPIPS (−0.032). It supports real-time rendering (≥30 FPS), relighting, and interactive editing, establishing the first unified, efficient, and editable 3DGS solution for reflective scene reconstruction and novel-view synthesis.
📝 Abstract
Novel view synthesis has advanced significantly owing to increasingly capable NeRF- and 3DGS-based methods. However, reconstructing reflective objects remains challenging: no existing solution achieves real-time, high-quality rendering while accommodating inter-reflection. To fill this gap, we introduce a Reflective Gaussian splatting (Ref-Gaussian) framework characterized by two components: (i) physically based deferred rendering, which empowers the rendering equation with pixel-level material properties via the split-sum approximation; (ii) Gaussian-grounded inter-reflection, which realizes the desired inter-reflection function within a Gaussian splatting paradigm for the first time. To enhance geometry modeling, we further introduce material-aware normal propagation and an initial per-Gaussian shading stage, along with 2D Gaussian primitives. Extensive experiments on standard datasets demonstrate that Ref-Gaussian surpasses existing approaches in quantitative metrics, visual quality, and compute efficiency. Further, we show that our method serves as a unified solution for both reflective and non-reflective scenes, going beyond previous alternatives that focus only on reflective scenes. We also illustrate that Ref-Gaussian supports additional applications such as relighting and editing.
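For context on the split-sum approximation referenced in component (i), the standard formulation from the real-time physically based rendering literature (not reproduced from this paper's own equations, which are not shown here) factorizes the specular part of the rendering equation into two precomputable integrals:

```latex
% Specular rendering equation over the hemisphere \Omega at a surface point,
% with incident radiance L_i, BRDF f_r, normal n, and view direction \omega_o:
L_o(\omega_o) = \int_{\Omega} L_i(\omega_i)\, f_r(\omega_i, \omega_o)\,
                (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i

% Split-sum approximation: factor into (1) light prefiltered by the
% specular lobe D and (2) an environment-independent BRDF integral:
L_o(\omega_o) \;\approx\;
  \underbrace{\int_{\Omega} L_i(\omega_i)\, D(\omega_i)\, \mathrm{d}\omega_i}_{
    \text{prefiltered environment light}}
  \;\cdot\;
  \underbrace{\int_{\Omega} f_r(\omega_i, \omega_o)\,
    (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i}_{
    \text{BRDF integral, a 2D lookup in roughness and } \mathbf{n}\cdot\omega_o}
```

Both factors can be precomputed (as mipmapped environment maps and a small lookup table, respectively), which is what makes pixel-level material evaluation compatible with real-time deferred rendering.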