3DOF+Quantization: 3DGS quantization for large scenes with limited Degrees of Freedom

📅 2025-09-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the significant projection errors and limited reconstruction accuracy of 3D Gaussian Splatting (3DGS) in large-scale scenes under the 3DoF+ setting (camera positions restricted to small offsets around a central viewing zone, with orientation unconstrained), this paper proposes a spherical-coordinate-based quantization method. Gaussian centroid positions are mapped into spherical coordinates, which makes explicit the nonlinear relationship between projection error and radial distance; quantization parameters are then jointly optimized via rate-distortion optimization. This suppresses the impact of positional quantization error on rendering quality while keeping bitrates low. Experiments on large-scale scenes, including Garden, show that at identical bitrates the proposed method achieves an average PSNR gain of 1.8 dB over baseline methods, striking a better trade-off between compression efficiency and high-fidelity novel view synthesis.
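The inverse-squared-distance behaviour cited in the summary and abstract can be motivated by a simple parallax argument (a sketch under our own assumptions, not necessarily the paper's exact derivation): for a camera displaced by a small offset $t$ from the centre of the viewing zone, with focal length $f$, a point at radial distance $r$ exhibits parallax $p \approx ft/r$, so a radial quantization error $\Delta r$ displaces its reprojection by roughly

```latex
\Delta u \;\approx\; \left|\frac{\partial}{\partial r}\!\left(\frac{f t}{r}\right)\right| \Delta r
\;=\; \frac{f t}{r^{2}}\,\Delta r .
```

Holding $\Delta u$ to a fixed pixel budget therefore permits $\Delta r \propto r^{2}$, which motivates coarser radial quantization steps for distant Gaussians, as in a spherical-coordinate scheme.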

📝 Abstract
3D Gaussian Splatting (3DGS) is a major breakthrough in 3D scene reconstruction. Given a number of views of an object or scene, the algorithm trains a model composed of 3D Gaussians, which enables the production of novel views from arbitrary viewpoints. This freedom of movement is referred to as 6DoF, for 6 degrees of freedom: a view is produced for any camera position (3 degrees) and any camera orientation (3 more degrees). On large scenes, though, the input views are acquired from a limited zone in space, and the reconstruction is valuable for novel views from that same zone, even if the scene itself is almost unlimited in size. We refer to this particular case as 3DoF+, meaning that the 3 degrees of freedom of camera position are limited to small offsets around a central position. Considering the problem of coordinate quantization, the impact of position error on the projection error in pixels is studied. It is shown that the projection error is proportional to the inverse squared distance of the point being projected. Consequently, a new quantization scheme based on spherical coordinates is proposed. The rate-distortion performance of the proposed method is illustrated on the well-known Garden scene.
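To make the idea concrete, here is a minimal sketch of a spherical-coordinate quantizer in the spirit of the abstract. It is our own illustration, not the paper's implementation: the function name, bit allocations, and radius bounds are all hypothetical. Angles are quantized uniformly, while the radius is quantized uniformly in $u = 1/r$, so the radial step grows roughly as $r^2$, matching a pixel error that falls off as $1/r^2$.

```python
import numpy as np

def spherical_quantize(points, center, n_angle=4096, n_inv=1024,
                       r_min=1.0, r_max=1000.0):
    """Illustrative quantize/dequantize round trip for Gaussian centroids.

    Cartesian positions are converted to spherical coordinates about the
    viewing-zone center; azimuth and inclination use uniform steps, and the
    radius uses uniform steps in inverse radius u = 1/r (so the metric step
    in r grows ~ r^2). Hypothetical sketch, not the paper's exact scheme.
    """
    p = points - center
    r = np.linalg.norm(p, axis=1)
    theta = np.arctan2(p[:, 1], p[:, 0])                          # azimuth in [-pi, pi]
    phi = np.arccos(np.clip(p[:, 2] / np.maximum(r, 1e-9), -1.0, 1.0))  # inclination

    # Uniform angular quantization.
    qt = np.round((theta + np.pi) / (2 * np.pi) * (n_angle - 1))
    qp = np.round(phi / np.pi * (n_angle - 1))

    # Uniform quantization of the inverse radius u = 1/r.
    u = 1.0 / np.clip(r, r_min, r_max)
    u_min, u_max = 1.0 / r_max, 1.0 / r_min
    qu = np.round((u - u_min) / (u_max - u_min) * (n_inv - 1))

    # Dequantize back to Cartesian coordinates.
    theta_hat = qt / (n_angle - 1) * 2 * np.pi - np.pi
    phi_hat = qp / (n_angle - 1) * np.pi
    r_hat = 1.0 / (qu / (n_inv - 1) * (u_max - u_min) + u_min)
    return np.stack([r_hat * np.sin(phi_hat) * np.cos(theta_hat),
                     r_hat * np.sin(phi_hat) * np.sin(theta_hat),
                     r_hat * np.cos(phi_hat)], axis=1) + center
```

Running the round trip on a near point and a far point shows the intended behaviour: the metric reconstruction error is tight close to the viewing zone and grows with distance, while the induced pixel error stays roughly constant.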
Problem

Research questions and friction points this paper is trying to address.

Quantizing 3DGS for large scenes with limited camera positions
Studying position error impact on projection error in pixels
Proposing spherical coordinate quantization for improved performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

3DoF+ quantization for large scenes
Spherical coordinate quantization scheme
Rate-distortion performance evaluation
Matthieu Gendrin
Orange Innovation
Stéphane Pateux
Orange Innovation
Théo Ladune
Ph. D., Video Coding Researcher @ Orange
Image and Video Coding · Deep Learning