🤖 AI Summary
Traditional precomputed radiance transfer (PRT) relies on costly offline preprocessing, limiting support for dynamic geometry and real-time shading. This paper proposes the first real-time rendering pipeline integrating conformal geometric algebra (CGA) with neural radiance fields (NeRF): vertex positions and surface normals are jointly encoded as CGA elements, and spherical harmonics are reformulated within the CGA framework to efficiently model light source rotation—enabling dynamic illumination simulation without precomputation. The method supports animated meshes, deformable models, and 3D Gaussian splatting, achieving real-time, high-fidelity rendering in Unity at speeds comparable to conventional PRT, while significantly improving adaptability to dynamic scenes. Its core contribution is the first incorporation of CGA into the NeRF architecture, yielding a compact, differentiable, and rotation-invariant lighting representation for dynamic geometry—thereby eliminating PRT’s dependency on static precomputation and enabling deployment on resource-constrained platforms such as mobile devices and VR systems.
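The summary's idea of jointly encoding vertex positions as CGA elements rests on the standard conformal embedding of a Euclidean point, P = p + ½|p|²e∞ + e₀. As a minimal illustrative sketch (the function name and the 5-coefficient basis layout are assumptions for exposition, not the paper's implementation):

```python
import numpy as np

def conformal_point(p):
    """Embed a Euclidean 3D point into conformal space:
    P = p + 0.5*|p|^2 * e_inf + e_0.
    Coefficients are stored on the (assumed) basis order
    (e1, e2, e3, e_inf, e_0)."""
    p = np.asarray(p, dtype=float)
    return np.concatenate([p, [0.5 * p.dot(p), 1.0]])

# The embedded point is null (P^2 = 0), which is what makes the
# representation well-suited to encoding geometry compactly.
P = conformal_point([1.0, 2.0, 3.0])  # -> [1. 2. 3. 7. 1.]
```

Surface normals would be encoded analogously as CGA elements alongside positions; the exact multivector layout used by the network is described in the paper itself.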
📝 Abstract
This paper presents Neural-GASh, a novel real-time shading pipeline for 3D meshes that leverages a neural radiance field architecture to perform image-based rendering (IBR) using Conformal Geometric Algebra (CGA)-encoded vertex information as input. Unlike traditional Precomputed Radiance Transfer (PRT) methods, which require expensive offline precomputation, our learned model directly consumes CGA-based representations of vertex positions and normals, enabling dynamic scene shading without precomputation. Integrated seamlessly into the Unity engine, Neural-GASh facilitates accurate shading of animated and deformed 3D meshes, a capability essential for dynamic, interactive environments. Scene shading is implemented within Unity, where rotation of scene lights expressed in Spherical Harmonics is also performed efficiently using CGA. This neural-field approach is designed to deliver fast and efficient light transport simulation across diverse platforms, including mobile and VR, while preserving high rendering quality. Additionally, we evaluate our method on scenes generated via 3D Gaussian splatting, further demonstrating the flexibility and robustness of Neural-GASh in diverse scenarios. Performance is compared against conventional PRT, demonstrating competitive rendering speeds even for complex geometries.
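The light-rotation step mentioned in the abstract builds on the rotor "sandwich" product R v R̃, the geometric-algebra mechanism for rotation that CGA inherits. A minimal sketch of that mechanism, using the well-known isomorphism between 3D rotors and unit quaternions (the function names and quaternion storage are assumptions for exposition; the paper's pipeline applies the corresponding operation to Spherical Harmonics coefficients rather than raw vectors):

```python
import numpy as np

def rotor(axis, angle):
    """Rotor R = cos(a/2) - sin(a/2)*B for rotation by `angle` about
    `axis`, stored as quaternion coefficients (w, x, y, z)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate([[np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis])

def rotate(R, v):
    """Apply the sandwich product R v R~ to a 3D vector, expanded into
    the closed-form Rodrigues-style expression."""
    w, u = R[0], R[1:]
    return v + 2.0 * w * np.cross(u, v) + 2.0 * np.cross(u, np.cross(u, v))

# Rotating x by 90 degrees about z yields y:
R = rotor([0.0, 0.0, 1.0], np.pi / 2.0)
v = rotate(R, np.array([1.0, 0.0, 0.0]))  # -> approximately [0, 1, 0]
```

Because a rotor composes rotations by simple multiplication and avoids per-band rotation matrices, this is one plausible reason the paper reports efficient SH light rotation; the exact CGA formulation is detailed in the paper.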