Enhancing Implicit Neural Representations via Symmetric Power Transformation

📅 2024-12-12
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the limited representational capacity of implicit neural representations (INRs), this paper proposes a reversible symmetric power transformation that jointly enforces a target range and approximate symmetry through data redistribution, without requiring additional storage. The authors introduce the Range-Defined Symmetric Hypothesis and design a deviation-aware calibration module alongside an adaptive soft-boundary mechanism to preserve output continuity while improving robustness. Extensive experiments on 1D audio, 2D image, and 3D video fitting tasks demonstrate that the transformation consistently achieves higher PSNR and SSIM than baselines, including random permutation and standard normalization, across diverse modalities. The method exhibits strong generalizability and stability, supporting both its theoretical foundation and its practical efficacy in enhancing INR performance.

📝 Abstract
We propose symmetric power transformation to enhance the capacity of Implicit Neural Representation~(INR) from the perspective of data transformation. Unlike prior work utilizing random permutation or index rearrangement, our method features a reversible operation that does not require additional storage consumption. Specifically, we first investigate the characteristics of data that can benefit the training of INR, proposing the Range-Defined Symmetric Hypothesis, which posits that specific range and symmetry can improve the expressive ability of INR. Based on this hypothesis, we propose a nonlinear symmetric power transformation to achieve both range-defined and symmetric properties simultaneously. We use the power coefficient to redistribute data to approximate symmetry within the target range. To improve the robustness of the transformation, we further design deviation-aware calibration and adaptive soft boundary to address issues of extreme deviation boosting and continuity breaking. Extensive experiments are conducted to verify the performance of the proposed method, demonstrating that our transformation can reliably improve INR compared with other data transformations. We also conduct 1D audio, 2D image and 3D video fitting tasks to demonstrate the effectiveness and applicability of our method.
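The abstract describes a sign-preserving power transform that maps data into a fixed target range and uses a power coefficient to redistribute it toward symmetry, with an exact inverse so nothing extra must be stored. The paper's precise formulation (including the deviation-aware calibration and adaptive soft boundary) is not reproduced here; the sketch below is only an illustrative reading of that idea, with function names and the choice of range `[-1, 1]` being assumptions.

```python
import numpy as np

def symmetric_power_transform(x, p):
    """Illustrative sketch: scale data to the target range [-1, 1],
    then apply a sign-preserving power to redistribute it toward
    symmetry. Not the paper's exact formulation."""
    lo, hi = float(x.min()), float(x.max())
    z = 2.0 * (x - lo) / (hi - lo) - 1.0      # range-defined: map to [-1, 1]
    y = np.sign(z) * np.abs(z) ** p           # power coefficient reshapes the distribution
    return y, (lo, hi)

def inverse_symmetric_power_transform(y, p, bounds):
    """Exact inverse: reversibility means only p and the two range
    endpoints are needed, no per-element storage."""
    lo, hi = bounds
    z = np.sign(y) * np.abs(y) ** (1.0 / p)   # undo the power
    return (z + 1.0) / 2.0 * (hi - lo) + lo   # undo the range mapping

# Round trip on skewed (asymmetric) data
x = np.random.default_rng(0).exponential(size=1000)
y, bounds = symmetric_power_transform(x, p=0.5)
x_rec = inverse_symmetric_power_transform(y, p=0.5, bounds=bounds)
assert np.allclose(x, x_rec)
```

In this reading, `p < 1` spreads values clustered near the range boundaries toward the center, while `p > 1` does the opposite, which is one way a single coefficient could trade off symmetry against the fixed range.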
Problem

Research questions and friction points this paper is trying to address.

Enhancing Implicit Neural Representations with symmetric power transformation
Improving INR training via Range-Defined Symmetric Hypothesis
Addressing extreme deviation and continuity in data transformation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Symmetric power transformation enhances INR capacity
Reversible operation without extra storage needed
Deviation-aware calibration improves transformation robustness
Weixiang Zhang — Tsinghua University (Neural Representation, 3D Computer Vision)
Shuzhao Xie — Tsinghua University (Graphics, Multimedia)
Chengwei Ren — Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen, China
Shijia Ge — Tsinghua University (Machine Learning, AI, 3DV, Robotics, AI4Med)
Mingzi Wang — Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen, China
Zhi Wang — Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen, China