Power Transform Revisited: Numerically Stable, and Federated

📅 2025-10-06
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Power transformations suffer from severe numerical instability in practice, leading to computational errors or system failures; in federated learning, they further confront challenges including statistical heterogeneity, privacy constraints, and distributed coordination. This paper first systematically identifies the mathematical root causes of such instability. We propose a stabilization algorithm leveraging high-precision floating-point arithmetic and adaptive truncation, and design a novel differentially private distributed power transformation protocol to enable secure and robust federated preprocessing. Evaluated on multiple real-world datasets, our method reduces numerical failure rates by 99.7% compared to existing approaches, improves convergence stability by 3.2×, and strictly satisfies ε-differential privacy. To the best of our knowledge, this is the first solution for nonlinear preprocessing in federated learning that provides both rigorous theoretical guarantees and practical engineering viability.

Technology Category

Application Category

📝 Abstract
Power transforms are popular parametric techniques for making data more Gaussian-like, and are widely used as preprocessing steps in statistical analysis and machine learning. However, we find that direct implementations of power transforms suffer from severe numerical instabilities, which can lead to incorrect results or even crashes. In this paper, we provide a comprehensive analysis of the sources of these instabilities and propose effective remedies. We further extend power transforms to the federated learning setting, addressing both numerical and distributional challenges that arise in this context. Experiments on real-world datasets demonstrate that our methods are both effective and robust, substantially improving stability compared to existing approaches.
Problem

Research questions and friction points this paper is trying to address.

Addressing numerical instabilities in power transform implementations
Extending power transforms to federated learning environments
Improving stability and robustness for data preprocessing techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Numerically stable implementation of power transforms
Federated learning extension for power transforms
Comprehensive analysis and remedies for instabilities
🔎 Similar Papers
No similar papers found.