Decentralized Privacy-Preserving Federated Learning of Computer Vision Models on Edge Devices

📅 2026-01-08
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
While federated learning avoids direct data sharing, model parameters can still leak sensitive client information, and existing approaches often overlook privacy threats posed by other clients. This work proposes a decentralized federated learning framework that addresses privacy risks from both the central server and peer clients. Using a proof-of-concept implementation on the NVIDIA Jetson TX2 edge platform, the study systematically evaluates the impact of privacy-preserving techniques (homomorphic encryption, gradient compression, and noise injection) on model accuracy. It also demonstrates the difficulty of data reconstruction attacks against segmentation networks and discusses modified federated learning architectures such as split learning and swarm learning. These findings offer practical insights for efficient, privacy-aware collaborative learning in resource-constrained edge environments.
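The gradient noising and compression techniques evaluated in the paper can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the function names (`noise_gradient`, `topk_compress`), the clipping threshold, and the noise scale are all assumptions chosen for clarity.

```python
import numpy as np

def noise_gradient(grad, clip=1.0, sigma=0.5, rng=None):
    """Clip the gradient's L2 norm to `clip`, then add Gaussian noise,
    so the shared update reveals less about the client's data."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip / max(norm, 1e-12))
    return clipped + rng.normal(scale=sigma * clip, size=grad.shape)

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries; zero out the rest.
    Less information per update, at some cost in accuracy."""
    out = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]
    out[idx] = grad[idx]
    return out

grad = np.array([0.1, -2.0, 0.3, 1.5])
sparse = topk_compress(grad, 2)   # only the two largest entries survive
noisy = noise_gradient(grad, clip=1.0, sigma=0.5,
                       rng=np.random.default_rng(0))
```

Both transforms trade model accuracy for privacy, which is exactly the effect the study measures on classification CNNs.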

📝 Abstract
Collaborative training of a machine learning model comes with a risk of sharing sensitive or private data. Federated learning offers a way of collectively training a single global model without sharing client data: each client shares only the updated parameters of its local model. A central server then aggregates the parameters from all clients and redistributes the aggregated model back to them. Recent findings have shown that even in this scenario, private data can be reconstructed using only information about model parameters. Current mitigation efforts focus mainly on reducing privacy risks on the server side, assuming that other clients will not act maliciously. In this work, we analyze various methods for improving the privacy of client data with respect to both the server and other clients for neural networks, including homomorphic encryption, gradient compression, and gradient noising, and we discuss modified federated learning systems such as split learning, swarm learning, and fully encrypted models. We analyze the negative effects of gradient compression and gradient noising on the accuracy of convolutional neural networks used for classification, and show the difficulty of data reconstruction in the case of segmentation networks. We also implemented a proof of concept on the NVIDIA Jetson TX2 module used in edge devices and simulated a federated learning process.
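The aggregate-and-redistribute loop described in the abstract can be sketched as a single simulated round. This is a minimal NumPy sketch under assumed names (`local_update`, `federated_average`), not the paper's Jetson TX2 implementation; real federated averaging typically also weights clients by dataset size.

```python
import numpy as np

def local_update(weights, grad, lr=0.1):
    """One simulated local SGD step on a client's copy of the model."""
    return weights - lr * grad

def federated_average(client_weights):
    """Server-side aggregation: element-wise mean of client parameters.
    Only parameters, never raw data, cross the network."""
    return np.mean(np.stack(client_weights), axis=0)

# One round with 3 clients sharing a toy 4-parameter model
rng = np.random.default_rng(0)
global_w = np.zeros(4)
updates = [local_update(global_w, rng.normal(size=4)) for _ in range(3)]
global_w = federated_average(updates)   # redistributed to all clients
```

The reconstruction attacks the paper studies operate on exactly the `updates` exchanged here, which is why the noising and compression defenses are applied to them before transmission.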
Problem

Research questions and friction points this paper is trying to address.

federated learning
privacy-preserving
edge devices
data reconstruction
computer vision
Innovation

Methods, ideas, or system contributions that make the work stand out.

federated learning
privacy-preserving
edge computing
gradient compression
homomorphic encryption