🤖 AI Summary
To address the single point of failure, privacy leakage, and poor scalability of collaborative learning in IoT environments, this paper proposes a decentralized architecture integrating Federated Split Learning (FSL) with a permissioned blockchain. Built on Hyperledger Fabric, it introduces a smart-contract-driven distributed model aggregation mechanism, leveraging Private Data Collections (PDCs) and transient fields to ensure local data never leaves its domain—thereby eliminating reliance on a central server while preserving both privacy and scalability. Experimental evaluation on CIFAR-10 and MNIST demonstrates that the approach achieves accuracy comparable to centralized FSL, significantly outperforms Ethereum-based alternatives in training efficiency, and incurs negligible blockchain overhead, indicating the design is viable for enterprise deployment.
📝 Abstract
Collaborative machine learning in sensitive domains demands scalable, privacy-preserving solutions for enterprise deployment. Conventional Federated Learning (FL) relies on a central server, introducing single points of failure and privacy risks, while Split Learning (SL) partitions models for privacy but scales poorly due to sequential training. We present a decentralized architecture that combines Federated Split Learning (FSL) with the permissioned blockchain Hyperledger Fabric (HLF). Our chaincode orchestrates FSL's split model execution and peer-to-peer aggregation without any central coordinator, leveraging HLF's transient fields and Private Data Collections (PDCs) to keep raw data and model activations private. On CIFAR-10 and MNIST benchmarks, HLF-FSL matches centralized FSL accuracy while reducing per-epoch training time compared to Ethereum-based approaches. Performance and scalability tests show minimal blockchain overhead and preserved accuracy, demonstrating enterprise-grade viability.
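To make the FSL mechanism concrete, here is a minimal sketch of the two ideas the abstract combines: each client trains only the client-side portion of a split model locally (sending activations, never raw data, across the split point), and after a round the clients' weights are averaged peer-to-peer. This is a toy illustration with made-up numbers, not the paper's chaincode implementation; the function name `fedavg` and the example weights are assumptions for illustration only.

```python
def fedavg(weight_sets):
    """Element-wise average of each parameter across clients (FedAvg).

    weight_sets: list of per-client weight lists, all the same length.
    In the paper's design this aggregation is driven by smart contracts
    rather than a central server; the arithmetic is the same.
    """
    n = len(weight_sets)
    return [sum(ws[i] for ws in weight_sets) / n
            for i in range(len(weight_sets[0]))]

# Client-side layer weights after one round of local training
# (illustrative values chosen to average exactly).
client_a = [0.25, 0.50, 0.75]
client_b = [0.75, 1.00, 1.25]

global_client_side = fedavg([client_a, client_b])
print(global_client_side)  # → [0.5, 0.75, 1.0]
```

In the full system, only these client-side weights and the cut-layer activations ever leave a client; the raw training data stays local, and the aggregation step above is coordinated by chaincode instead of a central parameter server.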