Percepta: High Performance Stream Processing at the Edge

📅 2025-10-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Edge environments face significant challenges in processing multi-source, heterogeneous IoT data streams and enabling real-time closed-loop AI—particularly reinforcement learning (RL)—due to protocol incompatibility, disparate sampling rates, data loss, and high latency. Method: The paper proposes Percepta, a lightweight, edge-AI–oriented data stream processing architecture that tightly integrates online reward computation, caching of data for model retraining, and real-time feature engineering. It supports multi-source synchronization, adaptive normalization, protocol translation, and fault-tolerant recovery, all within an edge computing framework, enabling tightly coupled inference-and-learning loops. Contribution/Results: Evaluated on resource-constrained edge devices, the architecture achieves sub-100-ms end-to-end latency, significantly improves data robustness, and enhances the stability of AI model self-adaptation and optimization, providing a deployable foundational infrastructure for dynamic edge intelligence.
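The coupling of online reward computation with a retraining cache described above can be sketched as follows. This is an illustrative assumption, not Percepta's actual API: the class name `RewardStream`, the latency-based reward formula, and the cache size are all hypothetical.

```python
from collections import deque


class RewardStream:
    """Hypothetical sketch of online reward computation with a
    bounded retraining cache, in the spirit of the summary above.
    The reward formula and cache policy are illustrative assumptions."""

    def __init__(self, target_latency_ms=100.0, cache_size=1000):
        self.target = target_latency_ms
        # Bounded cache of (observation, reward) pairs kept for
        # later model retraining; old entries are evicted FIFO.
        self.retrain_cache = deque(maxlen=cache_size)

    def observe(self, observation):
        # Reward the RL loop for keeping end-to-end latency under
        # the target; decay proportionally when it is exceeded.
        latency = observation["latency_ms"]
        reward = 1.0 if latency <= self.target else self.target / latency
        self.retrain_cache.append((observation, reward))
        return reward


stream = RewardStream()
stream.observe({"latency_ms": 50.0})   # under the 100 ms target
stream.observe({"latency_ms": 200.0})  # over target, reduced reward
```

Keeping the cache bounded (here via `deque(maxlen=...)`) matters on resource-constrained edge devices, where an unbounded replay buffer would exhaust memory.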

📝 Abstract
The rise of real-time data and the proliferation of Internet of Things (IoT) devices have highlighted the limitations of cloud-centric solutions, particularly regarding latency, bandwidth, and privacy. These challenges have driven the growth of Edge Computing. IoT also brings a set of related problems: data rate harmonization between multiple sources, protocol conversion, handling data loss, and integration with Artificial Intelligence (AI) models. This paper presents Percepta, a lightweight Data Stream Processing (DSP) system tailored to support AI workloads at the edge, with a particular focus on Reinforcement Learning (RL). It introduces specialized features such as reward function computation, data storage for model retraining, and real-time data preparation to support continuous decision-making. Additional functionalities include data normalization, harmonization across heterogeneous protocols and sampling rates, and robust handling of missing or incomplete data, making it well suited to the challenges of edge-based AI deployment.
Problem

Research questions and friction points this paper is trying to address.

Addressing latency and bandwidth limitations in IoT edge computing
Handling data harmonization across heterogeneous protocols and rates
Supporting AI model retraining and real-time decision making
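The harmonization friction point above (sources emitting at disparate rates) can be illustrated with a small sketch: aligning multiple `(timestamp_ms, value)` streams onto one shared clock using a zero-order hold (carrying the last seen sample forward). The function name and the hold policy are assumptions for illustration; the paper's actual strategy may differ.

```python
def harmonize(streams, period_ms, horizon_ms):
    """Align multiple (timestamp_ms, value) streams onto a shared clock.

    Illustrative sketch: a zero-order hold (repeat the last seen sample)
    is one simple harmonization policy among many.
    """
    aligned = []
    idx = [0] * len(streams)    # read cursor per stream
    held = [None] * len(streams)  # last value seen per stream
    for t in range(0, horizon_ms + 1, period_ms):
        row = [t]
        for i, s in enumerate(streams):
            # Advance to the newest sample at or before tick t.
            while idx[i] < len(s) and s[idx[i]][0] <= t:
                held[i] = s[idx[i]][1]
                idx[i] += 1
            row.append(held[i])
        aligned.append(tuple(row))
    return aligned


fast = [(0, 1.0), (100, 1.1), (200, 1.2)]  # ~10 Hz sensor
slow = [(0, 20.0), (250, 21.0)]            # ~4 Hz sensor
rows = harmonize([fast, slow], period_ms=100, horizon_ms=200)
# rows -> [(0, 1.0, 20.0), (100, 1.1, 20.0), (200, 1.2, 20.0)]
```

The slow sensor's value is simply repeated until its next sample arrives, which keeps every output row complete regardless of source rates.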
Innovation

Methods, ideas, or system contributions that make the work stand out.

Lightweight DSP system for edge AI workloads
Reward computation and real-time data preparation
Handles heterogeneous protocols and missing data
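Two of the contributions listed above, adaptive normalization and missing-data handling, can be combined in a brief sketch. The interpolation-based gap repair and min-max rescaling here are hypothetical choices for illustration, not the mechanisms the paper specifies.

```python
def normalize_stream(values, eps=1e-9):
    """Min-max normalization with simple gap repair.

    Sketch under assumptions: interior missing samples (None) are filled
    by linear interpolation between their nearest present neighbours;
    the paper may use a different recovery policy.
    """
    filled = list(values)
    for i, v in enumerate(filled):
        if v is None:
            # Nearest non-missing neighbours on each side.
            lo = next(j for j in range(i - 1, -1, -1) if filled[j] is not None)
            hi = next(j for j in range(i + 1, len(filled)) if filled[j] is not None)
            frac = (i - lo) / (hi - lo)
            filled[i] = filled[lo] + frac * (filled[hi] - filled[lo])
    # Rescale to [0, 1] over the observed range; eps avoids divide-by-zero.
    lo_v, hi_v = min(filled), max(filled)
    return [(v - lo_v) / (hi_v - lo_v + eps) for v in filled]


normalize_stream([10.0, None, 30.0])  # gap filled before rescaling
```

Repairing gaps before normalization keeps downstream AI features defined on every tick, which is what makes continuous decision-making over lossy sensor streams workable.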