Evaluating Machine Learning-Driven Intrusion Detection Systems in IoT: Performance and Energy Consumption

📅 2025-04-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing machine learning (ML) and deep learning (DL) intrusion detection systems (IDS) for IoT edge environments lack empirical, multi-dimensional evaluation of performance–energy trade-offs under realistic workloads.

Method: This study conducts the first systematic measurement of CPU utilization, energy consumption, and inference latency of ML/DL-based IDSs on real edge platforms under both benign and adversarial network traffic, while investigating the impact of software-defined networking (SDN) on dynamic resource orchestration and detection efficacy. The framework integrates SDN-based centralized control, real-time traffic emulation, multi-dimensional system monitoring, and ANOVA-based statistical validation.

Results: Under attack, ML-based IDSs exhibit 47% higher average CPU utilization and 39% higher energy consumption; SDN reduces detection latency by 22% but incurs 8–15% control-plane overhead; and DL models improve accuracy by 6.2% yet double the inference energy cost. The work establishes empirical performance–energy trade-off patterns and SDN-mediated optimization mechanisms, providing foundational insights for designing lightweight, energy-aware edge security architectures.
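The summary cites ANOVA-based statistical validation of the resource-consumption differences between scenarios. A minimal sketch of the underlying one-way ANOVA F-statistic in pure Python; the CPU-utilization samples below are illustrative placeholders, not the paper's measurements:

```python
# Hedged sketch: one-way ANOVA F-statistic, the kind of test the paper
# uses to validate scenario differences. All data here is illustrative.

def f_oneway(*groups):
    """Return the one-way ANOVA F-statistic for two or more sample groups."""
    k = len(groups)                         # number of groups
    n = sum(len(g) for g in groups)         # total number of samples
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (variation explained by the scenario)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares (residual variation inside each scenario)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical CPU-utilization samples (%) under benign vs. attack traffic
benign = [21.0, 23.5, 22.1, 20.8, 24.2]
attack = [61.3, 58.9, 64.1, 60.7, 63.0]
print(f"F = {f_oneway(benign, attack):.1f}")
```

A large F relative to the critical value for (k−1, n−k) degrees of freedom indicates the between-scenario difference is unlikely to be noise; in practice one would use `scipy.stats.f_oneway`, which also returns the p-value.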

📝 Abstract
In the evolving landscape of the Internet of Things (IoT), Machine Learning (ML)-based Intrusion Detection Systems (IDS) represent a significant advancement, especially when integrated with Software-Defined Networking (SDN). These systems play a critical role in enhancing security infrastructure within resource-constrained IoT systems. Despite their growing adoption, limited research has explored the impact of ML-based IDS on key performance metrics, such as CPU load, CPU usage, and energy consumption, particularly under real-time cyber threats. This study bridges that gap through an empirical evaluation of cutting-edge ML-based IDSs deployed at the edge of IoT networks under both benign and attack scenarios. Additionally, we investigate how SDN's centralized control and dynamic resource management influence IDS performance. Our experimental framework compares traditional ML-based IDS with deep learning (DL)-based counterparts, both with and without SDN integration. Results reveal that edge-deployed ML-based IDSs significantly impact system performance during cyber threats, with marked increases in resource consumption. SDN integration further influences these outcomes, emphasizing the need for optimized architectural design. Statistical analysis using ANOVA confirms the significance of our findings. This research provides critical insights into the performance and trade-offs of deploying ML-based IDSs in edge-based IoT systems.
Problem

Research questions and friction points this paper is trying to address.

Evaluating ML-based IDS performance in IoT under cyber threats
Assessing energy and CPU impact of edge-deployed IDS solutions
Analyzing SDN integration effects on intrusion detection efficiency
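Assessing the CPU and latency impact of an edge-deployed IDS comes down to profiling per-inference wall-clock and CPU time. A stdlib-only sketch of that measurement loop; `mock_inference` is a hypothetical stand-in for an IDS model's per-packet classifier, not the paper's:

```python
import time

def profile_inference(fn, *args, repeats=1000):
    """Average per-call wall-clock latency and process CPU time of fn."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    for _ in range(repeats):
        fn(*args)
    wall = (time.perf_counter() - wall0) / repeats
    cpu = (time.process_time() - cpu0) / repeats
    return wall, cpu

# Hypothetical stand-in for a lightweight ML-IDS decision function
def mock_inference(features):
    weights = [0.3, -0.2, 0.5]
    return sum(w * x for w, x in zip(weights, features)) > 0.0

wall_s, cpu_s = profile_inference(mock_inference, [1.0, 0.5, 0.2])
print(f"latency={wall_s * 1e6:.2f} us/call, cpu={cpu_s * 1e6:.2f} us/call")
```

Comparing the wall-clock and CPU-time figures under benign versus attack traffic is one way to quantify the overhead the paper reports; energy would additionally require a platform power sensor or an external meter.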
Innovation

Methods, ideas, or system contributions that make the work stand out.

ML-based IDS for IoT security enhancement
SDN integration for dynamic IDS resource management
Edge deployment evaluated under real-time cyber threats