DualNILM: Energy Injection Identification Enabled Disaggregation with Deep Multi-Task Learning

📅 2025-08-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional Non-Intrusive Load Monitoring (NILM) methods rely solely on aggregate electricity meter data; however, their performance degrades significantly in residential settings with widespread integration of distributed energy resources (DERs)—such as photovoltaics and battery storage—whose injected power obscures appliance-level signatures. To address this, we propose DualNILM, the first framework that jointly models appliance state detection and DER injection detection as a cooperative multi-task learning problem. Methodologically, DualNILM introduces a Transformer-based dual-path architecture that simultaneously performs sequence-to-point (state classification) and sequence-to-sequence (power disaggregation) modeling, explicitly capturing multi-scale temporal dependencies. Evaluated on both a newly constructed real-world dataset and synthetic benchmarks, DualNILM consistently outperforms state-of-the-art NILM approaches across all key metrics for both tasks. This advancement enhances the robustness and interpretability of load monitoring in source-integrated distribution systems.


📝 Abstract
Non-Intrusive Load Monitoring (NILM) offers a cost-effective method to obtain fine-grained appliance-level energy consumption in smart homes and building applications. However, the increasing adoption of behind-the-meter energy sources, such as solar panels and battery storage, poses new challenges for conventional NILM methods that rely solely on at-the-meter data. The energy injected by behind-the-meter sources can obscure the power signatures of individual appliances, leading to a significant decline in NILM performance. To address this challenge, we present DualNILM, a deep multi-task learning framework designed for the dual tasks of appliance state recognition and injected energy identification in NILM. By integrating sequence-to-point and sequence-to-sequence strategies within a Transformer-based architecture, DualNILM effectively captures multi-scale temporal dependencies in aggregate power consumption patterns, allowing for accurate appliance state recognition and energy injection identification. We validate DualNILM on both self-collected and synthesized open NILM datasets that include appliance-level energy consumption as well as energy injection. Extensive experimental results demonstrate that DualNILM maintains excellent performance on both tasks, substantially outperforming conventional methods.
Problem

Research questions and friction points this paper is trying to address.

Identifies appliance states and injected energy from aggregate meter data
Addresses the performance decline caused by behind-the-meter energy sources
Frames disaggregation and injection identification as a joint multi-task learning problem
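To make the friction point concrete, here is a hypothetical toy example (the numbers, appliances, and variable names are our own illustration, not from the paper) of how behind-the-meter injection distorts the step edges that classic event-based NILM relies on:

```python
# A net meter records appliance load MINUS behind-the-meter injection,
# so PV export masks the appliance signatures NILM looks for.
T = 8
fridge = [120.0] * T                                # steady 120 W base load
kettle = [0, 0, 2000, 2000, 0, 0, 0, 0]             # 2 kW burst at steps 2-3
pv_injection = [0, 500, 500, 500, 500, 500, 0, 0]   # solar export window

true_load = [f + k for f, k in zip(fridge, kettle)]           # ideal NILM input
net_meter = [t - p for t, p in zip(true_load, pv_injection)]  # actual meter reading

def edges(series):
    """First differences -- the step events an edge-based detector keys on."""
    return [b - a for a, b in zip(series, series[1:])]

print(edges(true_load))  # only the kettle's +/-2000 W steps
print(edges(net_meter))  # extra +/-500 W steps introduced by PV ramps
```

The PV export window adds spurious ±500 W edges and pushes the baseline negative, which is exactly the signature-obscuring effect that motivates DualNILM's injection-identification task.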
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep multi-task learning framework
Transformer-based architecture integration
Sequence-to-point and sequence-to-sequence strategies
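The dual-path idea can be sketched as a shared encoder feeding two task heads: a sequence-to-point head that emits one appliance-state label per window, and a sequence-to-sequence head that emits one injected-power estimate per time step. The sketch below is a minimal, assumed interface only; the `encoder`, both heads, and all thresholds are illustrative stand-ins, not the paper's implementation:

```python
import math

def encoder(window):
    """Stand-in for the shared Transformer encoder: here, simply a
    z-normalized copy of the window (one feature per time step)."""
    mean = sum(window) / len(window)
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / len(window)) or 1.0
    return [(x - mean) / std for x in window]

def seq2point_head(features, threshold=0.5):
    """Sequence-to-point: classify the appliance state at the window midpoint."""
    return int(features[len(features) // 2] > threshold)

def seq2seq_head(features, scale=500.0):
    """Sequence-to-sequence: regress an injected-power value for every step
    (here, naively treating below-mean readings as injection)."""
    return [max(0.0, -f) * scale for f in features]

window = [120, -380, 1620, 1620, -380, -380, 120, 120]  # net-meter watts
z = encoder(window)                 # shared representation
state = seq2point_head(z)           # single label for the window
injection = seq2seq_head(z)         # one estimate per time step
print(state, [round(v, 1) for v in injection])
```

In the paper's framework both heads are trained jointly on the shared representation, so the injection estimate can correct the state classifier and vice versa; the toy heads above only illustrate the output shapes of the two tasks.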
Xudong Wang
School of Data Science, The Chinese University of Hong Kong, Shenzhen, China
Guoming Tang
The Hong Kong University of Science and Technology (Guangzhou)
Sustainable Computing / AI Cloud / Edge Computing / AI4Sus
Junyu Xue
Department of Computer Science and Engineering, Southern University of Science and Technology, China
Srinivasan Keshav
Professor of Computer Science, University of Cambridge
Computer networking / Energy systems
Tongxin Li
School of Data Science, The Chinese University of Hong Kong, Shenzhen, China
Chris Ding
School of Data Science, The Chinese University of Hong Kong, Shenzhen, China