Prompting Large Language Models for Training-Free Non-Intrusive Load Monitoring
Non-intrusive load monitoring (NILM) aims to disaggregate aggregate household electricity consumption into individual appliance usage, thereby enabling more effective energy management. While deep learning has advanced NILM, it remains limited by its dependence on labeled data, restricted generalization, and lack of explainability. This paper introduces the first prompt-based NILM framework that leverages large language models (LLMs) with in-context learning. We design and evaluate prompt strategies that integrate appliance features, timestamps, and contextual information, as well as representative time-series examples, on widely used open datasets. With optimized prompts, LLMs achieve competitive state detection accuracy and demonstrate robust generalization without the need for fine-tuning. LLMs also enhance explainability by providing clear, human-readable explanations for their predictions. Our results show that LLMs can reduce data requirements, improve adaptability, and provide transparent energy disaggregation in NILM applications.
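The abstract describes prompts that combine appliance features, timestamps, contextual information, and representative time-series examples. A minimal sketch of such a prompt-construction step is shown below; the helper name `build_nilm_prompt` and all field names are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of an in-context-learning prompt for training-free
# NILM state detection: appliance features and timestamp give context,
# labeled example windows serve as few-shot demonstrations, and the LLM
# is asked to classify the state for the target window of readings.

def build_nilm_prompt(appliance, examples, target_window, timestamp):
    """Compose a prompt asking an LLM whether an appliance is ON or OFF.

    appliance     -- dict of descriptive features (name, typical power in W)
    examples      -- list of (readings, state) few-shot demonstrations
    target_window -- aggregate power readings (watts) to classify
    timestamp     -- human-readable time context for the target window
    """
    lines = [
        f"Appliance: {appliance['name']} "
        f"(typical power: {appliance['typical_power_w']} W).",
        "Decide whether the appliance is ON or OFF given the aggregate readings.",
        "",
    ]
    for readings, state in examples:  # in-context demonstrations
        lines.append(f"Readings: {readings} -> State: {state}")
    lines += [
        "",
        f"Time: {timestamp}",
        f"Readings: {target_window} -> State:",
    ]
    return "\n".join(lines)


prompt = build_nilm_prompt(
    appliance={"name": "kettle", "typical_power_w": 2000},
    examples=[([50, 60, 55], "OFF"), ([2100, 2080, 2050], "ON")],
    target_window=[2020, 2045, 2010],
    timestamp="2014-03-01 08:15",
)
print(prompt)
```

The returned string would then be sent to an LLM; the model's free-text answer also serves as the human-readable explanation the abstract highlights.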
Shicheng Liu, Guoming Tang, Yi Wang, Junyu Xue, Xudong Wang, Xiaoling He
Subjects: electrical measurement techniques and instruments; automation techniques and equipment
Shicheng Liu, Guoming Tang, Yi Wang, Junyu Xue, Xudong Wang, Xiaoling He. Prompting Large Language Models for Training-Free Non-Intrusive Load Monitoring [EB/OL]. (2025-05-09) [2025-07-16]. https://arxiv.org/abs/2505.06330.