ELECTRIcity: An Efficient Transformer for Non-Intrusive Load Monitoring


Bibliographic Details
Main Authors: Doulamis, A. (Author), Doulamis, N. (Author), Kaselimi, M. (Author), Sykiotis, S. (Author)
Format: Article
Language: English
Published: MDPI 2022
Subjects:
Online Access: View Fulltext in Publisher
LEADER 02583nam a2200385Ia 4500
001 10-3390-s22082926
008 220425s2022 CNT 000 0 und d
022 |a 1424-8220 (ISSN) 
245 1 0 |a ELECTRIcity: An Efficient Transformer for Non-Intrusive Load Monitoring 
260 0 |b MDPI  |c 2022 
856 |z View Fulltext in Publisher  |u https://doi.org/10.3390/s22082926 
520 3 |a Non-Intrusive Load Monitoring (NILM) describes the process of inferring the consumption pattern of individual appliances by only having access to the aggregated household signal. Sequence-to-sequence deep learning models have been firmly established as state-of-the-art approaches for NILM, in an attempt to identify the pattern of an appliance's power consumption signal within the aggregated power signal. Moving beyond the limitations of recurrent models that have been widely used in sequential modeling, this paper proposes a transformer-based architecture for NILM. Our approach, called ELECTRIcity, utilizes transformer layers to accurately estimate the power signal of domestic appliances by relying entirely on attention mechanisms to extract global dependencies between the aggregate and the domestic appliance signals. A further advantage of the proposed model is that ELECTRIcity works with minimal dataset pre-processing and without requiring data balancing. Furthermore, ELECTRIcity introduces an efficient training routine compared to other traditional transformer-based architectures: model training is split into unsupervised pre-training and downstream task fine-tuning, which yields improvements in both predictive accuracy and training time. Experimental results indicate ELECTRIcity’s superiority compared to several state-of-the-art methods. © 2022 by the authors. Licensee MDPI, Basel, Switzerland. 
650 0 4 |a Consumption patterns 
650 0 4 |a deep learning 
650 0 4 |a Deep learning 
650 0 4 |a Disaggregation 
650 0 4 |a Domestic appliances 
650 0 4 |a Electric load management 
650 0 4 |a Energy 
650 0 4 |a energy disaggregation 
650 0 4 |a Energy disaggregation 
650 0 4 |a imbalanced data 
650 0 4 |a Imbalanced data 
650 0 4 |a NILM 
650 0 4 |a non-intrusive load monitoring 
650 0 4 |a Nonintrusive load monitoring 
650 0 4 |a Power signals 
650 0 4 |a Transformer 
650 0 4 |a transformers 
700 1 |a Doulamis, A.  |e author 
700 1 |a Doulamis, N.  |e author 
700 1 |a Kaselimi, M.  |e author 
700 1 |a Sykiotis, S.  |e author 
773 |t Sensors
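The abstract above describes a model that relies entirely on attention mechanisms to extract global dependencies between the aggregate and appliance signals. As a rough illustration of that core operation only (not the authors' actual ELECTRIcity implementation; all names, dimensions, and data here are assumptions), scaled dot-product self-attention over a toy aggregate-power window can be sketched in NumPy:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Core transformer operation: each output time step is a
    softmax-weighted mix of all input steps, so dependencies are
    captured globally across the window rather than recurrently."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (T, T) pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over time steps
    return weights @ v                              # (T, d_v) attended output

# Toy window: 6 time steps of a hypothetical 4-dim aggregate-signal embedding
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 4))
out = scaled_dot_product_attention(x, x, x)         # self-attention
print(out.shape)
```

In a full transformer layer this operation is wrapped with learned projections, multiple heads, and feed-forward sublayers; the sketch only shows why every time step can attend to every other, which is the property the abstract contrasts with recurrent models.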