Sequence-to-Sequence Model with Transformer-based Attention Mechanism and Temporal Pooling for Non-Intrusive Load Monitoring

06/08/2023
by Mohammad Irani Azad, et al.

This paper presents a novel Sequence-to-Sequence (Seq2Seq) model that combines a transformer attention mechanism with temporal pooling for Non-Intrusive Load Monitoring (NILM) of smart buildings. The aim is to improve NILM accuracy with a deep learning approach: the transformer attention mechanism captures the long-term dependencies in the NILM data, while temporal pooling further improves accuracy by capturing both the steady-state and transient behavior of appliances. The proposed method is evaluated on a publicly available dataset and compared against other state-of-the-art NILM techniques, and the results show that it outperforms the existing methods in both accuracy and computational efficiency.
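As a concrete illustration of the architecture described above, the following is a minimal PyTorch sketch: a transformer encoder supplies the attention over a window of aggregate power readings, and multi-scale average pooling stands in for the temporal pooling. The layer sizes, pooling scales, and the Conv1d input embedding are illustrative assumptions; the abstract does not specify the paper's exact configuration.

import torch
import torch.nn as nn

class Seq2SeqNILM(nn.Module):
    """Sketch of a Seq2Seq NILM model: transformer attention over an
    aggregate-power window, plus temporal pooling at several scales to
    capture steady-state and transient appliance behavior.
    All hyperparameters below are assumptions, not the paper's values."""

    def __init__(self, d_model=64, nhead=4, num_layers=2, pool_scales=(2, 4, 8)):
        super().__init__()
        # Embed the 1-D aggregate power signal into d_model channels.
        self.embed = nn.Conv1d(1, d_model, kernel_size=3, padding=1)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Temporal pooling branches: average-pool at multiple scales,
        # then upsample back to the input length and concatenate.
        self.pools = nn.ModuleList(
            [nn.AvgPool1d(kernel_size=s, stride=s) for s in pool_scales])
        fused = d_model * (1 + len(pool_scales))
        # Seq2Seq head: one appliance-power estimate per input time step.
        self.head = nn.Linear(fused, 1)

    def forward(self, x):                       # x: (batch, seq_len) aggregate power
        seq_len = x.size(1)
        h = self.embed(x.unsqueeze(1))          # (batch, d_model, seq_len)
        h = self.encoder(h.transpose(1, 2))     # (batch, seq_len, d_model)
        feats = [h]
        hc = h.transpose(1, 2)                  # back to (batch, d_model, seq_len)
        for pool in self.pools:
            p = pool(hc)                        # coarser temporal resolution
            p = nn.functional.interpolate(
                p, size=seq_len, mode="linear", align_corners=False)
            feats.append(p.transpose(1, 2))
        fused = torch.cat(feats, dim=-1)        # multi-scale features per step
        return self.head(fused).squeeze(-1)     # (batch, seq_len) appliance power

model = Seq2SeqNILM()
window = torch.randn(8, 256)                    # 8 windows of 256 samples each
pred = model(window)                            # per-step appliance estimates

Multi-scale pooling is one common way to realize temporal pooling: coarse scales smooth over steady-state operation while the unpooled branch preserves sharp transients, and concatenating the branches lets the output head use both.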


Related research

Memory Attentive Fusion: External Language Model Integration for Transformer-based Sequence-to-Sequence Model (10/29/2020)
This paper presents a novel fusion method for integrating an external la...

R-Transformer: Recurrent Neural Network Enhanced Transformer (07/12/2019)
Recurrent Neural Networks have long been the dominating choice for seque...

Transformer Hawkes Process (02/21/2020)
Modern data acquisition routinely produce massive amounts of event seque...

Short-Term Load Forecasting for Smart Home Appliances with Sequence to Sequence Learning (06/26/2021)
Appliance-level load forecasting plays a critical role in residential en...

Poformer: A simple pooling transformer for speaker verification (10/10/2021)
Most recent speaker verification systems are based on extracting speaker...

Short-Term Load Forecasting Using Time Pooling Deep Recurrent Neural Network (09/26/2021)
Integration of renewable energy sources and emerging loads like electric...

PS-Transformer: Learning Sparse Photometric Stereo Network using Self-Attention Mechanism (11/21/2022)
Existing deep calibrated photometric stereo networks basically aggregate...
