GLassoformer: A Query-Sparse Transformer for Post-Fault Power Grid Voltage Prediction

01/22/2022
by Yunling Zheng, et al.

We propose GLassoformer, a novel and efficient transformer architecture that leverages group Lasso regularization to reduce the number of queries in the standard self-attention mechanism. Thanks to the sparsified queries, GLassoformer is more computationally efficient than standard transformers. On the post-fault power grid voltage prediction task, GLassoformer outperforms many existing benchmark algorithms in both prediction accuracy and stability.
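The abstract does not give implementation details, but the core idea it describes, a group Lasso (L2,1) penalty that drives whole query groups toward zero so that attention only needs to be evaluated for the surviving queries, can be sketched as follows. This is a minimal illustration under assumptions: the class name QuerySparseSelfAttention, the choice of one group per query position, and the regularization weight lam are illustrative, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuerySparseSelfAttention(nn.Module):
    """Single-head self-attention whose queries carry a group Lasso
    (L2,1) penalty: each query position forms one group, so entire
    query vectors can be shrunk to zero and skipped at inference."""

    def __init__(self, d_model: int):
        super().__init__()
        self.scale = d_model ** -0.5
        self.w_q = nn.Linear(d_model, d_model, bias=False)
        self.w_k = nn.Linear(d_model, d_model, bias=False)
        self.w_v = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, d_model)
        q, k, v = self.w_q(x), self.w_k(x), self.w_v(x)
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        out = attn @ v
        # Group Lasso term: sum over positions of the L2 norm of each
        # query vector (one group per query position).
        penalty = q.norm(dim=-1).sum()
        return out, penalty


# Usage sketch: the penalty is added to the task loss so that training
# shrinks whole query groups toward zero.
if __name__ == "__main__":
    model = QuerySparseSelfAttention(d_model=64)
    x = torch.randn(8, 96, 64)       # illustrative (batch, horizon, feature) shapes
    target = torch.randn(8, 96, 64)
    lam = 1e-3                       # illustrative regularization weight
    out, penalty = model(x)
    loss = F.mse_loss(out, target) + lam * penalty
    loss.backward()
```

In this sketch the sparsity pattern emerges from training; a pruning step that drops zeroed query positions would then yield the computational savings the abstract refers to.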

