Self-Attention for Audio Super-Resolution

Convolutions operate only locally and therefore fail to model global interactions. Self-attention, by contrast, can learn representations that capture long-range dependencies in sequences. We propose a network architecture for audio super-resolution that combines convolution and self-attention. Attention-based Feature-Wise Linear Modulation (AFiLM) uses a self-attention mechanism instead of recurrent neural networks to modulate the activations of the convolutional model. Extensive experiments show that our model outperforms existing approaches on standard benchmarks. Moreover, it allows for more parallelization, resulting in significantly faster training.
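To make the idea concrete, below is a minimal PyTorch sketch of an attention-based feature-wise modulation layer in the spirit of AFiLM: activations from a convolutional stack are pooled into blocks, a self-attention layer models long-range interactions across blocks, and the attended features produce per-block scale and shift parameters that modulate the original activations. The block size, pooling choice, and use of `nn.MultiheadAttention` are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of an attention-based FiLM layer (AFiLM-style); hyperparameters and
# module choices are assumptions for illustration, not the authors' code.
import torch
import torch.nn as nn


class AttentionFiLM(nn.Module):
    """Modulate conv activations with FiLM parameters from self-attention."""

    def __init__(self, channels: int, block_size: int, num_heads: int = 4):
        super().__init__()
        self.block_size = block_size
        # One pooled summary vector per temporal block.
        self.pool = nn.MaxPool1d(kernel_size=block_size)
        # Self-attention over the block sequence replaces the RNN used in TFiLM.
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        # Project attended features to per-block scale (gamma) and shift (beta).
        self.to_film = nn.Linear(channels, 2 * channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); assumes time is divisible by block_size.
        pooled = self.pool(x).transpose(1, 2)          # (batch, blocks, channels)
        attended, _ = self.attn(pooled, pooled, pooled)  # long-range context
        gamma, beta = self.to_film(attended).chunk(2, dim=-1)
        # Broadcast the per-block parameters back to the full time resolution.
        gamma = gamma.transpose(1, 2).repeat_interleave(self.block_size, dim=2)
        beta = beta.transpose(1, 2).repeat_interleave(self.block_size, dim=2)
        # Feature-wise linear modulation of the convolutional activations.
        return gamma * x + beta


if __name__ == "__main__":
    layer = AttentionFiLM(channels=64, block_size=16)
    features = torch.randn(2, 64, 256)  # e.g. activations from a conv block
    print(layer(features).shape)        # torch.Size([2, 64, 256])
```

Because the modulation parameters come from self-attention rather than a recurrent network, all blocks are processed in parallel, which is the source of the training speedup the abstract mentions.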

