Memory Controlled Sequential Self Attention for Sound Recognition

05/13/2020
by Arjun Pankajakshan et al.

In this paper we investigate the importance of the extent of memory in sequential self attention for sound recognition. We propose a memory controlled sequential self attention mechanism on top of a convolutional recurrent neural network (CRNN) model for polyphonic sound event detection (SED). Experiments on the URBAN-SED dataset demonstrate the impact of the extent of memory on recognition performance with the self-attention-induced SED model. We extend the proposed idea with a multi-head self attention mechanism, where each attention head processes the audio embedding with an explicit attention width value. The proposed memory controlled sequential self attention offers a way to induce relations among frames of sound event tokens. We show that our memory controlled self attention model achieves an event-based F-score of 33.92%, compared to 20.10% for the model without self attention.
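The core idea of memory controlled self attention, as the abstract describes it, is to restrict each audio frame's attention to a window of neighbouring frames whose size (the attention width) sets the extent of memory. The sketch below illustrates one plausible single-head form of this; the function name, the use of raw embeddings as queries/keys/values (no learned projections), and the symmetric window are simplifying assumptions, not the paper's exact formulation.

```python
import numpy as np

def memory_controlled_self_attention(x, width):
    """Scaled dot-product self-attention over frame embeddings, with
    attention restricted to a local memory window (illustrative sketch).

    x     : (T, d) array of audio frame embeddings
    width : attention width -- each frame attends only to frames within
            `width` steps of itself, controlling the extent of memory
    """
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)                       # (T, T) similarities
    # Mask out frames outside each query frame's memory window
    idx = np.arange(T)
    mask = np.abs(idx[:, None] - idx[None, :]) <= width
    scores = np.where(mask, scores, -np.inf)
    # Row-wise softmax (the diagonal is always in-window, so each row
    # has at least one finite score)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ x                                  # (T, d) attended output

def multi_head_widths(x, widths):
    """Multi-head variant: one head per explicit attention width, with
    head outputs concatenated along the feature axis (an assumption
    about how the paper combines heads)."""
    return np.concatenate(
        [memory_controlled_self_attention(x, w) for w in widths], axis=1
    )
```

Sweeping `width` from small to large interpolates between highly local attention and full sequence-wide attention, which is the knob the paper's experiments vary.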

