Don't freeze: Finetune encoders for better Self-Supervised HAR

07/03/2023
by Vitor Fortes Rey, et al.

Recently, self-supervised learning has been proposed in the field of human activity recognition as a solution to the labelled data availability problem: by using pretext tasks such as reconstruction or contrastive predictive coding, useful representations can be learned and then used for classification. These approaches follow the pretrain, freeze, and fine-tune procedure. In this paper we show how a simple change, not freezing the representation, leads to substantial performance gains across pretext tasks. The improvement is found in all four investigated datasets and across all four pretext tasks, and is inversely proportional to the amount of labelled data. Moreover, the effect is present whether the pretext task is carried out on the Capture24 dataset or directly on unlabelled data of the target dataset.
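The following is a minimal sketch (in PyTorch, which the abstract does not specify) of the change being studied: under the standard "pretrain, freeze, fine-tune" protocol only the classifier head is optimised, whereas in the "don't freeze" variant the pretrained encoder's parameters are also passed to the optimiser during downstream classification. The encoder architecture, checkpoint name, class count, and window shape are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Toy 1D-CNN encoder for windows of inertial sensor data (illustrative only)."""
    def __init__(self, in_channels: int = 3, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, feat_dim)
        return self.net(x)

encoder = Encoder()
# encoder.load_state_dict(torch.load("pretrained_encoder.pt"))  # hypothetical pretext-task checkpoint
classifier = nn.Linear(128, 6)  # e.g. 6 activity classes

# Standard protocol: freeze the encoder, train only the classifier head.
# for p in encoder.parameters():
#     p.requires_grad = False
# optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)

# "Don't freeze": fine-tune encoder and classifier jointly on the labelled windows.
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3
)
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 3, 100)      # dummy batch: 8 windows, 3 sensor axes, 100 samples
y = torch.randint(0, 6, (8,))   # dummy activity labels
loss = criterion(classifier(encoder(x)), y)
loss.backward()
optimizer.step()
```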

Related research

08/26/2022  Self-Supervised Human Activity Recognition with Localized Time-Frequency Contrastive Representation Learning
05/20/2022  Contrastive Learning with Cross-Modal Knowledge Mining for Multimodal Human Activity Recognition
09/26/2022  Self-supervised similarity models based on well-logging data
06/11/2021  A comprehensive solution to retrieval-based chatbot construction
10/05/2022  RankMe: Assessing the downstream performance of pretrained self-supervised representations by their rank
06/24/2020  PredNet and Predictive Coding: A Critical Review
11/30/2019  Probing the State of the Art: A Critical Look at Visual Representation Evaluation
