Accoustate: Auto-annotation of IMU-generated Activity Signatures under Smart Infrastructure

12/08/2021
by Soumyajit Chatterjee, et al.

Human activities within smart infrastructures generate a vast amount of IMU data from the wearables worn by individuals. Many existing studies rely on such sensory data for human activity recognition (HAR); however, one of the major bottlenecks is their reliance on pre-annotated or labeled data. Manual, human-driven annotation is neither scalable nor efficient, whereas existing auto-annotation techniques depend heavily on video signatures. However, video-based auto-annotation requires high computation resources and raises privacy concerns when data from a personal space, like a smart home, is transferred to the cloud. This paper exploits the acoustic signatures generated by human activities to label the wearables' IMU data at the edge, thus mitigating both the resource requirements and the data privacy concerns. We utilize acoustic-based pre-trained HAR models for cross-modal labeling of the IMU data, even when two individuals perform simultaneous but different activities under the same environmental context. We observe that, even during such simultaneous activities, non-overlapping acoustic gaps exist with high probability in the environment's acoustic context, which helps us disentangle the overlapping activity signatures and label them individually. A principled evaluation of the proposed approach on two real-life in-house datasets, further augmented to create a dual-occupant setup, shows that the framework can correctly annotate a significant volume of unlabeled IMU data from both individuals, with accuracies of 82.59% (±17.94%) and 98.32% (±3.68%) for a workshop and a kitchen environment, respectively.
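
To make the cross-modal idea concrete, the sketch below shows one way confidently classified, non-overlapping acoustic windows could be propagated as labels to time-aligned IMU windows. The function annotate_imu, the audio_har_model callable, the confidence threshold, and the motion-energy heuristic for deciding which wearer receives the label are all illustrative assumptions; the paper's actual gap-detection and wearer-association logic is not reproduced here.

    import numpy as np

    # Illustrative threshold: only trust confident acoustic predictions (assumed value).
    CONF_MIN = 0.8

    def annotate_imu(audio_windows, imu_windows_a, imu_windows_b, audio_har_model):
        """Cross-modal labeling sketch: propagate acoustic labels to IMU windows.

        audio_windows   : list of 1-D arrays, one ambient-audio clip per time window
        imu_windows_a/b : lists of (N, 3) accelerometer arrays, time-aligned with
                          audio_windows, one list per wearer
        audio_har_model : callable returning (activity_label, confidence) for a clip
        """
        labels_a, labels_b = [], []
        for audio, imu_a, imu_b in zip(audio_windows, imu_windows_a, imu_windows_b):
            label, conf = audio_har_model(audio)
            if conf < CONF_MIN:
                # Ambiguous or overlapping acoustic context: leave both streams
                # unlabeled and wait for a non-overlapping "acoustic gap".
                labels_a.append(None)
                labels_b.append(None)
                continue
            # Toy wearer-attribution heuristic (an assumption, not the paper's method):
            # assign the acoustically detected activity to the wearer whose IMU
            # shows the larger motion energy in this window.
            if np.var(imu_a) >= np.var(imu_b):
                labels_a.append(label)
                labels_b.append(None)
            else:
                labels_a.append(None)
                labels_b.append(label)
        return labels_a, labels_b

The returned per-window labels would then serve as weak annotations for training or fine-tuning an IMU-based HAR model, with the low-confidence (overlapping) windows simply left unlabeled.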

