An LSTM Recurrent Network for Step Counting

02/10/2018
by Ziyi Chen, et al.

Smartphones equipped with sensors such as an accelerometer and a gyroscope can serve as pedometers and navigation aids. In this paper, we propose an LSTM recurrent network for counting the number of steps taken by both blind and sighted users, based on an annotated smartphone sensor dataset, WeAllWalk. Separate models were trained for sighted people and for blind people walking with a long cane or with a guide dog, using a leave-one-out training modality. The approach achieved a 5% undercount rate.
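
As a rough illustration of the approach described in the abstract, the following is a minimal sketch of an LSTM step counter in PyTorch. It is not the authors' published model: the 6 input channels (3-axis accelerometer plus 3-axis gyroscope), the window length, the hidden size, and the per-window regression target are all assumptions on our part rather than details taken from the paper.

import torch
import torch.nn as nn

class StepCounterLSTM(nn.Module):
    """Toy LSTM that regresses the number of steps in a fixed-length sensor window."""
    def __init__(self, n_channels=6, hidden_size=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, time, channels) window of accelerometer + gyroscope samples
        out, _ = self.lstm(x)
        # Summarize the window with the last hidden state and predict a step count.
        return self.head(out[:, -1, :]).squeeze(-1)

# Example usage with random data standing in for sensor windows:
# 8 windows of 200 samples each (e.g. 2 s at an assumed 100 Hz sampling rate).
model = StepCounterLSTM()
x = torch.randn(8, 200, 6)
true_steps = torch.randint(0, 5, (8,)).float()
loss = nn.functional.mse_loss(model(x), true_steps)
loss.backward()

Since the paper trains models separately per user group with a leave-one-out modality, one way to reproduce that protocol would be to wrap the sketch above in a per-participant cross-validation loop (for instance scikit-learn's LeaveOneGroupOut); that choice is ours, not a detail from the paper.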

Related research:

09/19/2019 · Open Challenges of Blind People using Smartphones
Blind people face significant challenges when using smartphones. The foc...

05/23/2020 · An IoT based Voice Controlled Blind Stick to Guide Blind People
Visually impaired people find it difficult to identify objects in front ...

05/06/2023 · Toucha11y: Making Inaccessible Public Touchscreens Accessible
Despite their growing popularity, many public kiosks with touchscreens a...

12/05/2022 · Indoor room Occupancy Counting based on LSTM and Environmental Sensor
This paper realizes the estimation of classroom occupancy by using the C...

11/15/2017 · People, Penguins and Petri Dishes: Adapting Object Counting Models To New Visual Domains And Object Types Without Forgetting
In this paper we propose a technique to adapt a convolutional neural net...

06/09/2019 · LSTM Networks Can Perform Dynamic Counting
In this paper, we systematically assess the ability of standard recurren...

01/19/2021 · Promoting Self-Efficacy Through an Effective Human-Powered Nonvisual Smartphone Task Assistant
Accessibility assessments typically focus on determining a binary measur...
