RapidHARe: A computationally inexpensive method for real-time human activity recognition from wearable sensors
Recent human activity recognition (HAR) methods based on on-body inertial sensors have achieved increasingly high performance; however, this comes at the expense of longer CPU computation and greater energy consumption. Such complex models may therefore be unsuitable for real-time prediction in mobile systems, e.g., in elder-care support and long-term health-monitoring systems. Here, we present a new method called RapidHARe for real-time human activity recognition, based on modeling the distribution of raw data in a half-second context window using dynamic Bayesian networks. Our method does not employ any dynamic-programming-based algorithms, which are notoriously slow for inference, nor does it employ feature extraction or selection methods. In our comparative tests, we show that RapidHARe is an extremely fast predictor: one and a half times faster than artificial neural network (ANN) methods, and more than eight times faster than recurrent neural networks (RNNs) and hidden Markov models (HMMs). In terms of recognition performance, RapidHARe achieves an F1 score of 94.27% and an accuracy of 98.94%; compared with ANNs, RNNs, and HMMs, it reduces the F1-score error rate by 45%, 65%, and 63%, and the accuracy error rate by 41%, 55%, and 62%, respectively. RapidHARe is therefore suitable for real-time recognition on mobile devices.
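To make the windowed, non-dynamic-programming idea concrete, the following is a minimal sketch (not the paper's actual dynamic Bayesian network) of classifying raw inertial samples in a half-second context window with a simple per-class generative model. The sampling rate, class structure, and the `WindowedGaussianClassifier` name are illustrative assumptions; inference is a single pass over the window with no Viterbi-style recursion.

```python
import numpy as np

# Illustrative sketch only: a per-class diagonal-Gaussian model over the raw
# samples of a half-second window, scored without any dynamic programming.
# The sampling rate and window length below are assumptions, not values
# taken from the paper.
SAMPLING_RATE_HZ = 100          # assumed sensor rate
WINDOW = SAMPLING_RATE_HZ // 2  # half-second context window (50 samples)


class WindowedGaussianClassifier:
    def fit(self, windows, labels):
        # windows: (n_windows, WINDOW, n_channels) raw sensor values
        # labels:  (n_windows,) activity label per window
        labels = np.asarray(labels)
        self.classes_ = np.unique(labels)
        self.means_, self.vars_, self.log_priors_ = {}, {}, {}
        for c in self.classes_:
            x = windows[labels == c].reshape(-1, windows.shape[-1])
            self.means_[c] = x.mean(axis=0)
            self.vars_[c] = x.var(axis=0) + 1e-6   # avoid zero variance
            self.log_priors_[c] = np.log((labels == c).mean())
        return self

    def predict(self, window):
        # window: (WINDOW, n_channels); sum per-sample log-likelihoods
        scores = {}
        for c in self.classes_:
            m, v = self.means_[c], self.vars_[c]
            ll = -0.5 * (np.log(2 * np.pi * v) + (window - m) ** 2 / v)
            scores[c] = self.log_priors_[c] + ll.sum()
        return max(scores, key=scores.get)


# Toy usage with synthetic accelerometer-like data (3 channels).
rng = np.random.default_rng(0)
train_windows = rng.normal(size=(200, WINDOW, 3))
train_labels = rng.integers(0, 3, size=200)       # e.g., 3 activity classes
clf = WindowedGaussianClassifier().fit(train_windows, train_labels)
print(clf.predict(rng.normal(size=(WINDOW, 3))))
```

Because each prediction only sums independent per-sample log-likelihoods over a fixed-length window, its cost is linear in the window size, which is the kind of lightweight inference the abstract contrasts with DP-based HMM decoding.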