Title: Hybrid machine learning approach for human activity recognition

Authors: Ahmad Taher Azar

Addresses: College of Computer and Information Sciences, Prince Sultan University, Riyadh, Saudi Arabia; Faculty of Computers and Artificial Intelligence, Benha University, Benha, Egypt

Abstract: Human activity recognition is an important component of building situational awareness and plays a key role in people's daily lives. The ability to infer human actions, from simple to complex, is needed to tackle numerous human-centred challenges such as healthcare and personal assistance. This paper applies several supervised machine learning techniques to the classification of smartphone activity feature sets into six classes of human activity (laying, sitting, standing, walking, walking upstairs and walking downstairs). A hybrid approach combining Random Forest Decision Tree (RFDT) and Neural Network (NN) techniques is proposed. The results show that the hybridised algorithm outperforms a single ML technique such as NN or RFDT in classification performance while requiring less time. The proposed algorithm performs comparably to a state-of-the-art method (97.5% accuracy on the UCI data set compared to 97.6% for a Convolutional Neural Network (CNN)), yet it is far faster than the CNN on a CPU (0.073 seconds of inference time on the UCI data set compared to 1.530 seconds for the CNN). Unlike a CNN, the algorithm can be executed on embedded devices such as smartphones, which in most cases lack a dedicated GPU, enabling smartphone data to be processed on the device itself without penalising accuracy.
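The abstract does not spell out how the RFDT and NN components are combined, so the following is only a minimal sketch of one plausible hybridisation: a soft-voting ensemble that averages the class probabilities of a random forest and a small neural network trained on the 561-dimensional UCI HAR feature vectors. The file paths, model sizes and averaging scheme are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch only: assumes a soft-voting hybrid of a random forest
# and a small neural network on the UCI HAR feature set; not the paper's
# exact method, whose hybridisation details are not given in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# UCI HAR 561-feature vectors and activity labels (1-6), using the
# standard train/test split files from the public dataset archive.
X_train = np.loadtxt("UCI HAR Dataset/train/X_train.txt")
y_train = np.loadtxt("UCI HAR Dataset/train/y_train.txt")
X_test = np.loadtxt("UCI HAR Dataset/test/X_test.txt")
y_test = np.loadtxt("UCI HAR Dataset/test/y_test.txt")

# Train the two base models independently.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
nn = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
rf.fit(X_train, y_train)
nn.fit(X_train, y_train)

# Hybrid prediction: average the class-probability outputs of both models
# and pick the class with the highest combined probability.
proba = (rf.predict_proba(X_test) + nn.predict_proba(X_test)) / 2.0
y_pred = rf.classes_[np.argmax(proba, axis=1)]
print("Hybrid accuracy:", accuracy_score(y_test, y_pred))
```

Since both base models are lightweight compared with a CNN, a scheme of this kind can run entirely on a CPU, which is consistent with the on-device motivation stated in the abstract.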

Keywords: neural networks; hybrid algorithm; machine learning; classification; human activity recognition; RFDT; random forest decision tree.

DOI: 10.1504/IJCAT.2023.133307

International Journal of Computer Applications in Technology, 2023 Vol.72 No.3, pp.231 - 239

Received: 25 Oct 2022
Accepted: 05 Dec 2022

Published online: 11 Sep 2023
