Authors: T. Subetha; S. Chitrakala; M. Uday Theja
Addresses: Department of Information Technology, BVRIT Hyderabad College of Engineering for Women, Hyderabad, Telangana, India; Department of Computer Science and Engineering, CEG Anna University, Chennai, Tamil Nadu, India; Department of Computer Science and Engineering, Indian Institute of Technology Madras (IITM), Chennai, Tamil Nadu, India
Abstract: Human Activity Recognition (HAR) aims to recognise and interpret human activities from videos, and it comprises background subtraction, feature extraction and classification stages. Among these, the background subtraction stage is essential for achieving a good recognition rate when analysing videos. The proposed Fusion-based Gaussian Mixture Model (FGMM) background subtraction algorithm extracts the foreground from videos invariantly to illumination changes, shadows and dynamic backgrounds. The proposed FGMM algorithm consists of three stages: background detection, colour similarity calculation and colour distortion calculation. Here, the Jeffries-Matusita distance measure is used to check whether the current pixel matches a Gaussian distribution, and this value drives the background model update. A weighted-Euclidean colour similarity measure is used to eliminate shadows, and a colour distortion measure is adopted to handle illumination variations. The extracted foreground is binarised, with foreground pixels stored as white in the frame, so that its interest points can be extracted easily. The algorithm has been evaluated on test sets gathered from publicly available benchmark data sets such as the KTH, Weizmann, PETS and change detection data sets. Experimental results show that the proposed FGMM achieves higher foreground-detection accuracy than the prevailing approaches.
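The abstract does not give the paper's exact formulas or parameters, but the two core tests it names can be sketched as follows. This is a minimal, single-pixel illustration under stated assumptions: the Jeffries-Matusita (JM) distance is computed from the Bhattacharyya distance with the pixel treated as a point estimate, and the thresholds, learning rate `ALPHA`, luminance weights and the simplified weight/mean/variance updates are all assumptions of this sketch, not the authors' values.

```python
import math

# Illustrative parameters (assumptions, not taken from the paper)
ALPHA = 0.01      # learning rate for the background-model update
JM_THRESH = 0.5   # match threshold on the JM distance
W_THRESH = 0.2    # minimum weight for a component to count as background

def jm_distance(x, mean, var):
    """Jeffries-Matusita distance between a pixel value and one Gaussian
    component, via the Bhattacharyya distance b (pixel treated as a point
    estimate; a simplifying assumption for this sketch)."""
    b = (x - mean) ** 2 / (8.0 * var)
    return math.sqrt(2.0 * (1.0 - math.exp(-b)))

def update_pixel(x, components):
    """components: list of [weight, mean, var] Gaussians for one pixel.
    Matches x against the mixture with the JM distance, updates the model
    (Stauffer-Grimson-style updates, simplified), and returns True if x
    is classified as background."""
    matched = None
    for comp in components:
        if jm_distance(x, comp[1], comp[2]) < JM_THRESH:
            matched = comp
            break
    if matched is not None:
        # pull the matched Gaussian towards the observed value
        matched[1] = (1 - ALPHA) * matched[1] + ALPHA * x
        matched[2] = (1 - ALPHA) * matched[2] + ALPHA * (x - matched[1]) ** 2
    else:
        # no match: replace the least-weighted component with a new
        # wide Gaussian centred on the pixel value
        components.sort(key=lambda c: c[0])
        components[0] = [ALPHA, float(x), 30.0 ** 2]
    # weight update and renormalisation
    for comp in components:
        comp[0] = (1 - ALPHA) * comp[0] + (ALPHA if comp is matched else 0.0)
    total = sum(c[0] for c in components)
    for comp in components:
        comp[0] /= total
    return matched is not None and matched[0] > W_THRESH

def is_shadow(pixel, bg, weights=(0.3, 0.59, 0.11), thresh=40.0):
    """Weighted-Euclidean colour similarity test (assumed form): a pixel
    darker than, but chromatically close to, the background model is
    reclassified as shadow rather than foreground."""
    dist = math.sqrt(sum(w * (p - b) ** 2
                         for w, p, b in zip(weights, pixel, bg)))
    darker = sum(pixel) < sum(bg)
    return darker and dist < thresh
```

For example, a pixel value of 100 against a mixture whose dominant component has mean 100 matches and is labelled background, while a value of 160 far from every component spawns a new low-weight Gaussian and is labelled foreground; a uniformly dimmed RGB triple such as (90, 90, 90) over a (100, 100, 100) background passes the shadow test.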
Keywords: human activity recognition; Gaussian mixture model; fusion-based Gaussian mixture model; background subtraction.
International Journal of Computer Applications in Technology, 2021 Vol.66 No.1, pp.63 - 73
Received: 30 Apr 2020
Accepted: 12 Nov 2020
Published online: 11 Dec 2021