Title: Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Authors: Ericka Janet Rechy-Ramirez; Huosheng Hu

Addresses: School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK; School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK

Abstract: This paper presents a human-machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW) based on head movements and facial expressions, detected respectively by the gyroscope and the 'cognitiv suite' of an Emotiv EPOC device. The proposed HMI provides two control modes: 1) control mode 1 uses four head movements to display, in its graphical user interface, the control command that the user wants to execute, and one facial expression to confirm its execution; 2) control mode 2 employs two facial expressions for turning and forward motion, and one head movement for stopping the wheelchair. Therefore, both control modes offer hands-free control of the wheelchair. Two subjects have used the two control modes to operate a wheelchair in an indoor environment. Five facial expressions have been tested in order to determine whether users can employ different facial expressions to execute the commands. The experimental results show that the proposed HMI is reliable for operating the wheelchair safely.
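
For illustration, the sketch below outlines the command flow implied by the two control modes described in the abstract. It is a minimal Python sketch under stated assumptions: the helper callables read_head_movement, read_expression and display, the Wheelchair-style object, and the specific gesture and expression labels are hypothetical placeholders, not the Emotiv SDK API or the authors' implementation.

```python
from enum import Enum


class Command(Enum):
    FORWARD = "forward"
    BACKWARD = "backward"
    LEFT = "left"
    RIGHT = "right"
    STOP = "stop"


def control_mode_1(read_head_movement, read_expression, display, wheelchair):
    """Mode 1: four head movements select a command shown on the GUI;
    a single facial expression confirms and executes the selection."""
    selection = {
        "up": Command.FORWARD,      # hypothetical gesture-to-command mapping
        "down": Command.BACKWARD,
        "left": Command.LEFT,
        "right": Command.RIGHT,
    }
    pending = Command.STOP
    while True:
        movement = read_head_movement()     # gyroscope-based head movement label
        if movement in selection:
            pending = selection[movement]
            display(pending)                # show the selected command on the GUI
        if read_expression() == "confirm":  # the one confirmation expression
            wheelchair.execute(pending)


def control_mode_2(read_head_movement, read_expression, wheelchair):
    """Mode 2: two facial expressions drive turning and forward motion;
    one head movement stops the wheelchair."""
    while True:
        expression = read_expression()
        if expression == "expr_forward":    # first expression: move forward
            wheelchair.execute(Command.FORWARD)
        elif expression == "expr_turn":     # second expression: turn
            wheelchair.execute(Command.LEFT)
        if read_head_movement() == "stop_gesture":
            wheelchair.execute(Command.STOP)
```

As the sketch suggests, mode 1 trades responsiveness for an explicit confirmation step via the GUI, while mode 2 maps expressions directly to motion and reserves a head movement for stopping.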

Keywords: human machine interface; HMI; head movements; facial expressions; Emotiv sensor; wheelchair control; intelligent wheelchairs; electric wheelchairs; EPWs; gyroscope; graphical user interface; GUI; wheelchair turning; forward motion; wheelchair stopping; hands-free control; indoor environment; operating safety.

DOI: 10.1504/IJBBR.2014.064920

International Journal of Biomechatronics and Biomedical Robotics, 2014, Vol.3, No.2, pp.80-91

Published online: 30 Sep 2014
