3D hand gesture tracking and recognition for controlling an intelligent wheelchair
Online publication date: Fri, 18-Apr-2014
by Xiaodong Xu; Yi Zhang; Shuo Zhang; Huosheng Hu
International Journal of Computer Applications in Technology (IJCAT), Vol. 49, No. 2, 2014
Abstract: Hand gesture recognition is a user-friendly and intuitive means of human-machine interaction. This paper proposes a novel 3D hand gesture recognition method, based on both colour and depth information, for controlling an intelligent wheelchair. Depth information for the human palm is obtained by a Kinect 3D vision sensor, and the palm position is then obtained through the hand analysis module in OpenNI. An improved Centroid Distance Function is used to extract 3D hand trajectory features, while a hidden Markov model (HMM) is applied to train on samples and recognise hand gesture trajectories. Finally, the recognition results are converted into control commands, sent over an ad hoc network to an intelligent wheelchair for its motion control. Experimental results show that the proposed method is invariant to lighting changes, hand rotation and scaling, and is robust to background interference.
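The centroid-distance idea the abstract mentions can be illustrated with a minimal sketch. This is not the authors' implementation (the paper's "improved" variant is not specified here); it is a hypothetical Python illustration of the basic principle: describing a 3D hand trajectory by each point's distance from the trajectory centroid, which is translation-invariant, and normalising by the maximum distance for scale invariance.

```python
# Hypothetical sketch (not the paper's code): a basic centroid-distance
# feature for a 3D hand trajectory, translation- and scale-invariant.
from math import sqrt

def centroid_distance_features(trajectory):
    """Return scale-normalised distances of each 3D point from the
    trajectory centroid. Subtracting the centroid removes translation;
    dividing by the maximum distance removes scale. (Rotation invariance
    would require further processing, e.g. on the feature sequence.)"""
    n = len(trajectory)
    cx = sum(p[0] for p in trajectory) / n
    cy = sum(p[1] for p in trajectory) / n
    cz = sum(p[2] for p in trajectory) / n
    dists = [sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
             for x, y, z in trajectory]
    m = max(dists) or 1.0  # guard against a degenerate all-zero trajectory
    return [d / m for d in dists]

# Example: four corners of a unit square in the x-y plane are all
# equidistant from the centroid, so every feature normalises to 1.0.
traj = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
print(centroid_distance_features(traj))  # → [1.0, 1.0, 1.0, 1.0]
```

In a full pipeline of the kind the abstract describes, a sequence of such features (typically quantised) would form the observation sequence fed to the HMM for training and recognition.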