A facial feature tracker for human-computer interaction based on 3D Time-Of-Flight cameras
Online publication date: 18 November 2008
by Martin Böhme, Martin Haker, Thomas Martinetz, Erhardt Barth
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 5, No. 3/4, 2008
Abstract: We describe a facial feature tracker based on the combined range and amplitude data provided by a 3D Time-Of-Flight camera. We use this tracker to implement a head mouse, an alternative input device for people who have limited use of their hands. The facial feature tracker is based on geometric features that are related to the intrinsic dimensionality of multidimensional signals. We show how the position of the nose in the image can be determined robustly using a very simple bounding-box classifier, trained on a set of labelled sample images. Despite its simplicity, the classifier generalises well to subjects that it was not trained on. An important result is that the combination of range and amplitude data dramatically improves robustness compared to a single type of data. The tracker runs in real time at around 30 frames per second. We demonstrate its potential as an input device by using it to control Dasher, an alternative text input tool.
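The "very simple bounding-box classifier" mentioned in the abstract can be illustrated with a minimal sketch: learn, per feature dimension, the interval spanned by positive training samples, and accept a candidate only if every feature falls inside its interval. This is an assumed, generic reconstruction for illustration; the class name, feature layout, and training data below are hypothetical and not taken from the paper.

```python
import numpy as np

class BoundingBoxClassifier:
    """Hypothetical sketch: an axis-aligned bounding box learned from
    positive samples; a candidate is accepted if all of its features
    lie within the per-dimension [min, max] intervals."""

    def fit(self, positives):
        X = np.asarray(positives, dtype=float)
        # Per-dimension lower and upper bounds over the positive samples.
        self.lo = X.min(axis=0)
        self.hi = X.max(axis=0)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # True only where every feature is inside the learned box.
        return np.all((X >= self.lo) & (X <= self.hi), axis=1)

# Illustrative usage with made-up 2D feature vectors:
clf = BoundingBoxClassifier().fit([[0.0, 0.0], [1.0, 1.0], [0.3, 0.8]])
inside = clf.predict([[0.5, 0.5]])[0]   # falls inside the box
outside = clf.predict([[2.0, 0.5]])[0]  # first feature exceeds the box
```

Such a classifier has essentially no capacity to overfit, which is consistent with the abstract's observation that it generalises well to unseen subjects.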