Development of an eye-tracking control system using AForge.NET framework
Online publication date: Wed, 06-Mar-2013
by Suraj Verma; Prashant Pillai; Yim-Fun Hu
International Journal of Intelligent Systems Technologies and Applications (IJISTA), Vol. 11, No. 3/4, 2012
Abstract: Assistive technology takes a step forward with sophisticated eye-tracking and gaze-tracking techniques that follow the movement of the eye and the location of the gaze to control various applications. This paper describes in detail the low-cost hardware and the software architecture of a real-time, eye-tracking-based wireless control system built on the open-source AForge.NET image processing framework. The system has been tested in remote robotic navigation, using the Lego NXT Mindstorm robot, and in wireless home automation, using the X10 transmission protocol. It uses a wireless camera to detect eye movements and transmits control signals over a wireless channel, providing a highly useful alternative control system for a person with quadriplegia (paralysis affecting all four limbs). Tests measuring processing time, system accuracy and error were conducted to evaluate the performance of the developed eye-tracking system.
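The abstract describes detecting eye movement from camera frames and mapping it to control commands (e.g. for robot navigation). As a rough illustrative sketch only, not the authors' implementation (which uses C# and AForge.NET), the core idea can be shown with a hypothetical dark-pupil threshold-and-centroid approach in Python; all function names, the threshold value, and the command set here are assumptions for illustration:

```python
# Hypothetical sketch (NOT the paper's AForge.NET code): locate the dark
# pupil by intensity thresholding, compute its centroid, and map its
# horizontal position in the frame to a coarse navigation command.

def pupil_centroid(image, threshold=60):
    """Return (row, col) centroid of pixels darker than `threshold`.

    `image` is a list of rows of 0-255 grayscale values. Returns None
    when no pixel is dark enough (e.g. the eye is closed).
    """
    row_sum = col_sum = count = 0
    for r, line in enumerate(image):
        for c, px in enumerate(line):
            if px < threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

def gaze_command(image, threshold=60):
    """Map pupil position to one of the assumed commands
    LEFT / RIGHT / FORWARD / STOP by splitting the frame into thirds."""
    centroid = pupil_centroid(image, threshold)
    if centroid is None:
        return "STOP"            # pupil not found: halt the robot
    _, col = centroid
    width = len(image[0])
    if col < width / 3:
        return "LEFT"
    if col > 2 * width / 3:
        return "RIGHT"
    return "FORWARD"

# Tiny synthetic 5x9 "eye" frame: 255 = bright sclera, 0 = dark pupil.
frame = [[255] * 9 for _ in range(5)]
frame[2][1] = frame[2][2] = 0    # pupil sits in the left third
print(gaze_command(frame))        # -> "LEFT"
```

In the actual system, the resulting command would then be sent over the wireless channel (Bluetooth to the NXT robot, or X10 for home appliances); a real pipeline would also need grayscale conversion and noise filtering before thresholding, which AForge.NET provides as image filters.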