Visually stimulated motor control for a robot with a pair of LGMD visual neural networks
by Shigang Yue; F. Claire Rind
International Journal of Advanced Mechatronic Systems (IJAMECHS), Vol. 4, No. 5/6, 2012

Abstract: In this paper, we propose a visually stimulated motor control (VSMC) system for the autonomous navigation of mobile robots. Inspired by a locust motion-sensitive interneuron, the lobula giant movement detector (LGMD), the presented VSMC system enables a robot to explore local paths or interact with dynamic objects effectively using visual input only. The VSMC consists of a pair of LGMD visual neural networks and a simple motor command generator. Each LGMD processes images covering part of the wide field of view and extracts relevant visual cues. The outputs of the two LGMDs are compared and translated directly into executable motor commands. These commands are then executed by the robot's wheel control system in real time to generate the corresponding motion adjustments. Our experiments showed that this bio-inspired VSMC system worked well in different scenarios.

Online publication date: Sat, 30-Aug-2014
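The abstract describes a comparison stage in which the outputs of the two LGMD networks (one per half of the visual field) are mapped directly to wheel commands. Since the full text is paywalled, the sketch below is only an illustration of that idea under assumed details: the LGMD stand-in, the gains, and the differential-drive mapping are all hypothetical, not the paper's actual model.

```python
import math


def lgmd_output(frame_diff, scale=100.0, bias=5.0):
    """Toy stand-in for one LGMD network (hypothetical): sums rectified
    luminance change over its half of the visual field and squashes the
    total to (0, 1). The real LGMD model in the paper is layered and
    motion-sensitive; this is only a placeholder."""
    excitation = sum(abs(p) for p in frame_diff)
    return 1.0 / (1.0 + math.exp(-(excitation / scale - bias)))


def motor_command(left_lgmd, right_lgmd, base_speed=0.5, gain=1.0):
    """Map the two LGMD outputs to differential wheel speeds.

    Stronger looming activity on one side slows the wheel on the
    opposite side, so the robot turns away from the approaching
    object. Gains and base speed are assumed values."""
    left_wheel = base_speed - gain * right_lgmd    # right-side threat -> turn left
    right_wheel = base_speed - gain * left_lgmd    # left-side threat -> turn right
    return left_wheel, right_wheel
```

Usage: with a looming stimulus on the left (`motor_command(0.9, 0.1)`), the left wheel runs faster than the right, steering the robot rightward, away from the threat; with equal activity the robot drives straight at `base_speed`.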
