Path planning and obstacle avoidance for omni-directional mobile robot based on Kinect depth sensor
by Ching-Yi Chen; Shu-Yin Chiang; Chun-Te Wu
International Journal of Embedded Systems (IJES), Vol. 8, No. 4, 2016

Abstract: To enable robots to move autonomously and avoid obstacles in the field, they must be able to perform self-localisation and path planning based on the results of sensor signal processing or image analysis. This research combines the Kinect depth sensor with an omni-directional camera to construct the vision system of a FIRA RoboSot robot. These two complementary devices enable the robot to localise itself and to measure its distance from obstacles. The captured data are processed to construct the required digital map, which assists the robot in completing path planning. Experimental results under indoor conditions demonstrate that the proposed system performs robust obstacle avoidance, confirming its feasibility and effectiveness.
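The pipeline outlined in the abstract (depth readings, a digital occupancy map, then path planning over that map) can be illustrated with a minimal sketch. This is not the authors' implementation; the grid representation, the breadth-first-search planner, and all function names and parameters below are illustrative assumptions.

```python
from collections import deque

def build_occupancy_grid(width, height, obstacles):
    """Build a binary occupancy grid; 1 marks a cell occupied by an obstacle.

    In a real system the obstacle cells would come from projecting
    Kinect depth readings onto the ground plane (assumed here).
    """
    grid = [[0] * width for _ in range(height)]
    for x, y in obstacles:
        grid[y][x] = 1
    return grid

def plan_path(grid, start, goal):
    """Plan a collision-free path with breadth-first search.

    Returns a list of (x, y) cells from start to goal, or None if the
    goal is unreachable. BFS stands in for whatever planner the paper
    actually uses; it finds a shortest path on a uniform-cost grid.
    """
    width, height = len(grid[0]), len(grid)
    frontier = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while frontier:
        current = frontier.popleft()
        if current == goal:
            break
        x, y = current
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            nxt = (nx, ny)
            if (0 <= nx < width and 0 <= ny < height
                    and grid[ny][nx] == 0 and nxt not in came_from):
                came_from[nxt] = current
                frontier.append(nxt)
    if goal not in came_from:
        return None
    # Walk backwards from the goal to reconstruct the path.
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = came_from[node]
    return path[::-1]

# Example: a wall blocks the direct route, so the planner detours around it.
grid = build_occupancy_grid(5, 5, [(2, 0), (2, 1), (2, 2)])
path = plan_path(grid, (0, 0), (4, 0))
```

On this 5x5 grid the column of obstacles at x = 2 forces the path to dip down to row 3 before crossing, mirroring how a robot would skirt a detected obstacle.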

Online publication date: Fri, 15-Jul-2016

