Authors: C.A.O. Silva; G.A.M. Goltz; E.H. Shiguemori; C.L. De Castro; H.F. De C. Velho; A.P. De Braga
Addresses: Federal Institute of North of Minas Gerais, Almenara-MG, Brazil; Institute of Marine Research, Rio de Janeiro-RJ, Brazil; Institute of Advanced Studies, São José dos Campos-SP, Brazil; Federal University of Minas Gerais, Belo Horizonte-MG, Brazil; National Institute for Space Research, São José dos Campos-SP, Brazil; Federal University of Minas Gerais, Belo Horizonte-MG, Brazil
Abstract: This paper presents results of an image matching approach applied to position estimation for autonomous navigation of unmanned aerial vehicles (UAVs). The main idea is to replace the global positioning system (GPS) signal by matching the onboard video to a georeferenced satellite image. Image processing techniques and edge descriptors are used to achieve robustness to different sensors and luminosity conditions. The UAV position is then calculated by finding the pixel in the georeferenced image with the highest spatial correlation to the image captured at flight time. The evaluation of our vision system considers different kinds of terrain, such as forests, roads and urban areas, as well as its capacity to follow real routes specified in a flight simulator. The results obtained are promising and indicate that our methodology can be used as a substitute for GPS on real flights.
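The core matching step described in the abstract — extracting edge maps from both images and locating the pixel of highest spatial correlation in the georeferenced image — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a simple gradient-magnitude edge descriptor as a stand-in for the Canny filter, zero-mean normalised cross-correlation as the spatial-correlation measure, and a synthetic random "satellite" image in place of real georeferenced data.

```python
import numpy as np

def edge_map(img):
    # Gradient-magnitude edge descriptor (simplified stand-in for Canny).
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]   # horizontal central differences
    gy[1:-1, :] = img[2:, :] - img[:-2, :]   # vertical central differences
    return np.hypot(gx, gy)

def zncc(a, b):
    # Zero-mean normalised cross-correlation between two equally sized arrays.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_position(reference, onboard):
    """Return the (row, col) of the top-left corner of the window in the
    georeferenced `reference` whose edge map best correlates with the edge
    map of the `onboard` frame."""
    ref_edges = edge_map(reference)
    obs_edges = edge_map(onboard)
    h, w = obs_edges.shape
    best_score, best_pos = -np.inf, (0, 0)
    # Exhaustive sliding-window search over all candidate positions.
    for r in range(ref_edges.shape[0] - h + 1):
        for c in range(ref_edges.shape[1] - w + 1):
            score = zncc(ref_edges[r:r + h, c:c + w], obs_edges)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Simulated data: the "onboard" frame is a patch cropped from the reference.
rng = np.random.default_rng(0)
satellite = rng.random((60, 60))
patch = satellite[20:36, 30:46]
print(match_position(satellite, patch))
```

In the paper's setting the recovered pixel coordinates would then be mapped back to geographic coordinates through the georeferencing of the satellite image; a real system would also use a more robust descriptor (e.g. Canny edges, as in the keywords) and a faster search than this brute-force scan.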
Keywords: unmanned aerial vehicles; UAVs; autonomous navigation; image matching; edge extraction; radial basis function; RBF neural networks; Canny filter; onboard video; georeferenced satellite images; image processing.
International Journal of High Performance Systems Architecture, 2016 Vol.6 No.4, pp.205 - 212
Available online: 20 Jan 2017