Authors: Scott A. Green, J. Geoffrey Chase, XiaoQi Chen, Mark Billinghurst
Addresses: Department of Mechanical Engineering, University of Canterbury, Christchurch, New Zealand; Human Interface Technology Laboratory, NZ (HITLab NZ), University of Canterbury, Christchurch, New Zealand
Abstract: This article discusses an experimental comparison of three user interface techniques for interaction with a remotely located robot. A typical interface for such a situation is to teleoperate the robot using a camera that displays the robot's view of its work environment. However, the operator often has difficulty maintaining situation awareness due to this single egocentric view. Hence, a multimodal system was developed enabling the human operator to view the robot in its remote work environment through an augmented reality interface, the augmented reality human-robot collaboration (AR-HRC) system. The operator uses spoken dialogue, reaches into the 3D representation of the remote work environment, and discusses intended actions with the robot. The result of the comparison was that the AR-HRC interface was found to be most effective, increasing accuracy by 30% while reducing the number of close calls in operating the robot by a factor of approximately three. It thus provides the means to maintain spatial awareness and gives users the feeling of working in a true collaborative environment.
Keywords: augmented reality; communication; HRI; human-robot interaction; human-robot collaboration; intelligent systems; user interfaces; remote robots; teleoperation; remote work environment; spatial awareness; spoken dialogue.
International Journal of Intelligent Systems Technologies and Applications, 2010 Vol.8 No.1/2/3/4, pp.130 - 143
Available online: 11 Dec 2009