Title: Fusion strategy-based multimodal human-computer interaction

Authors: Shu Yang; Ye-peng Guan

Addresses: Shanghai University, No. 99, Shangda Road, Shanghai, China; Shanghai University, No. 99, Shangda Road, Shanghai, China

Abstract: Human-computer interaction (HCI) has great potential for applications in many fields. The diversity of interaction habits and low recognition rates are the main factors limiting its development. In this paper, a multimodal human-computer interaction framework is constructed. The interactive target can be determined from different modalities, including gaze, hand pointing and speech, in a non-contact and non-wearable way. The corresponding response is fed back to users promptly in audio-visual form, providing an immersive experience. In addition, a decision matrix-based fusion strategy is proposed to improve the system's accuracy and adapt to different interaction habits. The system runs on ordinary hardware in a crowded scene, without any assumption that the interactive user and his corresponding actions are known in advance. Comparative experimental results show that the proposed method achieves better robustness and real-time performance in actual scenes.
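The decision matrix-based fusion described in the abstract can be illustrated with a minimal sketch: each modality assigns a confidence score to every candidate target, and the scores are combined with modality weights before selecting the winning target. The function name, the weights, and the scores below are illustrative assumptions, not taken from the paper.

```python
def fuse_decision_matrix(decision_matrix, weights):
    """Fuse per-modality scores into a single target decision.

    decision_matrix[m][t] -- confidence of modality m for target t
    weights[m]            -- reliability weight for modality m
    Returns the index of the target with the highest fused score.
    """
    n_targets = len(decision_matrix[0])
    fused = [0.0] * n_targets
    for m, row in enumerate(decision_matrix):
        for t, score in enumerate(row):
            fused[t] += weights[m] * score
    return max(range(n_targets), key=lambda t: fused[t])

# Three modalities (gaze, hand pointing, speech) scoring three targets.
matrix = [
    [0.7, 0.2, 0.1],  # gaze
    [0.3, 0.6, 0.1],  # hand pointing
    [0.8, 0.1, 0.1],  # speech
]
weights = [0.4, 0.3, 0.3]
print(fuse_decision_matrix(matrix, weights))  # -> 0 (gaze and speech agree)
```

A weighted-sum fusion of this kind lets a strong consensus between two modalities override a single disagreeing one, which matches the paper's goal of adapting to different interaction habits.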

Keywords: human-computer interaction; HCI; multi-modality; audio-visual feedback; interaction habits; fusion strategy.

DOI: 10.1504/IJCVR.2018.093075

International Journal of Computational Vision and Robotics, 2018 Vol.8 No.3, pp.300 - 317

Received: 02 May 2017
Accepted: 17 Oct 2017

Published online: 07 Jul 2018
