Title: Multimodal information perception based active human-computer interaction

Authors: Wen Zhao; Ye-Peng Guan

Addresses: School of Communication and Information Engineering, Shanghai University, Shanghai, China (both authors)

Abstract: Human-computer interaction (HCI) has great potential for applications in many fields. An HCI method is proposed based on audio-visual perception and feedback. The interactive target can be determined through different interaction modes, including gazing, hand pointing, eye-hand coordination, speech, and fusion of these audio-visual modalities, in a non-contact, non-wearable way. Interactive responses from audio-visual integration are fed back to the user in a timely manner, providing an immersive experience. A mode-analysis-based fusion strategy is proposed to fuse the interactive responses from the audio-visual modalities, especially when the modalities deviate from one another at the same time. The developed approach also performs well in a single modality with multiple interactive users in the scene. In addition, the diversity of interaction habits among multiple users is accommodated using ordinary hardware in a crowded scene. Comparative experiments highlight the superior HCI performance of the proposed approach.

Keywords: HCI; human-computer interaction; audio-visual perception and feedback; immersive experience; fusion strategy.

DOI: 10.1504/IJCAT.2017.087332

International Journal of Computer Applications in Technology, 2017 Vol.56 No.2, pp.141 - 151

Available online: 03 Oct 2017
