Title: Exploration of head gesture control for hearing instruments

Authors: Bernd Tessendorf; Franz Gravenhorst; Daniel Roggen; Thomas Stiefmeier; Christina Strohrmann; Gerhard Tröster; Peter Derleth; Manuela Feilner

Addresses: Wearable Computing Lab., ETH Zurich, Gloriastr. 35, 8092 Zurich, Switzerland (Tessendorf, Gravenhorst, Roggen, Stiefmeier, Strohrmann, Tröster); Phonak AG, Laubisrütistrasse 28, 8712 Stäfa, Switzerland (Derleth, Feilner)

Abstract: In this work, we investigated the benefit of head gestures as a user interface to control hearing instruments (HIs). We developed a prototype of a head-gesture-controlled HI based on a customised wireless acceleration sensor for unconstrained and continuous real-time monitoring of the user's head movements. We evaluated the system from a technical point of view and achieved a precision of 96% and a recall of 97% for spotting the two head gestures used: tilting the head to the left and to the right. We further evaluated the system from the user's point of view based on feedback from 6 hearing-impaired HI users (4 men, 2 women, aged 27-60). We compared our head-gesture-based control to existing HI user interfaces: HI-integrated buttons and the HI remote control. We found that the benefit of the different HI interaction solutions depends on the user's current situation and that all participating HI users would appreciate head gesture control as an additional, complementary user interface.
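The abstract does not detail the gesture-spotting algorithm; as a rough illustration of how left/right head tilts can be spotted from a body-worn 3-axis accelerometer, the following minimal Python sketch derives a roll angle from the gravity components and emits an event when an assumed tilt threshold is crossed. The thresholds, axis convention, and function names are illustrative assumptions, not the authors' published method.

```python
# Minimal sketch (not the authors' published algorithm): spotting left/right
# head tilts from 3-axis accelerometer samples via the roll angle derived
# from the gravity vector. Thresholds and axis convention are assumptions.
import math

TILT_THRESHOLD_DEG = 25.0     # assumed roll angle counting as a deliberate tilt
NEUTRAL_THRESHOLD_DEG = 10.0  # assumed band for returning to the neutral pose


def roll_deg(ax, ay, az):
    """Estimate the head's roll angle (degrees) around the front-back axis
    from the gravity components measured by the accelerometer."""
    return math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))


def spot_tilts(samples):
    """Yield 'left'/'right' events from a stream of (ax, ay, az) samples.

    A gesture is emitted once when the roll angle crosses the tilt threshold
    and is re-armed only after the head returns close to the neutral pose.
    """
    armed = True
    for ax, ay, az in samples:
        angle = roll_deg(ax, ay, az)
        if armed and abs(angle) > TILT_THRESHOLD_DEG:
            armed = False
            yield "left" if angle > 0 else "right"
        elif not armed and abs(angle) < NEUTRAL_THRESHOLD_DEG:
            armed = True


if __name__ == "__main__":
    # Synthetic trace: neutral pose, a tilt to one side, neutral, tilt to the other.
    trace = ([(0.0, 0.0, 1.0)] * 5 + [(0.0, -0.6, 0.8)] * 5
             + [(0.0, 0.0, 1.0)] * 5 + [(0.0, 0.6, 0.8)] * 5)
    print(list(spot_tilts(trace)))  # -> ['right', 'left']
```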

Keywords: multimodal hearing instruments; head gesture control; user interface; recognition algorithm; body-worn sensors; ubiquitous computing; wearable sensors; head gestures; instrument control; wireless sensors; acceleration sensors; real-time monitoring; head movements.

DOI: 10.1504/IJAHUC.2014.064858

International Journal of Ad Hoc and Ubiquitous Computing, 2014 Vol.16 No.4, pp.240 - 249

Received: 10 Nov 2012
Accepted: 10 Jun 2013

Published online: 19 Sep 2014
