Towards an Immersive Hearing Device: Evaluation Methods and Behavioural Analysis
Abstract:
In hearing devices, the largest benefit in terms of signal enhancement can be achieved by directional filtering. However, directional filtering with narrow ‘beams’ can force hearing device users to move their heads more in order to steer the beam towards the desired sources. With automatic beam steering, it is not always clear which source the user intends to attend to. To overcome these problems, we present the concept of a hearing device that is controlled by a combined analysis of the acoustic environment and the user’s behaviour. Gaze direction is measured using electro-oculography (EOG). Based on the spatial distribution of acoustic sources and the temporal gaze behaviour, the attended spatial directions are estimated. Such interactive algorithms can only be evaluated if test participants exhibit ecologically valid movement behaviour, which requires careful consideration of the test environments and tasks. In a series of studies, we examined the interaction between virtual audio-visual reproduction, task interactivity, and movement behaviour. We showed that a high level of validity of head-movement behaviour can be achieved in virtual audio-visual environments; however, a task involving true social interaction is required for natural behaviour. Implications for future studies with interactive algorithms are discussed.