Multimodal Integration of Natural Gaze Behavior for Intention Recognition During Object Manipulation
In Proceedings of the Eleventh International Conference on Multimodal Interfaces and Sixth Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI 2009), ACM, Cambridge, USA, 2–4 November 2009.
Gaze is naturally used for visual perception of our environment, and gaze movements are largely controlled subconsciously. Forcing users to consciously deviate from this natural gaze behavior for interaction purposes imposes a high cognitive workload and destroys the information contained in natural gaze movements. Instead of proposing a new gaze-based interaction technique, we analyze natural gaze behavior during an object manipulation task and show how it can be used for intention recognition, which provides a universal basis for integrating gaze into multimodal interfaces across different applications. We propose a model for the multimodal integration of natural gaze behavior and evaluate it in two use cases: improving the robustness of other, potentially noisy input cues, and designing proactive interaction techniques.
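To illustrate the first use case, the sketch below shows one common way such a fusion can be realized: naive Bayes combination of per-object probabilities from gaze evidence and a second, noisier cue. All object names, priors, and probability values are illustrative assumptions for this sketch, not the paper's actual model or data.

```python
def fuse_cues(gaze_probs, other_probs, prior=None):
    """Combine per-object probabilities from gaze and another cue.

    Assumes the cues are conditionally independent given the intended
    object (naive Bayes fusion) and returns a normalized posterior.
    This is an illustrative sketch, not the model proposed in the paper.
    """
    objects = list(gaze_probs)
    if prior is None:
        # Uniform prior over candidate objects if none is given.
        prior = {o: 1.0 / len(objects) for o in objects}
    unnorm = {o: prior[o] * gaze_probs[o] * other_probs[o] for o in objects}
    total = sum(unnorm.values())
    return {o: p / total for o, p in unnorm.items()}

# Hypothetical example: gaze strongly favors the cup, while the other
# (noisy) cue is nearly ambiguous between the three candidate objects.
gaze = {"cup": 0.7, "plate": 0.2, "bowl": 0.1}
other = {"cup": 0.4, "plate": 0.35, "bowl": 0.25}
posterior = fuse_cues(gaze, other)
```

In this configuration the fused posterior concentrates on the object supported by both cues, which is how the gaze channel can stabilize a noisy second modality.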