GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing

Hou, Baosheng James and Newn, Joshua and Sidenmark, Ludwig and Khan, Anam Ahmad and Gellersen, Hans (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction, 8. (In Press)

Text (_ETRA2024__GazeSwitch__Camera_Ready_)
_ETRA2024_GazeSwitch_Camera_Ready_.pdf - Published Version
Available under License Creative Commons Attribution.

Download (4MB)

Abstract

This paper contributes GazeSwitch, an ML-based technique that optimises the real-time switching between eye and head modes for fast and precise hands-free pointing. GazeSwitch reduces false positives from natural head movements and efficiently detects head gestures for input, resulting in an effective, hands-free, and adaptive technique for interaction. We conducted two user studies to evaluate its performance and user experience. Comparative analyses with baseline switching techniques, Eye+Head Pinpointing (manual) and BimodalGaze (threshold-based), revealed several trade-offs. We found that GazeSwitch provides a natural and effortless experience but trades off control and stability compared to manual mode switching, and requires less head movement compared to BimodalGaze. This work demonstrates the effectiveness of a machine learning approach to learning and adapting to patterns in head movement, allowing us to better leverage the synergistic relationship between eye and head input modalities for interaction in mixed and extended reality.
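To make the mode-switching idea concrete, the sketch below shows a minimal threshold-based switcher in the spirit of the "BimodalGaze (threshold-based)" baseline the abstract mentions: sustained head motion hands control to head-based refinement, otherwise coarse gaze pointing is used. All values and names here are illustrative assumptions, not from the paper; GazeSwitch itself replaces such a fixed rule with a learned classifier over head-movement patterns.

```python
from collections import deque

# Assumed values for illustration only -- not taken from the paper.
VELOCITY_THRESHOLD_DEG_S = 10.0  # hypothetical head-velocity threshold
WINDOW = 5                       # hypothetical smoothing window (samples)

class ModeSwitcher:
    """Threshold-based eye/head mode switcher (baseline-style sketch)."""

    def __init__(self):
        # Keep a short rolling window of recent head angular speeds.
        self.samples = deque(maxlen=WINDOW)

    def update(self, head_velocity_deg_s: float) -> str:
        """Return 'head' when sustained head motion suggests deliberate
        refinement, else 'eye' for coarse gaze pointing."""
        self.samples.append(abs(head_velocity_deg_s))
        mean_v = sum(self.samples) / len(self.samples)
        return "head" if mean_v > VELOCITY_THRESHOLD_DEG_S else "eye"

switcher = ModeSwitcher()
print(switcher.update(2.0))   # slow natural drift -> "eye"
print(switcher.update(30.0))  # deliberate head motion raises the mean -> "head"
```

A fixed threshold like this is exactly what triggers the false positives the paper targets: natural head movements can exceed it without any refinement intent, which is why GazeSwitch learns the switching decision instead.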

Item Type:
Journal Article
Journal or Publication Title:
Proceedings of the ACM on Human-Computer Interaction
Uncontrolled Keywords:
gaze interaction, refinement, eye tracking, eye-head coordination, computational interaction, machine learning
Subjects:
Research Output Funding / yes - externally funded
ID Code:
218980
Deposited By:
Deposited On:
29 Apr 2024 15:20
Refereed?:
Yes
Published?:
In Press
Last Modified:
11 May 2024 02:28