Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection

Sidenmark, Ludwig and Gellersen, Hans (2019) Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. In: UIST '19: Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. ACM, New York, pp. 1161-1174. ISBN 9781450368162

Text (Final Accepted Version)
uist19a_sub6643_cam_i32_1_.pdf - Accepted Version
Available under License Creative Commons Attribution-NonCommercial.


Abstract

Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.

Item Type:
Contribution in Book/Report/Proceedings
Additional Information:
© ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921
ID Code:
136259
Deposited On:
21 Aug 2019 14:20
Refereed?:
Yes
Published?:
Published
Last Modified:
20 Sep 2020 06:47