Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR

Lystbæk, Mathias and Pfeuffer, Ken and Grønbæk, Jens Emil and Gellersen, Hans (2022) Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. Proceedings of the ACM on Human-Computer Interaction, 6 (ETRA): Article 141, pp. 1-16. ISSN 2573-0142

Text (Accepted Version): no_copyright_141_CameraReady_Text_Entry_S_GHA_ETRA2022.pdf (7MB)

Abstract

With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between uni- and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.
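To make the alignment idea concrete, the sketch below illustrates one possible reading of selection by spatial alignment: a key is committed only when the gaze point and the manual pointer resolve to the same key. This is a hypothetical illustration and not the paper's implementation; the key layout, coordinate convention, and helper names (Key, key_at, alignment_select) are assumptions for the example.

```python
# Minimal sketch (assumed, not the authors' code): a key fires only when
# both gaze and manual pointer land on the same key on a virtual keyboard.
from dataclasses import dataclass


@dataclass
class Key:
    label: str
    x: float      # key centre, normalised keyboard coordinates
    y: float
    w: float = 0.1
    h: float = 0.1

    def contains(self, px: float, py: float) -> bool:
        # Point-in-rectangle test around the key centre.
        return (abs(px - self.x) <= self.w / 2 and
                abs(py - self.y) <= self.h / 2)


def key_at(keys, px, py):
    """Return the key under point (px, py), or None if no key is hit."""
    for key in keys:
        if key.contains(px, py):
            return key
    return None


def alignment_select(keys, gaze, pointer):
    """Commit a key only if gaze and manual pointer hit the same key."""
    gaze_key = key_at(keys, *gaze)
    pointer_key = key_at(keys, *pointer)
    if gaze_key is not None and gaze_key is pointer_key:
        return gaze_key.label
    return None


# Example: both modalities aligned on 'A' -> selection fires.
keys = [Key("A", 0.1, 0.5), Key("S", 0.2, 0.5), Key("D", 0.3, 0.5)]
print(alignment_select(keys, gaze=(0.11, 0.49), pointer=(0.09, 0.52)))  # 'A'
print(alignment_select(keys, gaze=(0.11, 0.49), pointer=(0.21, 0.50)))  # None
```

A real system would presumably also filter noisy gaze samples and require the alignment to hold for a brief confirmation window before committing a key; those details are left out of this sketch.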

Item Type: Journal Article
Journal or Publication Title: Proceedings of the ACM on Human-Computer Interaction
ID Code: 169243
Deposited By:
Deposited On: 01 Nov 2022 16:40
Refereed?: Yes
Published?: Published
Last Modified: 10 Jan 2024 00:33