Description and application of the correlation between gaze and hand for the different hand events occurring during interaction with tablets

Weill-Tessier, Pierre (2020) Description and application of the correlation between gaze and hand for the different hand events occurring during interaction with tablets. PhD thesis, UNSPECIFIED.

Text (2020PierreWeill-TessierPHD)
2020PierreWeill_TessierPHD.pdf - Published Version
Available under License Creative Commons Attribution-NonCommercial-NoDerivs.


Abstract

People’s activities naturally involve the coordination of gaze and hand. Research in Human-Computer Interaction (HCI) endeavours to enable users to exploit this multimodality for enhanced interaction. With the abundance of touchscreen devices, direct manipulation of an interface has become a dominant interaction technique. Although touch-enabled devices are prolific in both public and private spaces, interactions with these devices do not fully exploit the benefits of the correlation between gaze and hand. Touch-enabled devices do not employ the richness of the continuous manual activity above their display surface for interaction, and much of the information users express through their hand movements is ignored. This thesis aims to investigate the correlation between gaze and hand during natural interaction with touch-enabled devices to address these issues.

To do so, we set three objectives. Firstly, we seek to describe the correlation between gaze and hand in order to understand how the two modalities operate together: what is their spatial and temporal relationship when users interact with touch-enabled devices? Secondly, we want to know how some of the factors inherent to interaction with touch-enabled devices affect the correlation, because identifying what modulates it is crucial to designing more efficient applications: what are the impacts of individual differences, task characteristics and the features of the on-screen targets? Thirdly, to see whether additional information about the user can be extracted from the correlation between gaze and hand, we investigate it for the detection of users’ cognitive state while they interact with touch-enabled devices: can the correlation reveal the users’ hesitation?

To meet these objectives, we devised two data collections for gaze and hand. The first covers manual interaction on-screen; the second focuses instead on manual interaction in-the-air. We dissect the correlation between gaze and hand using three common hand events users perform while interacting with touch-enabled devices: taps, stationary hand events, and the motion between taps and stationary hand events. We use a tablet as the touch-enabled device because of its medium size and the ease of integrating both eye and hand tracking sensors.

We study the correlation between gaze and hand for tap events by collecting gaze estimation data and taps on a tablet in the context of Internet-related tasks, representative of typical activities executed on tablets. The correlation is described in the spatial and temporal dimensions. Individual differences and the effects of task nature and target type are also investigated.

To study the correlation between gaze and hand when the hand is stationary, we conducted a data collection in the context of a Memory Game, chosen to generate sufficient cognitive load during play while requiring the hand to leave the tablet’s surface. We introduce and evaluate three detection algorithms, inspired by eye tracking, based on the analogy between gaze and hand patterns. Afterwards, spatial comparisons between gaze and hand are analysed to describe the correlation.
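To make the analogy between gaze and hand patterns concrete, the following is a minimal sketch of how such a detector could look, in the spirit of the dispersion-based I-DT fixation algorithm from eye tracking, applied here to hand position samples. The sample format, thresholds and function name are illustrative assumptions, not the algorithms or parameters evaluated in the thesis.

    # Minimal sketch, assuming hand samples arrive as (t, x, y) tuples
    # with t in seconds and x/y in pixels. Thresholds are illustrative,
    # not the values evaluated in the thesis.
    def detect_stationary_hand_events(samples, max_dispersion=30.0, min_duration=0.2):
        """Return (t_start, t_end) intervals where the hand stays put."""
        events = []
        start = 0
        while start < len(samples):
            end = start
            window = [samples[start]]
            # Grow the window while the hand stays within the dispersion limit.
            while end + 1 < len(samples):
                candidate = window + [samples[end + 1]]
                xs = [s[1] for s in candidate]
                ys = [s[2] for s in candidate]
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    break
                window = candidate
                end += 1
            # Keep the window only if the hand dwelt there long enough.
            if end > start and samples[end][0] - samples[start][0] >= min_duration:
                events.append((samples[start][0], samples[end][0]))
                start = end + 1
            else:
                start += 1
        return events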
We study the effects of task difficulty and how the participants’ hesitation influences the correlation. Since there is no certain way of knowing when a participant hesitates, we approximate hesitation with the failure to match a pair of already seen tiles.

We study the correlation between gaze and hand during hand motion between taps and stationary hand events using the same data collection context as above. We first align gaze and hand data in time and report the correlation coefficients on both the X and Y axes. After considering the general case, we examine the impact of the different factors involved in this context: participants, task difficulty, and the duration and type of the hand motion.

Our results show that the correlation between gaze and hand, throughout the interaction, is stronger in the horizontal dimension of the tablet than in its vertical dimension, and that it varies widely across users, especially spatially. We also confirm that the eyes lead the hand for target acquisition. Moreover, we find that the correlation between gaze and hand when the hand is in the air above the tablet’s surface depends on where users look on the tablet. We also show that the correlation between gaze and hand during stationary hand events can indicate the users’ indecision, and that, while the hand is moving, the correlation depends on factors such as the difficulty of the task performed on the tablet and the nature of the events before and after the motion.
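To make the time-alignment step concrete, the following sketch resamples gaze and hand traces onto a common clock and computes a Pearson correlation coefficient for each axis. The array layout, variable names and resampling rate are assumptions for illustration rather than the thesis’s actual pipeline.

    import numpy as np

    # Minimal sketch: timestamps are assumed sorted and in seconds;
    # gaze_xy and hand_xy are (n, 2) arrays of x, y positions. The 60 Hz
    # shared time base is an illustrative assumption.
    def axis_correlations(gaze_t, gaze_xy, hand_t, hand_xy, hz=60.0):
        """Return the per-axis Pearson r between gaze and hand traces."""
        t0 = max(gaze_t[0], hand_t[0])
        t1 = min(gaze_t[-1], hand_t[-1])
        common_t = np.arange(t0, t1, 1.0 / hz)  # overlapping time window
        corr = {}
        for axis, name in enumerate(("x", "y")):
            g = np.interp(common_t, gaze_t, gaze_xy[:, axis])
            h = np.interp(common_t, hand_t, hand_xy[:, axis])
            corr[name] = np.corrcoef(g, h)[0, 1]  # Pearson r per axis
        return corr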

Item Type:
Thesis (PhD)
ID Code:
139939
Deposited By:
Deposited On:
10 Jan 2020 14:25
Refereed?:
No
Published?:
Published
Last Modified:
16 Sep 2023 02:49