Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech

Banks, Briony and Gowen, Emma and Munro, Kevin and Adank, Patti (2021) Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech. Journal of Speech, Language, and Hearing Research. ISSN 1092-4388

Text (Banks_et_al_2021_JSLHR_accepted_manuscript)
Banks_et_al_2021_JSLHR_accepted_manuscript.pdf - Accepted Version
Restricted to Repository staff only until 1 March 2022.
Available under License Creative Commons Attribution-NonCommercial.
Abstract

Purpose: Visual cues from a speaker's face may benefit perceptual adaptation to degraded speech, but current evidence is limited. We aimed to replicate results from previous studies to establish the extent to which visual speech cues can lead to greater adaptation over time, extending existing results to a real-time adaptation paradigm (i.e., without a separate training period). A second aim was to investigate whether eye gaze patterns toward the speaker's mouth were related to better perception, hypothesizing that listeners who looked more at the speaker's mouth would show greater adaptation.

Method: A group of listeners (n = 30) was presented with 90 noise-vocoded sentences in audiovisual format, whereas a control group (n = 29) was presented with the audio signal only. Recognition accuracy was measured throughout, and eye tracking was used to measure fixations toward the speaker's eyes and mouth in the audiovisual group.

Results: Previous studies were partially replicated: The audiovisual group had better recognition throughout and adapted slightly more rapidly, but both groups showed an equal amount of improvement overall. Longer fixations on the speaker's mouth in the audiovisual group were related to better overall accuracy. An exploratory analysis further demonstrated that the duration of fixations to the speaker's mouth decreased over time.

Conclusions: The results suggest that visual cues may not benefit adaptation to degraded speech as much as previously thought. Longer fixations on a speaker's mouth may play a role in successfully decoding visual speech cues; however, this will need to be confirmed in future research to fully understand how patterns of eye gaze are related to audiovisual speech recognition. All materials, data, and code are available at https://osf.io/2wqkf/.

Item Type:
Journal Article
Journal or Publication Title:
Journal of Speech, Language, and Hearing Research
Uncontrolled Keywords:
/dk/atira/pure/subjectarea/asjc/3600/3616
Subjects:
ID Code:
156128
Deposited By:
Deposited On:
14 Jun 2021 16:05
Refereed?:
Yes
Published?:
Published
Last Modified:
02 Sep 2021 05:16