Gaze behavior in online and in-person concert and film viewing
Shreshth Saxena, Maya B. Flannery, Joshua L. Schlichting, and Lauren K. Fink
Department of Psychology, Neuroscience, and Behaviour
McMaster University, Canada
Index spatiotemporal visual AND auditory attention
Investigate multimodal interactions (audio, visual, text, etc.)
Investigate links between behavioural and physiological responses
Inform cinematic and aesthetic choices
1. Fink, L. K., Hurley, B. K., Geng, J. J., & Janata, P. (2018). A linear oscillator model predicts dynamic temporal attention and pupillary entrainment to rhythmic patterns. Journal of Eye Movement Research, 11(2). https://doi.org/10.16910/jemr.11.2.12
2. Alreja, A., Ward, M. J., Ma, Q., et al. (2023). A new paradigm for investigating real-world social behavior and its neural underpinnings. Behavior Research Methods, 55, 2333–2352. https://doi.org/10.3758/s13428-022-01882-9
Eye tracking hardware restricts naturalistic experimentation
Studies and findings have limited ecological validity
image source: https://www.ed.ac.uk/ppls/psychology
Scaling eye tracking to naturalistic real-world settings
Image source: Unsplash
Saxena, Lange, & Fink. 2022. Towards efficient calibration for webcam eye-tracking in online experiments. ACM Symposium on Eye Tracking Research and Applications (ETRA '22). doi
Saxena, Fink, & Lange, 2023. Deep learning models for webcam eye tracking in online experiments. Behavior Research Methods. doi
(N=59)
(N=20)
Day 1: Film -> Performance
Day 2: Performance -> Film
Film duration: 1 hour 20 minutes
Performance duration: ~1 hour
N = 60 participants (30 each day)
Recruitment from university newsletters and public advertising
38 self-identified as women
Mean age = 34 years, range = 16–82
One participant dropped out on Day 1, after the film screening session
Participants equipped with Pupil Labs Neon eye tracking glasses
World views (scene-camera video)
Saxena, Visram, Lobo, Mirza, Khan, Pirabaharan, Nguyen, & Fink. (2024). Multi-person eye tracking for real-world scene perception in social settings. arXiv preprint arXiv:2407.06345 https://doi.org/10.48550/arXiv.2407.06345
but also daydreaming, disinterest, or mind-wandering.
Group 1
Group 2
(Gaze must be transformed to a static reference point: the central view)
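The transformation above can be sketched with a homography that maps each wearer's scene-camera gaze coordinates into the shared static reference frame. This is a minimal illustration, not the authors' pipeline: the four corner correspondences (e.g., screen or stage corners detected in each scene-camera frame) are assumed inputs, and `estimate_homography` is a hypothetical helper implementing the standard direct linear transform.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform (DLT): fit a 3x3 homography H mapping
    each (x, y) in src to the corresponding (u, v) in dst (4+ pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # Solution is the right singular vector of A with the smallest
    # singular value (the null space of A, up to scale).
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)

def gaze_to_reference(gaze_xy, scene_corners, ref_corners):
    """Map one gaze point from scene-camera pixels into the shared
    static reference frame defined by ref_corners."""
    H = estimate_homography(scene_corners, ref_corners)
    x, y = gaze_xy
    v = H @ np.array([x, y, 1.0])
    return v[:2] / v[2]  # homogeneous -> Cartesian
```

In practice the corner detection would run per video frame (the scene camera moves with the head), so H must be re-estimated continuously before gaze points from different wearers can be compared in the same coordinates.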
Group 1
Group 2
Group 1
Group 2
Group 1
Group 2
Short time frame: rolling mean ISC over 30-second windows
Long time frame: mean ISC over the entire duration
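The two time frames above can be sketched with one windowed inter-subject correlation (ISC) routine: per window, correlate every pair of participants' gaze traces and average the pairwise coefficients. This is a generic sketch, not the authors' exact implementation; window length in samples depends on the (assumed) sampling rate.

```python
import numpy as np

def rolling_isc(signals, win, step=None):
    """Windowed inter-subject correlation.

    signals: (n_subjects, n_samples) array, e.g. horizontal gaze position.
    win:     window length in samples (e.g. 30 s * sampling rate).
    step:    hop size in samples (defaults to non-overlapping windows).
    Returns the mean pairwise Pearson r per window."""
    step = step or win
    n_sub, n_samp = signals.shape
    pairs = np.triu_indices(n_sub, k=1)  # unique subject pairs
    out = []
    for start in range(0, n_samp - win + 1, step):
        seg = signals[:, start:start + win]
        r = np.corrcoef(seg)             # n_sub x n_sub correlation matrix
        out.append(r[pairs].mean())      # average over unique pairs
    return np.array(out)
```

Setting `win` to the full recording length collapses this to the long-time-frame measure (a single mean ISC over the entire duration).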
Overall higher joint gaze dispersion during Performance
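One common way to quantify joint gaze dispersion (a sketch, assuming gaze has already been mapped into the shared reference frame; not necessarily the authors' exact metric) is the RMS distance of each participant's gaze from the group centroid, per frame:

```python
import numpy as np

def joint_gaze_dispersion(gaze):
    """Frame-wise joint gaze dispersion.

    gaze: (n_subjects, n_frames, 2) gaze positions in a shared frame
          (NaN where a sample is missing).
    Returns (n_frames,) RMS distance from the group centroid."""
    centroid = np.nanmean(gaze, axis=0)            # (n_frames, 2)
    d = np.linalg.norm(gaze - centroid, axis=-1)   # (n_subjects, n_frames)
    return np.sqrt(np.nanmean(d ** 2, axis=0))     # (n_frames,)
```

Higher values indicate that audience members are looking at different places at the same moment, consistent with the performance (live, 3-D, multi-focal) affording more varied looking than the film.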
Analyse and compare the proposed eye-tracking measures from online participants
For the in-person audience, pupil activity was recorded in addition to gaze and blinks. Temporal pupil activity has been linked to audiovisual attention in short-duration trials; the collected data can be used to study its long-duration dynamics.
Evaluate the influence of auditory vs. visual stimulation on gaze, blinks, and pupil time series
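One way to probe the auditory influence sketched above is a lagged correlation between the stimulus audio envelope and the pupil time series, since pupil responses trail the stimulus. This is a hypothetical analysis sketch, not a reported result; both inputs are assumed to be resampled to a common rate and cleaned of blink artifacts.

```python
import numpy as np

def lagged_correlation(audio_env, pupil, max_lag):
    """Pearson r between an audio envelope and pupil size at lags
    0..max_lag samples, with the pupil lagging the audio."""
    rs = []
    for lag in range(max_lag + 1):
        a = audio_env[:len(audio_env) - lag] if lag else audio_env
        p = pupil[lag:]
        n = min(len(a), len(p))
        rs.append(np.corrcoef(a[:n], p[:n])[0, 1])
    return np.array(rs)
```

The lag of the peak coefficient gives a rough estimate of the pupil's response latency to acoustic fluctuations; comparing peak magnitudes between film and performance would speak to the relative weight of auditory stimulation in each setting.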