Automatic detection of gaze convergence in multimodal collaboration: a dual eye-tracking technology
The paper analyses the advantages and limitations of the current technical solutions for dual eye-tracking
(DUET) in relation to the research questions from educational science about joint attention in a multimodal teaching/
learning collaboration. Current systems are shown to be insufficient for the analysis of multimodal collaboration, as the reviewed systems do not allow researchers to relate a participant's eye movements to the video of their joint performance and accompanying gestures without time-consuming manual coding. We describe a system of two low-cost Pupil Labs eye trackers and propose an open-source utility, DUET for Pupil, that automatically produces synchronized gaze data in a shared coordinate system. The data are available both as a video of the surface overlaid with gaze paths and supplementary sound waveforms, and as textual data with synchronized coordinates of the two gazes. Our empirical evaluation of this technological solution reports a spatial accuracy of 1.27° of visual angle after post-hoc calibration. The advantages, limitations, and possible further enhancements of the system are discussed.