Auditory Context Alters Visual Perception
Visual inputs are often obscured, distorted, or ambiguous, and to form meaningful representations of incoming information, our visual system relies not only on the visual features of the object itself but also on the surrounding context. Most studies have focused on how visual context influences visual object perception, and it is less clear how concurrent auditory information about objects (the sound of a lawnmower, or the whistling of a tea kettle) influences which objects we see and how we experience them. Here, we investigate whether naturalistic sounds modulate the representation of visual objects. We used a visual discrimination task and a novel set of ambiguous object stimuli that were paired at random with related or unrelated sounds. Specifically, we created ambiguous stimuli by morphing together the features of two objects (Object A and Object B; e.g., a hammer and a seal), and presented these ambiguous morph stimuli with naturalistic sounds related to either Object A or Object B. Visual objects and sounds were presented simultaneously, and at the end of each trial, participants indicated which object they saw using continuous report. Overall, we found that sounds biased visual object recognition, such that the perceptual representation was pulled towards the object features that matched the sound (Exps. 1a-1b). For example, the same ambiguous hammer-seal object would appear more seal-like when paired with the sound of a seal barking, but more hammer-like when paired with the sound of a hammer striking. In a series of control experiments, we show that this effect is not driven by response bias (Exps. 2a-2b) and is not due to a general effect of expectation (Exp. 3). These results indicate that visual object representations are biased by contextual auditory information, reflecting the continuous integration of auditory and visual information during real-world perception.