What you see depends on what you hear: temporal averaging and crossmodal integration

In our multisensory world, we often rely more on auditory information than on visual input for temporal processing. One classic demonstration of this is that the perceived rate of a visual flicker is assimilated toward the rate of a concurrent auditory flutter. To date, however, this auditory dominance effect has largely been studied using regular auditory rhythms. It thus remains unclear whether irregular rhythms have a similar impact on visual temporal processing, what information is extracted from the auditory sequence to influence visual timing, and how the auditory and visual temporal rates are integrated in quantitative terms. We investigated these questions by assessing, and modeling, the influence of a task-irrelevant auditory sequence on the type of Ternus apparent motion perceived: group motion versus element motion. The type of motion seen critically depends…
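One standard way such quantitative audiovisual integration is modeled is reliability-weighted (inverse-variance) averaging of the two temporal-rate estimates. The sketch below illustrates that general rule with made-up numbers; the function name and values are hypothetical, and this is not necessarily the model fitted in the study itself.

```python
import numpy as np

def reliability_weighted_rate(rate_a, sigma_a, rate_v, sigma_v):
    """Combine auditory and visual rate estimates by inverse-variance
    (reliability) weighting, the standard maximum-likelihood
    cue-combination rule. Illustrative only, not the authors' model."""
    w_a = 1.0 / sigma_a**2          # reliability of the auditory estimate
    w_v = 1.0 / sigma_v**2          # reliability of the visual estimate
    combined_rate = (w_a * rate_a + w_v * rate_v) / (w_a + w_v)
    combined_sigma = np.sqrt(1.0 / (w_a + w_v))
    return combined_rate, combined_sigma

# Example: a precise 4 Hz auditory flutter paired with a noisier 3 Hz
# visual flicker pulls the combined rate estimate toward the audition.
rate, sigma = reliability_weighted_rate(rate_a=4.0, sigma_a=0.2,
                                        rate_v=3.0, sigma_v=0.8)
print(f"combined rate = {rate:.2f} Hz (sd = {sigma:.2f})")
```

Under this rule the more reliable (lower-variance) modality dominates, which is consistent with the general observation that audition tends to dominate vision in temporal tasks.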