When comparing two objects, people either rely on internal memories of those objects or run their hands and eyes over them to directly perceive their similarity. The latter approach, a shortcut that offloads cognition onto active perceptual operations such as eye and hand movements, places a lighter burden on memory. In a study published in the journal Attention, Perception, & Psychophysics, SFI Complexity Postdoctoral Fellow Marina Dubova and SFI Research Fellow Arseny Moskvichev demonstrate that it is also more effective.
They devised a series of experiments in which participants had to judge whether two blurred images, viewable only one isolated section at a time, were identical. Dubova and Moskvichev found that participants relied heavily on perceptual offloading, marked by frequent back-and-forth comparisons between the two images, particularly when the images were harder to discriminate. When participants could not switch between images, their responses were slower and less accurate.
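The intuition behind the result can be illustrated with a toy simulation (this is not the authors' paradigm, and every parameter below is an invented assumption): an observer judges whether two noisy "images" are the same, either by holding one image in a noisy memory or by directly re-inspecting both. Because memory adds noise on top of perception, direct comparison separates "same" from "different" pairs more reliably.

```python
import random

random.seed(0)

PATCHES = 20          # each "image" is 20 feature patches
PERCEPT_NOISE = 0.1   # noise on any direct observation (assumed)
MEMORY_NOISE = 0.5    # extra noise on a patch recalled from memory (assumed)
THRESHOLD = 0.25      # respond "same" if mean patch difference falls below this

def trial(same, use_memory):
    """Run one same/different judgment; return True if the response is correct."""
    a = [random.random() for _ in range(PATCHES)]
    b = list(a) if same else [random.random() for _ in range(PATCHES)]
    diffs = []
    for pa, pb in zip(a, b):
        seen_a = pa + random.gauss(0, PERCEPT_NOISE)
        if use_memory:
            # Patch of image A is recalled, not re-inspected: extra memory noise.
            seen_a += random.gauss(0, MEMORY_NOISE)
        seen_b = pb + random.gauss(0, PERCEPT_NOISE)
        diffs.append(abs(seen_a - seen_b))
    guessed_same = sum(diffs) / PATCHES < THRESHOLD
    return guessed_same == same

def accuracy(use_memory, n=2000):
    """Average correctness over n trials, half 'same' and half 'different'."""
    return sum(trial(i % 2 == 0, use_memory) for i in range(n)) / n

print(f"memory-based comparison:       {accuracy(True):.2f}")
print(f"direct (offloaded) comparison: {accuracy(False):.2f}")
```

In this sketch, the direct strategy scores far higher because its only noise source is perception, mirroring the finding that blocking comparison between images hurt accuracy.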
In these cases, participants often vocalized details, using language to offload cognition, a tactic that kept their responses accurate, but only as long as the stimuli were easy to verbalize. These findings highlight the role of active perception and language in visual comparison and suggest applications in visual AI research.

More information: Marina Dubova et al, The role of active perception and naming in sameness comparison, Attention, Perception, & Psychophysics (2025). DOI: 10.3758/s13414-025-03046-1
Study finds active perception aids object comparison accuracy
