For a species so steeped in visual information, humans actually aren’t very good at picking up on change (your childhood success with Highlights magazine puzzles notwithstanding). But a new computer-based model is shedding light on what we see and what we don’t, even when the obvious is right in front of us.
“Change blindness” is the term for our uncanny ability to overlook even drastic changes to the scenes around us. Researchers have long studied this peculiarity of our visual intelligence by presenting test subjects with pairs of photos of the same scene, one slightly altered (much like the barstool arcade games built around this same bit of biology).
But when humans are in charge of altering the photos, it introduces a human bias into the experiment. So researchers at Queen Mary University of London developed an algorithm that lets a computer choose how each image set is manipulated. Studies conducted with the system showed that change blindness can be predicted, and the work also turned up some interesting, perhaps counterintuitive, aspects of our visual intelligence.
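The team’s actual algorithm isn’t detailed here, but the core idea is that the program, not the experimenter, decides what gets changed. As a rough, purely illustrative sketch (the region lists, the two change types, and the Pillow-based edits are all assumptions, not the researchers’ method), that might look something like this in Python:

```python
import random
from dataclasses import dataclass
from typing import List, Tuple

from PIL import Image, ImageDraw  # pip install pillow

# Hypothetical sketch: assume each scene comes with candidate object regions
# (bounding boxes), and the computer picks which region to alter and how,
# removing the experimenter's bias from that choice.

@dataclass
class Change:
    kind: str                       # "remove_object" or "shift_color"
    box: Tuple[int, int, int, int]  # (left, top, right, bottom)

def choose_change(regions: List[Tuple[int, int, int, int]],
                  rng: random.Random) -> Change:
    """Let the program, not a human, decide what to alter."""
    kind = rng.choice(["remove_object", "shift_color"])
    return Change(kind=kind, box=rng.choice(regions))

def apply_change(img: Image.Image, change: Change) -> Image.Image:
    out = img.copy()
    if change.kind == "remove_object":
        # Crude "removal": paint the region with its own average colour.
        region = out.crop(change.box)
        avg = tuple(int(c) for c in region.resize((1, 1)).getpixel((0, 0)))
        ImageDraw.Draw(out).rectangle(change.box, fill=avg)
    else:  # shift_color: rotate the hue inside the chosen region
        region = out.crop(change.box).convert("HSV")
        h, s, v = region.split()
        h = h.point(lambda p: (p + 64) % 256)
        out.paste(Image.merge("HSV", (h, s, v)).convert("RGB"), change.box)
    return out

if __name__ == "__main__":
    rng = random.Random(42)
    scene = Image.new("RGB", (320, 240), (200, 200, 200))
    ImageDraw.Draw(scene).rectangle((60, 60, 140, 140), fill=(180, 40, 40))
    regions = [(60, 60, 140, 140), (180, 100, 260, 180)]
    change = choose_change(regions, rng)
    apply_change(scene, change).save("altered_scene.png")
    print(f"Applied {change.kind} to region {change.box}")
```

Even in a toy version like this, the original and altered photos can then be shown to subjects to measure which machine-chosen changes go unnoticed.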
For instance, researchers expected changes in color to be easier for humans to pick out. Instead, subjects detected the addition or removal of objects far more reliably than a shift in color.
Applying that knowledge to the real world, we could design better visual cues in a wide range of scenarios. Cities could design clearer signage for roadways, and marketers could learn how to draw the eye to a particular display or advertisement.