Two years later we have a much better idea of what caused the varied perceptions: People’s perceived color is also informed by their perception of lighting. And the image of the dress, taken on a cellphone, left the lighting conditions highly uncertain. Was it taken inside or outside? This matters because it implies artificial or natural light. Was the dress illuminated from the front or the back? This matters because if it was back-lit, the dress would be in shadow; if front-lit, it would not.
The brain cannot be accused of epistemic modesty. It is well-known that in situations like this—where it faces profound uncertainty—it confidently fills in the gaps in its knowledge by making assumptions. Usually, those assumptions are based on what it has most frequently encountered in the past. For instance, when sensory information is uncertain, observers estimate object speeds to be slower than they actually are, presumably because slow objects are much more common in the environment than fast ones. (Indeed, most objects in any given field of view don’t move at all.) Color and lighting are no exception.
Because the illumination conditions are impossible to clearly assess in the dress image, people make assumptions about what they are. Different people make these assumptions in different ways, which is what produces the different interpretations of the dress’s color. At least, that’s what my research shows, thanks to 13,000 people, including many Slate readers, who took surveys about what they saw when they looked at the dress; the surveys also collected other information about how they generally perceived the photo and the world.