Have you ever seen colors in an optical illusion that weren’t actually there? Or wondered why some people saw “the dress” as white and gold while others saw it as blue and black? In essence, how can colors appear different from what they actually are?
Lighting and Perception
Sometimes, the answer lies in lighting; other times, it depends on our memories or what our photoreceptors are doing.
In 2015, a photo of a dress sparked a heated debate with a simple question: What color is it? “The dress was so unusual; we really don’t have many controversies about colors,” Bevil Conway, a neuroscientist and visual scientist at the National Institutes of Health, told Live Science. “We don’t disagree about white and gold or blue and black. The disagreement is about whether or not those colors applied to this image.”
Conway and his team investigated the dilemma by asking 1,400 participants what color they thought the dress was and what kind of light they assumed it was photographed under. They found that people’s assumptions about the lighting shaped their perception: those who assumed warm or incandescent light saw blue and black (the dress’s actual colors), while those who assumed cool light or daylight saw white and gold.
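One rough way to build intuition for this “discounting the illuminant” idea is a von Kries-style white-balance calculation, a standard trick in digital imaging rather than anything from Conway’s study. In the Python sketch below, the pixel and illuminant values are invented for illustration: dividing the same ambiguous pixel by a warm versus a cool assumed light pushes it toward blue or toward a pale warm tone.

```python
# Illustrative sketch only (not from the study): a von Kries-style correction
# that "discounts" an assumed illuminant by dividing each channel by it.
# The pixel and illuminant values below are made-up examples.

def discount_illuminant(pixel, illuminant):
    """Divide each RGB channel by the assumed illuminant, then rescale to 0-255."""
    corrected = [p / max(i, 1e-6) for p, i in zip(pixel, illuminant)]
    scale = 255 / max(corrected)
    return tuple(round(c * scale) for c in corrected)

if __name__ == "__main__":
    ambiguous_pixel = (120, 105, 130)   # a muted, slightly bluish tone
    warm_light = (255, 200, 140)        # assumed incandescent illuminant
    cool_light = (170, 190, 255)        # assumed daylight/shadow illuminant

    print(discount_illuminant(ambiguous_pixel, warm_light))  # leans bluer
    print(discount_illuminant(ambiguous_pixel, cool_light))  # leans warmer and lighter
```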
Memory and Color Perception
Memory also plays a role in how we see color. When we view a familiar object, our brains assign it its expected hue or even enhance its color. In a 2024 study, researchers asked participants to identify the colors of 17 objects under five different lighting conditions. Despite the varying illumination, participants identified the objects’ original colors accurately, demonstrating a phenomenon known as color constancy.
This memory effect helps explain why you might “see” color in the dark even without light stimulation: your brain constructs color from memory. When an object is unfamiliar, your brain may instead assign color based on expectations. For instance, a train photograph created by psychologist Akiyoshi Kitaoka can appear blue even though it contains no blue pixels.
Context and Color Intensity
An object’s positioning or context can also affect color perception. A red object appears “redder” on a green background than on a white one, demonstrating how neighboring colors can influence perceived hues.
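To see the contrast effect for yourself, you can generate the two test images with a few lines of code. The sketch below uses Pillow (an arbitrary choice; any image library would do), and the specific colors and sizes are made up: the red square has identical pixel values in both files, yet it tends to look more vivid against the green background.

```python
# Minimal sketch (not from the article): generate two swatches with the same
# red patch on different backgrounds to observe simultaneous contrast.
# Requires Pillow (pip install pillow); colors and sizes are arbitrary choices.
from PIL import Image, ImageDraw

def make_swatch(background_rgb, patch_rgb=(200, 30, 30), size=300, patch=100):
    """Draw a centered square of patch_rgb on a background of background_rgb."""
    img = Image.new("RGB", (size, size), background_rgb)
    draw = ImageDraw.Draw(img)
    offset = (size - patch) // 2
    draw.rectangle([offset, offset, offset + patch, offset + patch], fill=patch_rgb)
    return img

if __name__ == "__main__":
    # The red patch uses identical pixel values in both images;
    # only the surrounding color differs.
    make_swatch((30, 160, 30)).save("red_on_green.png")
    make_swatch((255, 255, 255)).save("red_on_white.png")
```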
Tired Photoreceptors
Sometimes, our cones—color photoreceptor cells in the retina—can trick the brain into “seeing” something that’s not there. For example, staring at a flag image for 30 to 60 seconds and then looking at a white space can result in an afterimage where the flag appears in red and blue due to photoreceptor fatigue.
Most people have three types of cones (long-, middle-, and short-wavelength), each most sensitive to a different band of wavelengths. When you stare at a red sheet of paper, the long-wavelength cones work hardest and become fatigued. If you then look at a white sheet, the middle- and short-wavelength cones respond relatively more strongly, producing a perceived greenish color, an effect known as a negative afterimage.
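A crude way to picture why red leaves a greenish trace is to treat the afterimage as the complement of the stared-at color: with the long-wavelength signal suppressed, roughly the inverse remains. The Python snippet below is a toy calculation, not a model of real cone responses.

```python
# Toy illustration (not a physiological model): approximate a negative
# afterimage by inverting each 8-bit RGB channel, so saturated red maps to cyan-green.

def negative_afterimage(rgb):
    """Return the rough complementary color by inverting each channel."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

if __name__ == "__main__":
    red = (255, 0, 0)
    print(negative_afterimage(red))  # (0, 255, 255): a cyan-green, like red's afterimage
```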
Ongoing Research
There is still much we don’t understand about color perception, especially regarding which neurons compare cone activity in the retina. Advancing our understanding requires productive dialogue between art, philosophy, and science.