Consistently, when the paired images were concordant—both pleasant or both unpleasant—the subjects did a good job of intuiting their partner’s feelings. When the images were discordant, however—one pleasant and one unpleasant—they tended to misfire. Someone who had seen maggots would lowball how snuggly a partner who had seen the bunny felt; the person who had seen the bunny, meanwhile, would underestimate the level of disgust the partner who saw the maggots was experiencing.
OK, so it’s hard to feel vicariously cuddly when you’re feeling personally revolted. The research confirmed that unsurprising premise. In other trials, however, the investigators went further, running the same experiments but this time scanning the subjects’ brains with functional magnetic resonance imaging (fMRI). In general, they found that when people were asked to judge discordant feelings—I feel pleasant, you probably don’t—there was a spike of activity in a region known as the right supramarginal gyrus (rSMG), located near the middle of the brain, just aft of center. When they were judging concordant feelings, the rSMG stayed quieter.
That told the scientists where the empathic error was playing out, but not how it was working: did activity in the rSMG cause the egocentric bias, or was it working to suppress it? In other words, if the rSMG stayed quiet, would our empathic skills get better or worse?