Belatedly – why do so many of my posts involve use of the word “belatedly,” I wonder? – here’s a quick post about a research study that was written up in the New York Times in December. (Yes, I know that was last year.)
It’s about implicit (and hidden) bias – natural prejudices that we all have (and are often blind to), no matter how open-minded or broad-thinking we believe ourselves to be. It’s an important issue, which I’ve touched on before, and especially so now for journalists in an increasingly polarized world, where it can be tempting to take a side and see proponents of other views in a less-than-flattering light.
The bad news is that we can’t really avoid our internal biases. The good news is that, if we work at it, we can mitigate their effects.
The study itself is pretty straightforward: Experimenters set up a game with volunteers, who would then witness what they thought was another player cheating. The volunteers then got to decide how harsh a punishment should be meted out to the “cheaters.” The idea was to see if the volunteers would punish wrongdoers more harshly if they thought they were from a different group – say, supporters of a rival sports team – than if they thought they were “one of them.”
And what happened? You can pretty much guess the result.
When people made their decisions swiftly — in a few seconds or less — they were biased in their punishment decisions. Not only did they punish out-group members more harshly, they also treated members of their own group more leniently. The same pattern of bias emerged in a pair of follow-up experiments in which we distracted half of the punishers.
Well, duh. But the fact that this kind of tribalism is deeply ingrained in humanity doesn’t make it any better. And for journalists, whose job is often to assess people’s credibility quickly and make decisions about how the information they provide will be used – in other words, to figure out how much to trust someone and how to treat them in a story – this should be sobering news.
We often think we’re unbiased or objective, but what the experiment shows is that we can be easily swayed by how similar we think the person we’re interviewing is to us.
But there is good news: You can lessen the impact of such biases, sometimes simply by taking a little more time to think about it.
But we also found that people could overcome these biased instincts if they engaged in rational deliberation. When people had the chance to reflect on their decision, they were largely unbiased, handing out equal punishments to in-group and out-group members.
Of course, it’s probably not as easy as that in the real world, where you sometimes have to make decisions very quickly, or where people are in some cases actively trying to deceive you.
But regardless of the difficulty of combating our internal, built-in prejudices, the critical first step is to be aware of them. Maybe we can’t fix them – but at least we’ll know we have something that needs to be fixed.
Acknowledging the truth about ourselves — that we see and think about the world through the lens of group affiliations — is the first step to making things better.