Just riffing off my recent post on Noise, the new book by Daniel Kahneman, Olivier Sibony and Cass Sunstein: OK, so people are flawed, illogical and inconsistent, and that leads to injustice – even when the people making the decisions aren’t biased.
(Of course it’s worse when they are.)
And we as journalists should expose injustice in systems whenever we find it – no matter what the cause is.
But more importantly, we also can and should delve into how those systems could be better – and think about how these ideas could apply to our own decision making and our own organizations, such as when we hire or promote, since that’s where systems, journalism and diversity intersect.
So how does one reduce “noise” and inconsistency in decision making?
The book lists a series of practices that help, mostly involving slowing us down from our very human habits of jumping to conclusions, relying on intuition, or building a narrative to support a decision we’re predisposed to make, regardless of the facts. And – more controversially – turning more to algorithms to drive greater consistency.
These aren’t hard things to do – they’re just a pain, and they take away some of what we think of as our essential humanity: our ability to make exceptions, look beyond the facts, go with our gut. Sometimes those are good instincts – but many times they’re not.
Among the suggestions: Break judgments up into individual tasks, so we’re less inclined to make holistic decisions, overriding inconvenient facts. Get multiple, independent opinions, and aggregate them into a wisdom-of-crowds judgment. Think statistically, not in terms of narrative or neat causal stories. Resist premature intuitions. Try to make judgments on just the issues that are relevant.
Take hiring: We often base those decisions on interviews and discussions with a hiring panel – possibly one of the worst ways to find the best candidate. Instead, where there are specific skills we’re looking for that can be quasi-objectively judged, let’s make applicants take tests that measure those skills. And anonymize the entries, so we’re not swayed by who someone is, or how they look, or what school they went to.
If you’re trying to hire a copy editor, have candidates take a copy-editing test. And scrub their names from the submissions. Have the hiring panel grade the results – independently. Then compare notes. If you’re looking for a reporter, get anonymized reporting memos from the applicants.
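That blind-grading procedure is simple enough to sketch in code. This is only an illustration of the idea, not anything from the book: the names, scores and functions here are all hypothetical, and it uses a plain mean as the wisdom-of-crowds aggregation rule.

```python
import statistics
import uuid

def anonymize(submissions):
    """Replace each candidate's name with a random ID so graders
    can't see whose test they're scoring.
    Returns (blinded submissions, key for de-anonymizing later)."""
    key, blinded = {}, {}
    for name, text in submissions.items():
        sid = uuid.uuid4().hex[:8]
        key[sid] = name
        blinded[sid] = text
    return blinded, key

def aggregate(grades_by_panelist):
    """Average each submission's scores across independent graders –
    a simple wisdom-of-crowds aggregate. Panelists grade separately
    and only compare notes afterward."""
    scores = {}
    for grades in grades_by_panelist:
        for sid, score in grades.items():
            scores.setdefault(sid, []).append(score)
    return {sid: statistics.mean(vals) for sid, vals in scores.items()}
```

The point of the structure is that each panelist produces a full, independent set of grades before any discussion, and the identities only come back out (via the key) after the aggregate ranking exists.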
That’s not to say that you shouldn’t also interview the candidates. But the test may throw up results you weren’t expecting.
Look at the experience of US orchestras, which I wrote about some time ago:
Consider that in 1970 women comprised less than 10% of major orchestras in the US and fewer than 20% of new hires. As Mahzarin (Banaji) recounts in her book, back then auditions for new members were conducted in front of a team of seasoned musicians, often from that orchestra. You’d expect that they had well-trained ears, able to select the best candidates. And they largely picked men.
But then an interesting thing happened when they started auditioning candidates behind a curtain, taking pains not to let the panel know whether a man or a woman was playing. More women won spots.