Much has been written about the departure of Ezra Klein, of Wonkblog fame, from the Washington Post to a new “Project X” at Vox Media; some coverage has focused on the rise of journalists’ brands in their own right, while other pieces have looked at the business viability of such a venture.
But the piece that interested me most was Benjamin Wallace's in New York magazine, which discussed Klein's philosophy of news and the problem he wanted to solve.
Klein’s theory of the news grew out of his frustration with the industry’s relentless presentism, with the fact that, because media organizations prioritize what’s new (that’s why it’s called news), an article about the latest development in Syria’s civil war would likely not mention the single most important fact necessary to understand what is happening: the historical enmity between Alawites and Sunnis. There is little allowance made for readers coming to a story late and an assumption that anyone who’s been following a story over time will remember all the relevant contextual information.
You have to love the word ‘presentism’, and it’s a great one to steal; certainly Wonkblog has done its share to get beyond the day’s news and focus on things that Klein believes matter to readers, and to examine them in a lively but detailed way. It’s not designed in the way that Homicide Watch, Politifact or Connected China are – which is to say, with an underlying data structure that aggregates daily updates into more persistent types of content – but it hits all the themes of structured journalism, from the rethinking of what news and the atomic structure of news should be, to the new ways people come to information, to the idea that we haven’t begun to really take advantage of the longevity of content in the digital age. As the New York piece notes:
The answer, as Klein sees it, lies in the handling of what he calls “persistent content,” the more static information that makes the new stuff make sense. And here, he believes, the Internet has untapped potential. Traditional media organizations have taken advantage of the Internet’s speed but not its longevity.
People who think about digital journalism distinguish between what they call unchanging “stock content” and ephemeral “flow content.” Klein believes that distinction is unhelpfully stark. “We’re interested in ending the ‘versus’ there,” he says. “We believe there are rivers and lakes of content that work together.”
Absolutely. The goal shouldn’t be to run two news organizations, one “fast” and one “in-depth” that don’t speak to each other; it should be to figure out how to more effectively turn the first into the building blocks of the second. (Which isn’t to say we don’t need independent, in-depth reporting – of course we do. But we should be able both to create more value out of the daily reporting we do and to collect data that will help us when we do want to dig deeper. More importantly, such a structure lets users have access to up-to-date information when they want it, not when we decide to write it.)
The trickier question is how to build newsroom systems that allow those rivers and lakes to exist effectively. Some single-purpose sites – Politifact, Homicide Watch and Connected China – have managed it, but solving for general, all-purpose news is a much bigger problem. Just as importantly, running such sites at scale is an equally big challenge. So far, most ventures into this area have been relatively small and focused, and maybe that’s the way of the future.
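To make the "rivers and lakes" idea concrete, here is a minimal sketch of what such an underlying data structure might look like. All names here (Topic, Update, the Syria example) are hypothetical illustrations, not any real newsroom system: each daily "flow" item is attached to a persistent "stock" page, so the contextual page stays current without anyone rewriting it.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Update:
    """A single piece of daily 'flow' coverage (the river)."""
    day: date
    headline: str

@dataclass
class Topic:
    """A persistent 'stock' context page (the lake)."""
    name: str
    background: str                                   # the static explainer
    updates: list = field(default_factory=list)       # the river feeding the lake

    def add(self, update: Update) -> None:
        self.updates.append(update)

    def latest(self, n: int = 3) -> list:
        """Most recent updates, newest first."""
        return sorted(self.updates, key=lambda u: u.day, reverse=True)[:n]

# Hypothetical example: a topic page carrying the context the daily story omits.
syria = Topic("Syrian civil war",
              "Background: the long-standing Alawite-Sunni divide.")
syria.add(Update(date(2014, 1, 20), "Peace talks open in Geneva"))
syria.add(Update(date(2014, 1, 22), "Talks stall over transition plan"))

print(syria.latest(1)[0].headline)   # the newest development, shown alongside the background
```

The point of the sketch is the join, not the classes: because every update is filed against a persistent topic, the reader who arrives late gets the background and the latest development on one page.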
There’s also the question of how best to integrate narrative into such data-driven sites. People want up-to-date, contextual information, it’s true – but they also want well-written, engaging narrative stories. Creating stories at scale and at speed is an expensive business, and while machine-generated stories may ultimately turn out to be a solution here, we’re still a little ways away from that.
Another question is how context is defined and built into the content. As the Atlantic’s Conor Friedersdorf notes in a recent piece, “presentism” isn’t the only issue the media needs to address.
In The Press Effect, Kathleen Hall Jamieson and Paul Waldman argue that reporting on presidential campaigns is distorted by reporters’ tendency to see all new events through the lens of established narratives. During Election 2000, Al Gore was tagged as a liar and George W. Bush as an idiot. Any grain of truth in those stereotypes was exaggerated due to confirmation bias. I raise this example because the problem wasn’t journalism that focused on the present at the expense of the important. Rather, the attempt to filter the present through what was perceived to be important went astray, because journalists had simplistic, wrongheaded notions of what was important. The reporting was undermined by flawed context.
It’s true, of course, that context – like any content – carries some kind of bias. But the question for structured journalism news organizations is whether assumptions are hard-wired into the way they collect data, and how nimble they can be if those assumptions prove unfounded.
That said, it’s encouraging to hear the plans for Project X; it’s certainly not like this patch is overcrowded with people experimenting with new forms of journalism, and it’s great to see more such ventures.