The answer to the first question is Facebook, at least according to a recent piece in the New York Times about the platform’s News Feed algorithm and its outsized role in determining how many people come to journalism these days.
The answer to the second question is – well, it depends.
It depends, in part, on whether we’re talking about journalism in general, the public interest, or the financial well-being of your site. All of which yield potentially very different answers.
But first, the first question. As the NYT piece notes:
…Facebook is at the forefront of a fundamental change in how people consume journalism. Most readers now come to it not through the print editions of newspapers and magazines or their home pages online, but through social media and search engines driven by an algorithm, a mathematical formula that predicts what users might want to read.
Certainly fewer and fewer people are coming to news sites via the home page; most are referred in through social media and search. And that raises all sorts of questions about how well they’re being informed about things that should matter to them, and, for that matter, about the kind of journalism that’s being created to cater to them.
The Times piece flags the fear that, increasingly, people are being trapped in a “filter bubble” of only like-minded views, as readers select news that fits their preconceptions and as search and filter algorithms become tuned to the sorts of things they’re likely to want to read.
The shift raises questions about the ability of computers to curate news, a role traditionally played by editors. It also has broader implications for the way people consume information, and thus how they see the world.
Fair enough. And there’s no question news consumption habits are being re-invented at a dizzying pace. But it’s not clear that machine curation of news is that much worse than human curation; it’s different, certainly, but just as machines do some things more poorly than humans, they also do some things better than humans. A person looking for in-depth LGBT news in the mainstream media 20 years ago wouldn’t have been well-served by human editors then, and while that reader today may well get a better collection of such stories from a human editor, there probably aren’t enough editors out there to create great packages for every conceivable set of interests. Is having a machine curate a so-so set of stories for my interests that much worse than having a human curate a set of stories that I’m only marginally interested in?
True, there’s also the notion that there are stories all people should be exposed to, and have an interest in – but ask any marginalized group in any society whether that set of stories responds to their needs, and it’s not always clear that it does.
Nor are filter bubbles restricted to the world of algorithms. Readers have always gravitated to news organizations that reflect their world view, whether Fox News or the Guardian. Computers make it easier to ignore things you disagree with, but the issue seems to be more human nature than technology.
In any case, machines – and machine curation – are here to stay. The broader question is how news organizations respond to them. For many, referrals from sites such as Facebook are critical to maintaining traffic levels, and bringing in both advertising revenue and potential subscribers.
But it’s a tough business, given online advertising rates, to build robust revenues from viral content. It’s hard to create original content at any kind of reasonable cost that you can be assured will bring in reasonable traffic, and hence online ad dollars. Even Buzzfeed, the current reigning king of virality, isn’t really in the online advertising space, despite its ability to generate huge amounts of traffic. As Felix Salmon notes:
Historically, media companies have been in the business of selling individuals to advertisers: you put together some kind of a product that people love, and then bundle that product with advertising. But BuzzFeed is different. It starts the same way, by building products that people love. But then, instead of inserting advertising into that product, it then sells advertisers its expertise at building such things.
In other words, it sells its know-how about virality, not the actual traffic.
But perhaps the smartest critique of the rush to chase Facebook-type traffic comes from Monday Note, where Frédéric Filloux dissects the economics of social traffic versus home page (and subscriber) traffic.
Roughly speaking, for a news, value-added type media, the number of page views by source goes like this:
Direct Access : 5 to 6 page views
Google Search: 2 to 3
Google News: ~1
These figures show how good you have to be in collecting readers from social sources to generate the same advertising ARPU as from a loyal reader coming to your brand because she likes it. Actually, you have to be at least six times better.
Which is another way of saying, social traffic is good, but it’s far more important to be good at getting people to come to your site and engaging them in an immersive experience. And to do that requires more than atomized content – i.e., unconnected stories that readers can find across the web, no matter how viral – a problem that Nicholas Carr famously called “The Great Unbundling” a good many years ago.
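To make Frédéric’s arithmetic concrete, here’s a minimal sketch of the comparison. The uniform-revenue assumption (every page view earning the same ad revenue), the midpoint value for search, and the function name are mine for illustration, not from Monday Note:

```python
# Rough page views per visit, by referral source, from the figures above.
# "search" uses the midpoint of the 2-3 range; "social" is assumed to sit
# near the ~1 level Filloux reports for Google News-style referrals.
PAGE_VIEWS_PER_VISIT = {
    "direct": 6.0,   # loyal reader arriving via the home page
    "search": 2.5,   # Google Search referral
    "social": 1.0,   # social / aggregator referral
}

def visits_needed(source, direct_views=6.0):
    """How many visits from `source` match one direct reader's page views,
    assuming equal ad revenue per page view."""
    return direct_views / PAGE_VIEWS_PER_VISIT[source]

for source in PAGE_VIEWS_PER_VISIT:
    print(f"{source}: {visits_needed(source):.1f} visits per direct reader")
```

Under those assumptions, a site needs roughly six social-referral visits to earn what one loyal direct reader brings in, which is the “at least six times better” claim in different clothes.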
(The NYT piece about Facebook references this issue as well, via an observation from Ben Smith, the editor in chief of Buzzfeed, who notes that he doesn’t want to see “filler” on his site:
News organizations that still publish a print edition, he said, have slots — physical holes on paper or virtual ones on a home page — that result in the publication of stories that are not necessarily the most interesting or timely, but are required to fill the space. It was partly to discourage such slot-filling that BuzzFeed did not focus on its home page when it first started, he said.)
That’s one way to approach news. Another, I’d argue, is to focus as much on the “news experience” as on the content you create – building a more immersive experience where content and exploration are tightly woven, such as in Connected China or Homicide Watch. (You knew I’d get to structured journalism at some point in this post.) They may not get massive amounts of traffic, but the people who do come tend to stay on longer, and engage more deeply with the content.
Perhaps more importantly – and somewhat old-fashionedly – it puts much more of the site’s future in your hands than in Facebook’s. And if that doesn’t make you feel better, it should at least make your investors feel a teeny bit more secure. As Frédéric notes about the questions he has about a world where Facebook and Google dominate distribution of news:
The first concerns the intrinsic valuation of a media so dependent on a single distribution provider. After all, Google has a proven record of altering its search algorithm without warning. (In due fairness, most modifications are aimed at content farms and others who try to game Google’s search mechanism.) As for Facebook, Mark Zuckerberg is unpredictable; he’s also known to do what he wants with his company, thanks to an absolute control of its Board of Directors (read this Quartz story).
None of the above is especially encouraging. Which company in the world wouldn’t be seen as fragile when depending so much on a small set of uncontrollable distributors?