Facebook as a “fake news” police force? Fuhgeddaboudit. Appeasing the social panic, largely driven by media outlets, will only make the social-media platform look less relevant, less democratic, or likely both. And Facebook knows it, or at least some of its executives do. After founder Mark Zuckerberg reversed himself and announced new efforts to filter the platform's news feed and add credibility ratings for sources, COO Sheryl Sandberg offered a cold dose of reality about their impact:
“I don’t think we have to be the publisher and we definitely don’t want to be the arbiter of the truth,” Sandberg said. “We don’t think that’s appropriate for us. We think everyone needs to do their part. Newsrooms have to do their part, media companies, classrooms and technology companies.”
“Well, we all have to do our part to make sure that people see accurate information and figuring out how we do that is something that we’re going to have to see and will evolve. But we know the goal, the goal is for people to see accurate information on Facebook and everywhere else.”
The “fake news” hysteria began with an attack on Facebook’s news feeds, one that attempted to draw conclusions despite an utter lack of evidence that it had any impact at all on voter behavior. The BuzzFeed article that launched the social panic over social media didn’t even posit a correlation, let alone causation, but Democrats from Barack Obama on down, along with nearly every national media outlet, painted “fake news” as the reason Donald Trump won the election. It seems odd that Facebook itself would cooperate with that take on its own platform, especially since “fake news” is hardly a new phenomenon, nor is it limited to online media.
In my column today for The Week, I warn Facebook about the consequences of setting themselves up as the media police, and remind them that consumers have been handling that quite well for themselves for a very long time:
Sandberg leaves out an important player in that equation: the consumer. If consumers want better news, then they need to seek it out. Consumers should not rely on a community-driven news feed for their information, but instead seek out original sources, determine which they can trust, and then verify information before sharing it.
This is not a new problem, and it didn’t originate with Facebook. Before the advent of social media, email was the favored medium for fever-swamp claims and conspiracy theories. Since the advent of the internet, websites of varying degrees of sophistication have amplified the silly and the serially stupid. Before either of those, supermarket tabloids specialized in what’s now called fake news, and all of these channels still distribute it to this day, in the same manner as social media.
And yet, America still held elections over the last few decades with all of these sources of fake news, and did so successfully. Why? Because despite the attempts to paint the U.S. electorate as a bunch of unsophisticated hicks, most adults have no problem distinguishing fake news from the real thing. Voters have more resources than ever to help them consume news responsibly. They don’t need Facebook to pre-digest their news and then spoon-feed it to them.
Facebook is a private-sector, voluntary-association community, and they can set up their system as they like. If they want to police news stories and block access to some based on their own assessment of credibility, that’s their choice. It comes, however, at the expense of choice among their members, and in the most paternalistic manner conceivable.
How well has the paternalistic, top-down, elite-endorsement model worked out? Just ask the Democrats. To be fair, they’re so caught up in the “fake news” explanation that they haven’t figured out that it’s that very model which cost them the House, the Senate, the White House, and most state legislatures over the last eight years. If Facebook wants to follow that model, let’s hope they’re prepared for that outcome, too.