Facebook has announced it will begin adding warning labels to “fake news” on its site. The plan is to add a link to stories that readers have identified as fake; the link will cite fact-checkers explaining why a story is false. Facebook described the proposed changes in a blog post:

We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully. We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations…

We’re testing several ways to make it easier to report a hoax if you see one on Facebook, which you can do by clicking the upper right hand corner of a post. We’ve relied heavily on our community for help on this issue, and this can help us detect more fake news.

We’ve started a program to work with third-party fact checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why. Stories that have been disputed may also appear lower in News Feed.

Cue the outrage mobs targeting stories or websites they don’t like as “fake news.” But Facebook users’ reports of “fake news” will be backed up by the expertise of fact-checkers. Clicking over to Poynter’s list of signatories, you find that the fact-checkers Facebook will rely on in the U.S. are ABC News, the AP, FactCheck.org, the Washington Post, Snopes, and PolitiFact.

There certainly is a lot of junk on Facebook that I wouldn’t defend and won’t miss. That said, the idea of Snopes and PolitiFact controlling the distribution of news online seems truly terrible. To take one significant example, here’s how PolitiFact rated Obama’s statement “If you like your plan, you can keep it” between 2008 and 2013.

[Image: PolitiFact’s ratings of “If you like your plan, you can keep it,” 2008–2013]
There are a couple of points to make about this sequence. The first is that it’s not just tinfoil-hat conspiracy theories from fringe websites that will be affected by Facebook’s decision to put a scarlet letter on certain news stories. It’s also major stories central to our political debate, like the one above. This was President Obama’s go-to sales line for his signature achievement in office.

The second point is that fact-checkers sometimes get it wrong. Not only was “If you like your plan…” important to the national debate, it was also hard to pin down because of the grand scope of the change being instituted. Obamacare was complex enough (and far enough off) that it was possible to argue Obama was right… until it became clear he wasn’t.

Third point: sometimes the experts are also partisans with an agenda. That was certainly the case with Obamacare. Health-care wonks like Ezra Klein and Jonathan Gruber knew a great deal about the program. They were also prepared to help their Democratic allies in government lie to the public if necessary to see it succeed. It’s not that they didn’t know the truth; it’s just that they weren’t going to share all of it (except occasionally with a friendly audience).

Now imagine applying these new rules retroactively to this story. Would any story that challenged Obama’s statement have been flagged as “fake news” prior to 2013? And not only flagged: it seems Facebook would discourage people from sharing it, and an algorithm would push it lower in the News Feed. The bottom line is that a “fake news” designation could suppress stories that later turn out to be true, possibly for years. This is just one example, but it’s a pivotal one.

Maybe Facebook will clear out a bunch of “fake news” with this new process, but it’s also very likely to taint and hamper some important (and true) news stories. Fact-checkers sometimes miss the mark, and experts aren’t always completely straight with the public. What’s really worrisome is that this new dynamic creates fresh opportunities for unscrupulous politicians who are good at stirring up controversy around stories they want to go away. It could also help create media echo chambers designed to support certain policies. Has Facebook thought through any of this? It’s not clear that it has.