Facebook rates users' trustworthiness in an attempt to stop no-platforming behavior

This previously unreported rating system was developed as part of Facebook’s effort to combat fake news. Facebook gives users the option to report something they believe to be false. However, the company quickly found that a lot of people were abusing the system as just another way to carry out partisan warfare. Facebook wanted some way to sort out who was actually reporting fake stories and who was simply reporting anything that offended their sensibilities whether it was true or not. From the Washington Post:


It’s “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” [Facebook product manager Tessa] Lyons said…

The system Facebook built for users to flag potentially unacceptable content has in many ways become a battleground. The activist Twitter account Sleeping Giants called on followers to take technology companies to task over the conservative conspiracy theorist Alex Jones and his Infowars site, leading to a flood of reports about hate speech that resulted in him and Infowars being banned from Facebook and other tech companies’ services. At the time, executives at the company questioned whether the mass reporting of Jones’s content was part of an effort to trick Facebook’s systems. False reporting has also become a tactic in far-right online harassment campaigns, experts say...

In 2015, Facebook gave users the ability to report posts they believe to be false. A tab on the upper right-hand corner of every Facebook post lets people report problematic content for a variety of reasons, including pornography, violence, unauthorized sales, hate speech and false news…

“I like to make the joke that, if people only reported things that were false, this job would be so easy!” said Lyons in the interview. “People often report things that they just disagree with.”

I’ve highlighted that last sentence above, the one about false reporting becoming a tactic in far-right harassment campaigns, because it seems like a tell. The Washington Post wanted to make sure you knew it wasn’t just the left making false reports, so they added this caveat. Notice that it doesn’t even say this is happening at Facebook per se, only that it’s happening.


I’m sure there are some people on the right making these kinds of false reports, but I suspect that’s less common. Why? Because the far left routinely equates speech with violence and consequently believes in no-platforming those they disagree with. That’s exactly what this form of gaming the Facebook reporting system is all about: Silencing opponents.

Over the weekend I wrote about Facebook’s apology for pulling down some Prager U videos. Facebook restored the videos, apologized and said it was investigating what happened, which suggested it didn’t know. I didn’t quite buy that.

I don’t believe for a minute that Facebook is confused about what happened here. My best guess is that a bunch of progressive trolls reported the videos and they were pulled down by an algorithm. Eventually, a real person looked at the situation and realized the complaints were nonsense. So why not just admit that’s what happened? Why do we have to play this game where they are continuing to look into it?

If I had to guess why Facebook isn’t just coming clean here, it’s that they don’t want to encourage 10,000 more trolls to use the same tactics, creating a big mess for the company to clean up and a PR nightmare.

In theory, this 0-to-1 trustworthiness rating system could be used to prevent incidents like this in the future. People who report videos because they disagree with the content would gradually be ignored, limiting their ability to no-platform others. That seems like a good thing. The only alternative is to let the digital no-platforming mobs run wild.
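To make that mechanism concrete, here is a minimal sketch of how such a reporter score could work. This is my own illustration, not Facebook's actual formula, which the company has not published: a user's 0-to-1 score rises when fact-checkers uphold their reports and falls when they are rejected, and a low score means the user's future reports carry almost no weight.

```python
# Hypothetical sketch of a 0-to-1 reporter-trustworthiness score.
# Illustration only; Facebook has not published its actual formula.

class ReporterScore:
    """Tracks how often a user's false-news reports are confirmed by fact-checkers."""

    def __init__(self):
        self.confirmed = 0   # reports fact-checkers upheld
        self.rejected = 0    # reports fact-checkers dismissed

    def record(self, upheld: bool) -> None:
        if upheld:
            self.confirmed += 1
        else:
            self.rejected += 1

    @property
    def score(self) -> float:
        """0-to-1 trustworthiness estimate with a neutral 0.5 prior (Laplace smoothing)."""
        return (self.confirmed + 1) / (self.confirmed + self.rejected + 2)


def report_weight(score: float) -> float:
    """Weight a new report by the reporter's track record; chronic false
    reporters approach zero influence instead of being hard-banned."""
    return score ** 2  # quadratic falloff is an arbitrary illustrative choice


# Example: a user whose reports keep getting rejected loses influence.
user = ReporterScore()
for _ in range(8):
    user.record(upheld=False)
print(f"score={user.score:.2f}, weight={report_weight(user.score):.3f}")
# score=0.10, weight=0.010; this user's future reports barely move the queue
```

Note the design choice in the sketch: rather than banning bad-faith reporters outright, their influence just decays toward zero, which fits the Post's description of the score as reportedly one behavioral signal among many rather than an absolute measure of credibility.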


Is it possible that this system could be gamed in some way that further harms conservative content online? I suppose anything is possible but, based on the description, it sounds to me like this particular feature is going to wind up hurting the censorious left far more than it harms the right. But given that this is Facebook, I don’t blame people for being suspicious about the company’s motives.
