Facebook’s decision to pull down a satire piece by the Babylon Bee yesterday worried me. As I pointed out here, I wasn’t concerned that we would be deprived of the content. The Bee’s story was just an old scene from Monty Python and the Holy Grail turned into a gag about Sen. Hirono wanting to apply the witch test to Judge Amy Coney Barrett. As you can see from the image, it couldn’t have been more obvious this was all a joke. So what worried me wasn’t the loss of the content but the fact that Facebook was so obviously telling us that they didn’t care about applying their own rules fairly.
The message they were sending by pulling down this obvious joke seemed to be: We’ll do whatever we want to whoever we want. And as we’ve seen with regard to the NY Post story about Hunter Biden this week, that’s a worrisome approach for giant social media companies to take. It effectively means there are no rules, just rulers acting on their own impulses.
But late last night Facebook had a change of heart (see the tweet below):
Update: Now that this story has attracted some media attention, Facebook has reversed their decision and lifted the restrictions on our page. They now say it was just a mistake and blame it on their automated systems (even though a person manually reviewed it after our appeal).
— Seth Dillon (@SethDillon) October 22, 2020
As you can see, Facebook is now claiming this was a mistake caused by “automated systems.” That completely contradicts what the Bee had been told earlier in the day, i.e. that a person had manually reviewed the story and that the decision would stand because it “incites violence.”
I emailed Seth Dillon about the manual review yesterday. He said the response they got was pretty vague but that it had been suggested to them that the line “We must burn her!” was part of the problem. So as I speculated yesterday, Facebook was apparently claiming that this satire piece was a threat to Judge Barrett. At least that’s how it looks to me.
At this point one of two things is true. Either Facebook is telling the truth about this being a failure of an automated system, in which case it misled the Bee yesterday when it told them a person had reviewed the situation. Or Facebook is lying about this being the result of an automated system to cover up poor decision-making by some staffer or contractor. Either way, this doesn’t look very good for Facebook.
Maybe an automated system initially flagged this, but I think the correct answer is the second one. That’s bad because it once again demonstrates that Facebook is making up the rules as they go. You really have to wonder whether they would have corrected their mistake if this hadn’t gotten some media attention yesterday. My guess is they would not have. So I guess the lesson here is that it pays to push back on Facebook’s impulse to pull down whatever it wants because, at least some of the time, that leads to second thoughts.
If only Monty Python were still around to make a skit about Facebook taking down one of their own skits on the grounds that it represented an incitement to violence. I’m sure it would be hilarious.