Facebook tried to limit QAnon. It failed.

Since then, a militia movement on Facebook that called for armed conflict on the streets of U.S. cities has gained thousands of new followers. A QAnon Facebook group has also added hundreds of new followers while questioning common-sense pandemic medical practices, like wearing a mask in public and staying at home while sick. And a campaign that claimed to raise awareness of human trafficking has steered hundreds of thousands of people to conspiracy theory groups and pages on the social network.

Perhaps the most jarring part? At times, Facebook’s own recommendation engine — the algorithm that surfaces content for people on the site — has pushed users toward the very groups discussing QAnon conspiracies, according to research conducted by The New York Times, despite the company’s assurances that this would not happen.

None of this was supposed to take place under new Facebook rules targeting QAnon and other extremist movements. The Silicon Valley company’s inability to quash extremist content, despite frequent flags from concerned users, is now renewing questions about the limits of its policing and whether it will be locked in an endless fight with QAnon and other groups that see it as a key battleground in their online war.
