Even the language used to describe the attack before the fact framed it as an act of internet activism. In a post on 8chan, the shooting was referred to as a “real life effort post.” An image was titled “screw your optics,” a reference to a line posted by the man accused in the Pittsburgh synagogue shooting that later became a kind of catchphrase among neo-Nazis. And the manifesto — a wordy mixture of white nationalist boilerplate, fascist declarations and references to obscure internet jokes — seems to have been written from the bottom of an algorithmic rabbit hole.
It would be unfair to blame the internet for this. Motives are complex, lives are complicated, and we don’t yet know all the details about the shooting. Anti-Muslim violence is not an online phenomenon, and white nationalist hatred long predates 4chan and Reddit.
But we do know that the design of internet platforms can create and reinforce extremist beliefs. Their recommendation algorithms often steer users toward edgier content, a loop that results in more time spent on the platform, and more advertising revenue for the company. Their hate speech policies are weakly enforced. And their practices for removing graphic videos — like the ones that circulated on social media for hours after the Christchurch shooting, despite the companies’ attempts to take them down — are inconsistent at best.