Britain is Trying to Censor Americans – But America is Fighting Back

Ofcom has confirmed it is referring 4chan for a final enforcement decision under the Online Safety Act. The target is a Delaware company that runs an entirely anonymous imageboard from the United States, with no offices, staff, servers or assets in Britain. The demand: install age-verification systems and content filters so that British children cannot access the site, or face daily fines levied from London against an American platform. This case is not an outlier. It is the clearest real-world demonstration of what the new generation of “online safety” laws requires: private companies must build automated filters that decide, in advance, which legal speech is too harmful for minors to see. The question the regulators never quite answer is simple: what exactly does the filter catch?


In the early 2020s, a political consensus formed on both sides of the Atlantic: social media is harming children and something must be done. The result in Washington was the Kids’ Online Safety Act (KOSA); in Westminster, the Online Safety Act (OSA), which received Royal Assent in October 2023 and began enforcement in 2025. The political appeal of both measures is genuine. Adolescent mental health deteriorated in the 2010s, parents are alarmed and platforms have appeared indifferent. But good intentions do not make good law, and the form these interventions took is constitutionally and morally indefensible. Both KOSA and the OSA rest on a duty-of-care model: platforms must take “reasonable measures” or implement “proportionate systems” to prevent minors from encountering content associated with depression, anxiety, eating disorders, self-harm and suicide. This is not a regulation of conduct. It is a mandate to suppress speech based on its topic and its predicted emotional effect on a reader: the very definition of content-based regulation.

The American Civil Liberties Union (ACLU) stated the constitutional problem plainly in its July 2023 letter opposing KOSA: the bill “is a content-based regulation of constitutionally protected speech” that “will silence important conversations, limit minors’ access to potentially vital resources and violate the First Amendment”. Under Reed v. Town of Gilbert, a law is content-based if it “applies to particular speech because of the topic discussed or the idea or message expressed”. Content-based regulations are “presumptively unconstitutional”.


The ACLU identified three specific constitutional failures.

First, the speech targeted is protected. The Supreme Court has never permitted government to suppress legal speech simply because a legislature finds it unsuitable for children. In Brown v. Entertainment Merchants Association, the Court was unambiguous: “Speech that is neither obscene as to youths nor subject to some other legitimate proscription cannot be suppressed solely to protect the young from ideas or images that a legislative body thinks unsuitable for them.” Creating a “wholly new category of content-based regulation” permissible only for speech directed at children would be “unprecedented and mistaken”.

Second, these regimes fail strict scrutiny because they are not premised on demonstrated causation. As the ACLU wrote, KOSA “is not premised on a direct causal link, but instead is based on correlation, not evidence of causation”. This is a decisive legal and moral point. In Brown, the Court struck down California’s video game restriction on exactly the same grounds: the state had produced only correlative data. A law that restricts the speech of millions of people must show that the restriction will actually prevent the harm it identifies. Neither KOSA nor the OSA can clear that bar.

Third, these regimes are both under- and over-inclusive. They leave news media, books, music and magazines entirely unregulated while targeting social media platforms. And they will, inevitably, sweep up beneficial speech alongside harmful speech: 92% of parental control apps have been found to incorrectly block LGBTQ+ content and suicide-prevention resources alongside material that is genuinely harmful. Congress, the ACLU concluded, may not rely on unproven future technology to save the statute.
