On Tuesday this week, the European Commission announced its first list of designated Very Large Online Platforms – or VLOPs – that will be subject to “content moderation” requirements and obligations to combat “disinformation” under the EU’s Digital Services Act (DSA). As VLOPs, the designated services will be required “to assess and mitigate their systemic risks and to provide robust content moderation tools.”
Or as a subheading in the Commission announcement pithily puts it: “More diligent content moderation, less disinformation.”
As discussed in my previous articles on the DSA here and here, the legislation creates enforcement mechanisms – most notably, the threat of massive fines – to ensure that online platforms comply with the commitments to remove or otherwise suppress “disinformation” that they undertook in the EU’s hitherto ostensibly voluntary Code of Practice on Disinformation.
Unsurprisingly, the list of designated VLOPs includes a variety of services offered by the Code’s most high-profile signatories: Twitter, Google, Meta, Microsoft, and TikTok.