I’ve really come to dislike the so-called “tech reporters” employed by supposedly serious news outlets who, for starters, don’t know much of anything about tech in any useful sense. Their beat seems to range from lionizing the latest progressive influencers on various online platforms to warning us that social media, especially Facebook and YouTube, will be the death of all of us. Freedom is a mess, it turns out, and some people want those in charge to clamp down on that freedom and impose more order.
The trend toward apocalyptic exaggeration about social media may have reached its zenith yesterday in a piece published in The Atlantic. It’s titled “Facebook Is a Doomsday Machine,” and despite the fact that Facebook is not a doomsday machine in any meaningful way, the author insists that, seriously guys, we’re all going to die because of Facebook.
I’ve been thinking for years about what it would take to make the social web magical in all the right ways—less extreme, less toxic, more true—and I realized only recently that I’ve been thinking far too narrowly about the problem. I’ve long wanted Mark Zuckerberg to admit that Facebook is a media company, to take responsibility for the informational environment he created in the same way that the editor of a magazine would. (I pressed him on this once and he laughed.) In recent years, as Facebook’s mistakes have compounded and its reputation has tanked, it has become clear that negligence is only part of the problem. No one, not even Mark Zuckerberg, can control the product he made. I’ve come to realize that Facebook is not a media company. It’s a Doomsday Machine…
Limitations to the Doomsday Machine comparison are obvious: Facebook cannot in an instant reduce a city to ruins the way a nuclear bomb can. And whereas the Doomsday Machine was conceived of as a world-ending device so as to forestall the end of the world, Facebook started because a semi-inebriated Harvard undergrad was bored one night. But the stakes are still life-and-death. Megascale is nearly the existential threat that megadeath is. No single machine should be able to control the fate of the world’s population—and that’s what both the Doomsday Machine and Facebook are built to do.
Very quickly we learn that the litany of horrors starts with the horrible things that people upload to Facebook:
Facebook has enlisted a corps of approximately 15,000 moderators, people paid to watch unspeakable things—murder, gang rape, and other depictions of graphic violence that wind up on the platform. Even as Facebook has insisted that it is a value-neutral vessel for the material its users choose to publish, moderation is a lever the company has tried to pull again and again. But there aren’t enough moderators speaking enough languages, working enough hours, to stop the biblical flood of shit that Facebook unleashes on the world, because 10 times out of 10, the algorithm is faster and more powerful than a person. At megascale, this algorithmically warped personalized informational environment is extraordinarily difficult to moderate in a meaningful way, and extraordinarily dangerous as a result.
Granted, these are horrible things, and it’s bad news that some sick individuals consider them worthy of sharing and/or consuming. If Facebook didn’t exist there would be less opportunity to spread this stuff, but also less opportunity to spread everything else that exists online, including a whole lot of stuff that is helpful: current information from around the world, plus an online library of all kinds of specialized knowledge. (Need to know how to repair your particular brand and model of washing machine? There’s a YouTube clip somewhere showing you how.) Lots of the stuff that gets attention online is basically harmless entertainment, like TikTok creators or make-up tutorials. Occasionally (okay, frequently) someone says something dumb or offensive, but as John Lennon might have said, it’s ‘nothing to get hung about.’
The author eventually admits that things seem to be even worse in the unmoderated cesspits of 4chan and 8kun. But she quickly moves on, quoting an alleged expert who claims Facebook may be worse: “The idea of a free-for-all sounds really bad until you see what the purportedly moderated and curated set of platforms is yielding … It may not be blood onscreen, but it can really do a lot of damage.”
What damage? We’re mostly left to imagine the terrible outcome that awaits. It’s something about Facebook controlling us all, but how that will happen exactly is never spelled out. We do, however, get vague solutions to the problem the author is sure exists:
If the age of reason was, in part, a reaction to the existence of the printing press, and 1960s futurism was a reaction to the atomic bomb, we need a new philosophical and moral framework for living with the social web—a new Enlightenment for the information age, and one that will carry us back to shared reality and empiricism.
The most published book since the invention of the printing press is the Bible. And printing not only brought us daily news; it also brought us yellow journalism and propaganda and, it must be said, a lot of great literature, much of which is not about shared reality or empiricism. How many people have read The Fellowship of the Ring? I’m sure it’s a great many more than have read Newton’s Principia Mathematica. And that’s okay, and not a reason to panic about the printing press.
The same applies to social media. Most of it is nonsense, and some of it is repulsive and potentially harmful, but going into a panic won’t help. We need to learn to deal with our new global communication abilities. It’s all very new, and we’re really not doing that badly so far.