I’ve already read what Taylor had to say about Alex Jones being ejected from most social media platforms, and what Allahpundit thought about it as well. No matter how many times I go over this in my head, I’m left unsatisfied with the answers and solutions being put forward.

A couple of other voices are weighing in this week with what seems, at least, to be a more nuanced approach. Even if I still don’t entirely agree, these are arguments worth considering. One is found in an essay by David French at the New York Times. It’s followed up in a response from Jonah Goldberg at National Review.

French makes the argument that there’s clearly room for a certain amount of supervision and boundaries in terms of what gets posted on social media. But the trick is to avoid imbalance in enforcement by restricting the ban hammer or other content moderation to things already covered under the law, primarily libel and slander.

There are reasons to be deeply concerned that the tech companies banned Alex Jones. In short, the problem isn’t exactly what they did, it’s why they did it.

Rather than applying objective standards that resonate with American law and American traditions of respect for free speech and the marketplace of ideas, the companies applied subjective standards that are subject to considerable abuse. Apple said it “does not tolerate hate speech.” Facebook accused Mr. Jones of violating policies against “glorifying violence” or using “dehumanizing language to describe people who are transgender, Muslims and immigrants.” YouTube accused Mr. Jones of violating policies against “hate speech and harassment.”

Goldberg finds great appeal in this argument (the title of his piece is “Three Cheers for David French”) and I’ll grant that there’s something attractive about the concept. Jonah slips back into a portion of the debate we’ve had here before over whether Facebook, Twitter and YouTube need to be treated as public utilities, pointing out the hypocrisy of net neutrality supporters who egged on Facebook and the rest of the social media platforms to give Jones the boot.

But part of the problem is that platforms such as Google, YouTube, Twitter, Facebook, etc. operate almost like public utilities. Indeed that’s one of the ironies about the battle lines drawn over Alex Jones. As a broad generalization, the people who loved net neutrality, precisely because they want the Internet to be like a public utility, cheered Big Internet for banning Jones from its platforms. Meanwhile, many of the people who hated net neutrality were outraged by the idea that private companies could “censor” voices they didn’t like. A real public utility can’t deny services to customers just because it doesn’t like what they say or think.

The problems with the entire “Facebook should be handled like a public utility” concept, and the many ways the analogy falls apart, are too numerous to go into here yet again. But with that said, I’ll agree with Jonah’s point that the electric company doesn’t come over and cut off your power because you peddle theories about 9/11 or tell people that Jews control the weather. So let’s come full circle back to the concept that restrictions on social media users can and should be limited to content which violates accepted laws covering libel and slander.

This seems a tidy solution at first glance because it gets rid of Alex Jones (assuming he’s found liable in the defamation suits brought against him by the Sandy Hook families) but leaves your favorite crazy Twitter follows from both sides of the aisle free to continue pumping out their thoughts at full blast every hour of the day. Jonah refers to the solution as having eliminated the “eye of the beholder” problem. But before we begin popping open the champagne bottles to celebrate a fine day’s work, would someone mind pointing out for me the person or persons at each of these social media companies who will be determining which updates, tweets and videos rise to the level of libel or slander? Even the courts are frequently unable to agree on such questions. That’s why there are so many cases heard every year with such a mixed bag of results. It’s one of the worst gray areas under the law.

Also, when somebody is accused of such offenses, they have a chance to defend themselves in a court of law so that something approaching a fair verdict can be reached. It sounds to me like we’re still talking about allowing people at these companies to make the decision on their own, with no avenue of appeal. And, again, who will be making the call? We’re talking about companies that many of you have already argued, with credible evidence, are heavily biased in favor of liberal ideology. You don’t suppose the bar for libel and slander might be set just a bit higher when the invective is being hurled at a conservative, do you?

Even if we somehow envision a set of fair judges at each company making these decisions, none of this gets us past the corkboard problem which I’ve brought up here repeatedly. There are an estimated 350 million Facebook status updates and photos posted every single day. The daily tweet count is in excess of half a billion. Who, pray tell, is going to review them all to make sure they pass muster? It’s flatly impossible. So I suppose somebody will have to develop an app to scan for keywords and automate the process. Oh, wait… they already did that. But who developed the app, and which words are naughty and which are nice? You see the problem, right?

That brings us back to the second half of the corkboard issue. Goldberg perhaps inadvertently touches on it when he says this: “I certainly have no problem with private entities — including corporations such as Google and Apple, but also every journalistic enterprise — using their own judgment about what kind of speech they will publish or associate with.”

If we accept this thought, we accept the idea that Twitter, Facebook and YouTube are “publishers” and therefore somehow responsible for the content which appears on their platforms. Jonah offers the comparison that National Review would almost certainly never publish an editorial by Alex Jones, but would have the right to do so if it wished. That’s exactly the point. The people at NR are publishers. They have editors. They have professionals making conscious decisions about each and every character that is printed in their magazine or shows up on their website. The social media giants are not publishers. Their users are. And if you’re going to hold anyone accountable, it should be the author/publisher of the offending material.

So I’m afraid we still don’t have a solution. If banning Alex Jones was acceptable, then you pretty much have to accept that Twitter suspending Kat McKinley was acceptable too. And if you disagree with that, we’re right back where we started, arguing over definitions and gray areas with tech giants who are accountable to no one for their decisions.