The response to the 'Sokal Squared' hoax has been mixed

On Wednesday I wrote about a trio of academics who wrote bogus papers and submitted them to supposedly serious journals to see whether nonsense could be passed off as insight, given enough earnest social-justice framing. What they found was that seven of their 20 bogus papers were accepted for publication. Apparently, the line between serious feminist scholarship and complete sophistry is pretty thin.

Dubbed “Sokal Squared” after a similar stunt pulled by Alan Sokal in 1996, the hoax has generated a lot of feedback among academics. One of the most positive takes came from Yascha Mounk, a Harvard lecturer, who praised the hoax on Twitter and wrote a piece about it for the Atlantic:

If certain fields of study cannot reliably differentiate between real scholarship and noxious bloviating, they become deeply suspect. And if they are so invested in overcoming injustice that they are willing to embrace rank cruelty as long as it is presented in the right kind of progressive jargon, they are worsening the problems they purport to address.

It would, then, be all too easy to draw the wrong inferences from Sokal Squared. The lesson is neither that all fields of academia should be mistrusted nor that the study of race, gender, or sexuality is unimportant. As Lindsay, Pluckrose, and Boghossian point out, their experiment would be far less worrisome if these fields of study didn’t have such great relevance.

Today, Slate published a piece taking the opposite position. According to author Daniel Engber, “the project fails to match its headline presentation.”

It’s true that Pluckrose, Lindsay, and Boghossian tricked some journals into putting out made-up data, but this says nothing whatsoever about the fields they chose to target. One could have run this sting on almost any empirical discipline and returned the same result. We know from long experience that expert peer review offers close to no protection against outright data fraud, whether in the field of gender studies or cancer research, psychology or plant biology, crystallography or condensed matter physics. Even shoddy paste-up jobs with duplicated images and other slacker fakes have made their way to print and helped establish researchers’ careers. So what if these hoaxers did the same for fun? These examples haven’t hoodwinked anyone with sophistry or satire but with a simple fabrication of results…

Even if we push the made-up-data papers to the side, those results are still quite grave: Twenty-five percent of bullshit papers made their way through peer review. But what, exactly, does it prove?

Engber has a point that the authors of the hoax papers had a specific target, i.e. mostly feminist journals, but I think he’s really stretching when he assumes that this invalidates the authors’ point. One doesn’t have to aim at all disciplines equally in order to make a point about the problems within one. And one doesn’t need a rigorous experimental design to point out that some respected journals will publish anything that tickles their fancy. The Slate piece also overlooks the fact that several more papers would likely have been published eventually if the hoax hadn’t been outed early by journalists. The academics who wrote the papers estimate that, given a bit more time, 10-12 papers would have been accepted. That has to say something about the journals that accepted them.

I’ll let Bret Weinstein, who has faced the practical outcome of grievance studies as few other academics have, give the last word:
