Wow: Facebook employees asked Mark Zuckerberg whether they should try to stop Trump

Why would Mark Zuckerberg want to stop Trump? He wants a president who’ll increase H-1B visas, doesn’t he?

Kidding aside, this is ominous.

Every week, Facebook employees vote in an internal poll on what they want to ask Zuckerberg in an upcoming Q&A session. A question from the March 4 poll was: “What responsibility does Facebook have to help prevent President Trump in 2017?”

A screenshot of the poll, given to Gizmodo, shows the question as the fifth most popular…

Facebook has toyed with skewing news in the past. During the 2012 presidential election, Facebook secretly tampered with 1.9 million users’ news feeds. In 2010, the company ran a 61-million-person experiment to see whether it could influence the real-world voting behavior of millions of people; an academic paper published about the secret experiment claimed that Facebook increased voter turnout by more than 340,000 people. In 2012, Facebook deliberately experimented on its users’ emotions: the company, again secretly, tampered with the news feeds of nearly 700,000 people and concluded that Facebook can basically make you feel whatever it wants you to.

If Facebook decided to, it could gradually remove any pro-Trump stories or media from its site—devastating for a campaign that runs on memes and publicity.


They’re a private company, of course, and can suppress whatever speech they like, although Eugene Volokh noted to Gizmodo that actual collusion with a candidate might be deemed a contribution — a big one — for purposes of campaign-finance law. Nothing’s stopping them from saying “no Trump material on our site,” though. Or rather, nothing’s stopping them legally.

There would, of course, be a market backlash among Trump fans if Facebook did something like that … unless they secretly censored all (or most) of the Trump material without telling anyone. Imagine if Facebook quietly instituted a policy in which users could post pro-Trump material but only, say, 20 percent or so of their actual postings would be visible to “friends.” You’d never know that your posts weren’t being viewed, and the fact that you yourself would be able to view *some* pro-Trump posts by other people would convince you that nothing untoward is afoot.

As Gizmodo notes, they’ve gamed people’s feeds before to try to tweak election results. Their reach is so fantastically huge that they could, in theory, tilt an election in one or more swing states simply by encouraging particular demographics to vote.
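To see why that kind of shadow suppression would be so hard to notice, consider a purely hypothetical sketch of the "20 percent visible" scenario. Nothing here reflects any real Facebook system; every name, keyword, and threshold below is invented for illustration.

```python
import hashlib

# Hypothetical shadow-suppression filter (illustration only).
SUPPRESSED_TOPIC = "trump"   # invented keyword trigger
VISIBILITY_RATE = 0.20       # only ~20% of matching posts ever shown

def is_visible(post_id: str, text: str) -> bool:
    """Decide whether friends ever see this post."""
    if SUPPRESSED_TOPIC not in text.lower():
        return True  # non-matching posts are always shown
    # Deterministic per-post "coin flip": hashing the post ID makes the
    # decision stable, so some pro-Trump posts always survive -- which is
    # exactly what would make the filtering invisible to users.
    digest = hashlib.sha256(post_id.encode()).hexdigest()
    return int(digest, 16) % 100 < VISIBILITY_RATE * 100

# Across many posts, roughly 20% of matching material gets through:
shown = sum(is_visible(f"post-{i}", "Vote Trump!") for i in range(10_000))
print(shown / 10_000)  # prints a value near 0.20
```

The point of the deterministic hash is the part the post emphasizes: because the filtering is consistent and partial rather than total, any individual user checking their feed sees nothing obviously wrong.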

And if you think Facebook’s reach is long, imagine what Google could do. If you missed it last August, enjoy Robert Epstein’s piece, “How Google Could Rig the 2016 Election.” The same principles apply: People respond behaviorally to well-placed “nudges,” and there’s no better placement than on the first page of Google’s search results. We’re stupid creatures with short attention spans. Lay a suggestion right in front of us when we’re trying to make a decision and many of us will leap at it.


Google’s search algorithm can easily shift the voting preferences of undecided voters by 20 percent or more—up to 80 percent in some demographic groups—with virtually no one knowing they are being manipulated, according to experiments I conducted recently with Ronald E. Robertson.

Given that many elections are won by small margins, this gives Google the power, right now, to flip upwards of 25 percent of the national elections worldwide. In the United States, half of our presidential elections have been won by margins under 7.6 percent, and the 2012 election was won by a margin of only 3.9 percent—well within Google’s control.
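The arithmetic behind that claim is easy to check. A back-of-the-envelope sketch, using illustrative inputs that are my assumptions rather than figures from Epstein's study:

```python
# Back-of-the-envelope check: a small shift among undecided voters can
# exceed a typical winning margin. Inputs are illustrative assumptions,
# not numbers from Epstein's research.

undecided_share = 0.10   # assume 10% of the electorate is undecided
shift_rate = 0.20        # assume 20% of undecideds are nudged one way

# A nudged voter moves from one column to the other, so the net swing
# in the two-way margin is twice the share of voters actually moved.
voters_moved = undecided_share * shift_rate   # 2% of the electorate
net_margin_swing = 2 * voters_moved           # 4-point margin swing

print(f"{net_margin_swing:.0%}")
```

Under these assumptions the induced swing comes out to four points, larger than the 3.9-point margin the excerpt cites for 2012, which is the sense in which such elections are "well within" reach.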

Google denies that they’d ever tamper with organic search results, of course. Meanwhile, Epstein’s research has found that the political demographic in the U.S. that’s most susceptible to suggestion via Google is, er, moderate Republicans, 80 percent of whom could be nudged in one direction or another politically according to an experiment. That’s the group you’d target if you were looking to tank an election for Democrats, and coincidentally it’s the group Hillary Clinton will be targeting this fall to reject the GOP’s “radical” nominee, whoever that may be. Get moderate GOPers to stay home or switch sides, especially in purple states, and Republicans are cooked. Google and Facebook could help a lot. And here’s Gizmodo telling you that the idea of stopping Trump has already been broached within Facebook.


The big deterrent to tech giants manipulating voter behavior, I assume, is exactly what’s happened today, which is that it might leak. If Facebook wants to tank positive references to Trump without anyone finding out, it would need to keep that information closely held. Presumably only a few staffers would know; somehow they’d have to impose a filter on postings without any GOP-friendly employees there finding out. If it did leak and the public had to confront the fact that national elections are now to some extent just mazes being run by rats to achieve Facebook’s desired outcome, the backlash would be nasty. Republican users would flee. Congress would start howling about antitrust laws. It’s a cinch that new regulations would be proposed attempting to limit the companies’ political influence. The First Amendment would bar any law requiring them to be evenhanded in providing a platform to political candidates, but public pressure might convince Facebook, Google, etc., to submit “voluntarily” to some new watchdog that would police the algorithms to make sure no deliberate manipulation was going on. Whatever the outcome, it’d be darkly ironic, at a moment when the left’s lecturing mom-and-pop bakeries that public accommodations may not discriminate, if a business with massively more reach tipped a national election to the left because the owners hate Republicans.


Facebook responded to the Gizmodo piece this afternoon by encouraging fans of all candidates to use their platform and promising that “we have not and will not use our products in a way that attempts to influence how people vote.” You trust this guy, right?
