Google: No, we're not cooking search-query autocompletes for partisan results

Google says no no no, and … they may well be telling the truth. SourceFed offered up a video j’accuse on Thursday that got enough attention from Google to prompt a flat denial, although, as you’ll see, Google might have done better to outsource the rebuttal to industry experts. Matt Lieberman compares the auto-fill suggestions given by the Yahoo and Bing search engines to those produced by Google for queries about Hillary Clinton, and sees a rather dramatic difference.


Is this evidence that Google is cooking its responses to bolster Hillary Clinton’s presidential chances? Or did SourceFed cherry-pick the query types to get these dramatic differences? How do the autocomplete functions differ among the major search sites … and why would an autofill function be the target of manipulation at all?

https://www.youtube.com/watch?v=PFxFRqNmXKg

If true, Lieberman argues, it betrays the relationship between Google and its consumers. “I no longer have the same confidence” in the system, Lieberman says, calling this a serious ethical breach that even Google’s employees would find shocking and disturbing.

That brings us to Google’s response, issued yesterday:

“Google Autocomplete does not favor any candidate or cause,” said a Google spokesperson in an email to the Washington Times.

“Claims to the contrary simply misunderstand how Autocomplete works. Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name. More generally, our autocomplete predictions are produced based on a number of factors including the popularity of search terms,” said the statement.

This response is so generic and vague that it’s easy to dismiss it as corporate-speak. However, others offered more extensive and perhaps more convincing explanations of what SourceFed found. CNN’s David Goldman reported that this is nothing more than an indication of Google’s superior algorithms, which are intended to screen out false information from Autocomplete. He links to an essay from Rhea Drysdale, the CEO of a search-engine optimization company, claiming that SourceFed cherry-picked its examples:


The examples that SourceFed chose are factually incorrect. Hillary Clinton has not been charged with a crime. She has not been indicted. Google knows this, and its algorithm actually filters out inaccurate information in autocomplete.

“Our autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person’s name,” a Google spokeswoman said. “Google autocomplete does not favor any candidate or cause. Claims to the contrary simply misunderstand how autocomplete works.”

To counter SourceFed’s claim, Drysdale showed similar results for Donald Trump, in which “Donald Trump lawsuits” did not show up in autocomplete results when entering “Donald Trump la” into Google. But “Donald Trump laughing” did, despite the fact that far more people are searching about the presumptive Republican nominee’s legal battles. …

By typing in just “Hillary Clinton,” Google presents plenty of autocomplete suggestions with negative connotations, including “email” and “Benghazi.”

Searches for those two terms are way more popular than either of the cherry-picked searches that SourceFed included in its video. Google understands that “Hillary Clinton email” and “Hillary Clinton Benghazi” are synonymous with potential criminal charges or indictments, Drysdale said.
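To make the mechanics a bit more concrete, here’s a rough sketch in Python of the kind of behavior Google and Drysdale describe: rank candidate completions by how often they’re searched, then suppress any prediction that pairs a person’s name with a disparaging term. The query counts, the name list, and the term list below are invented purely for illustration; this is a toy model, not Google’s actual data or algorithm.

```python
# Toy model of popularity-ranked autocomplete with a "no disparaging terms
# next to a person's name" filter. All counts and word lists are invented
# for illustration only; Google's real system is far more elaborate.

QUERY_COUNTS = {
    "hillary clinton email": 90_000,
    "hillary clinton benghazi": 60_000,
    "hillary clinton crime": 4_000,
    "donald trump lawsuits": 30_000,
    "donald trump laughing": 8_000,
}

PERSON_NAMES = {"hillary clinton", "donald trump"}
DISPARAGING_TERMS = {"crime", "crimes", "criminal", "indictment", "lawsuits", "liar"}

def is_suppressed(query: str) -> bool:
    """True if the query pairs a person's name with a disparaging term."""
    has_name = any(name in query for name in PERSON_NAMES)
    has_disparaging = any(word in DISPARAGING_TERMS for word in query.split())
    return has_name and has_disparaging

def autocomplete(prefix: str, k: int = 4) -> list[str]:
    """Return up to k surviving completions for the prefix, most popular first."""
    candidates = [
        (count, query)
        for query, count in QUERY_COUNTS.items()
        if query.startswith(prefix.lower()) and not is_suppressed(query)
    ]
    return [query for _, query in sorted(candidates, reverse=True)[:k]]

# "email" and "benghazi" surface because they aren't on the disparaging list,
# even though "crime" is what SourceFed went looking for.
print(autocomplete("hillary clinton "))  # ['hillary clinton email', 'hillary clinton benghazi']

# "lawsuits" is filtered despite being the more popular query, so the less
# popular "laughing" is what shows up, mirroring Drysdale's Trump example.
print(autocomplete("donald trump la"))   # ['donald trump laughing']
```

Even this crude version reproduces the asymmetry in the screenshots: popular but neutral completions surface, while anything that reads as an accusation against a named person is withheld, and the filter doesn’t care which candidate the name belongs to.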


Be sure to read Drysdale’s entire post, which has plenty of its own screenshots to back up her claims. Why do Bing and Yahoo produce autocomplete suggestions that match each other but differ from Google’s? Drysdale explains that the algorithms both sites use are less complex and more literal than Google’s. “I’ve been getting paid to manipulate Google’s search results for years,” Drysdale says in her angry rebuttal to SourceFed; she knows the system’s operations and limitations. In response to SourceFed’s conspiracy-tinged accusations, Drysdale makes one of her own:

Because SourceFed told you to look up these queries, they’ve just manipulated Google’s search results.

Think about that for a minute. Google Autocomplete is powered by user behavior, personalization, trends, and lots of other factors. By telling hundreds of thousands of people (and growing) to search for these queries, SourceFed has just sent Google data supporting a massive spike of interest in these terms.

It’ll be very interesting to see what happens with these queries from here.

As someone who has been paid to manage online reputations and displace negative Google search results for years, I have to wonder if there was a different motivation behind this video, because it was either very poorly done or very strategically executed. Whatever the reason, I hope if you’ve read this far you now have a better understanding of how Google Autocomplete works and that this has absolutely nothing to do with favoring anyone.
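Drysdale’s point about the feedback loop is just as easy to illustrate with the same toy model: autocomplete is fed by what people actually type, so a video that sends a wave of viewers to run the same query pumps up that query’s count, whether or not the filter ever lets it surface. Again, the numbers and the record_search helper here are invented for illustration.

```python
def record_search(query: str, times: int = 1) -> None:
    """Simulate users running the same search; the counts feed future rankings."""
    QUERY_COUNTS[query] = QUERY_COUNTS.get(query, 0) + times

# A wave of viewers typing the video's pet query inflates its count...
record_search("hillary clinton crime", times=250_000)
print(QUERY_COUNTS["hillary clinton crime"])  # 254000

# ...but in this toy model the name-plus-disparaging-term filter still keeps
# it out of the visible suggestions, spike or no spike.
print(autocomplete("hillary clinton "))  # ['hillary clinton email', 'hillary clinton benghazi']
```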


Drysdale’s counter-accusation also seems a bit far-fetched, but it may have had its intended impact. SourceFed and Lieberman responded with an explanation of why and how they produced the video. They promise a more substantive follow-up next week, but one does get a hint of a possible walk-back in the midst of an entertaining if self-serving narrative of the impact the criticism has had on their effort.

There are three possibilities: SourceFed could have stumbled onto bias from Google, it may not have taken the time to properly research the potential reasons for these differences, or … it may have wanted to launch an attack on Google on behalf of those opposed to Hillary. The second option seems much more likely than the other two, especially given how esoteric this function is.

That brings me to this question: why? With all due respect to Lieberman’s research on the behavioral impact of online search results, that’s not what we’re discussing here. The autocomplete function merely assists in entering search criteria; it doesn’t force the user to pick one of the suggestions. I suspect most people don’t feel limited or persuaded by autofill functions, but simply proceed to launch the search they actually intended to conduct. If Google really wanted to manipulate the search process, it would aim at the results … and nothing in either video provides any evidence of manipulation there, partisan or otherwise.


Either way, there are thankfully a number of options for online searches. Perhaps people should spread their efforts across all of them as a matter of course anyway.
