Fake News and Google: What Does a Top Google Search Result Really Mean?

Controversy continues to rage over how Holocaust denial sites and related YouTube videos have achieved multiple top and highly ranked search positions on Google for various permutations of the question “Did the Holocaust really happen?” — and what, if anything, Google ultimately intends to do about these outright racist lies achieving such prominence in its search results.

If you’re like most Internet users, you’ve been searching on Google and viewing the resulting pages of blue links for many years now.

But here’s something to ponder that you may not have ever really stopped to think about in depth: What does a top or otherwise high search result on Google really mean?

This turns out to be a remarkably complex issue.

The ranking of search results is arguably the most crucial aspect of core search functionality. I don’t know the details of how Google’s algorithms make those determinations, and if I did know I couldn’t tell you — this is getting into “crown jewel” territory. This is one of Google’s most important and best-kept secrets.

It’s not just important from a business and competitive standpoint, but also in terms of serving users well.

Google is continually bombarded by folks using all manner of “dirty tricks” to try to boost their search rank and visibility — in the parlance of the trade, Black Hat SEO (Search Engine Optimization). Not all SEO per se is evil — simply having a well-organized site that uses modern coding practices is essentially a perfectly acceptable and recommended form of “White Hat” SEO.

But if the details of Google’s ranking algorithms were known, they could theoretically help underhanded players use various technical tricks to try to “game” the system and achieve fraudulently high search rankings.

It’s crucial not to confuse search links that are the results of these Google algorithms — technically termed “organic” or “natural” search results — with paid ad links that may appear above those organic results. Google always clearly marks the latter as “Ad” or “Sponsored” and these must always be considered in the context of being paid insertions that are dependent on the advertisers’ continuing ability to pay for them.

Until a few years ago, Google’s organic search results always represented “simply” what Google felt were the “best” or “most relevant” links for a given user’s query.

But the whole situation became enormously more complex when Google began offering what it deemed to be actual answers to questions posed in some queries, rather than only the familiar set of links.

In simple terms, such answers are typically displayed above (and/or to the right of) the usual search result links. They can come from a wide variety of sources, often related to the top organic search result, with one prominent source being Wikipedia.

Google’s philosophy about this — repeatedly stated publicly — is that if a user is asking a straightforward question and Google knows the straightforward answer, it can make sense to provide that answer directly rather than only the pages of blue links.

This makes an enormous amount of good sense.

Yet it also introduced a massive complication which is at the foundation of the Holocaust denial and other fake news, fake information controversies.

Google Search has earned enormous trust around the world. Users assume that when Google ranks organic results to a query, it does so based on a sound, scientific analysis.

And here’s the absolutely crucial point: It is my belief, based on continuing interactions with Google users and other data I’ve been collecting over an extended period, that most Google users do not commonly differentiate between what Google considers to be “answers” and what Google considers “merely” to be ordinary search result links.

That is, users overall have come to trust Google to such an extent that they assume Google would not respond to a specific question with highly ranked links that are outright lies and falsifications.

Again, Google doesn’t consider all of those to be “specific answers” — rather, Google considers the vast majority to be simply the “best” or “most relevant” links based on the internal churning of its algorithms.

Most Google users don’t make this distinction. To them, the highest ranking organic links that appear in response to questions are assumed to likely be the “correct” answers, since they can’t imagine Google knowingly highly ranking fake news or false information in response to such queries.

As Strother Martin’s character “Captain” famously proclaimed in the 1967 film “Cool Hand Luke” – “What we’ve got here is failure to communicate.”

Part of the problem is that Google’s algorithms appear outwardly to be tuned toward topics where specific answers are not controversial. It’s one thing to see a range of user-perceived answers to a question like “What is the best flavor of ice cream?” But when it comes to the truth of the Holocaust for example, there is no room for maneuvering, any more than there is when answering other fact-based questions, such as “Is the moon made of green cheese?”

Many observers are calling for Google to manually eliminate or downrank outright lies like the Holocaust denials.

I am unenthusiastic about such approaches. I would much prefer that scalable, automated methods be employed in these contexts whenever possible. Some governments are already proposing false “solutions” that amount to horrific new censorship regimes (that could easily make the existing and terrible EU “Right To Be Forgotten” look like a veritable picnic by comparison).

I would much prefer to see this set of issues resolved via various forms of labeling to indicate highly ranked items that are definitively false (please see: Action Items: What Google, Facebook, and Others Should Be Doing RIGHT NOW About Fake News).

Also important could be explicit notices from Google indicating that it does not endorse such links in any way and does not represent them as “correct answers” to the associated queries. A general educational outreach by Google to help users better understand what highly ranked search results actually represent could also be very useful.
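
To make the labeling and notice ideas a bit more concrete, here is a minimal, purely hypothetical sketch (in Python) of how results matching a curated list of definitively debunked topics might be flagged, and how a blanket non-endorsement notice might accompany question-style queries. The data structures, the curated topic list, and every name here are illustrative assumptions of mine — they do not reflect anything Google actually exposes or does internally.

```python
# Hypothetical sketch only: illustrates per-result warning labels plus a
# blanket non-endorsement notice for question-style queries.

from dataclasses import dataclass, field
from typing import List, Optional

# Assumed: a curated set of topics already established as definitively false,
# maintained by some fact-checking process outside this sketch.
DEBUNKED_TOPICS = {"holocaust denial", "moon made of green cheese"}

@dataclass
class SearchResult:
    url: str
    title: str
    topics: List[str]                      # topics inferred for the page (assumed)
    labels: List[str] = field(default_factory=list)

def label_results(query: str, results: List[SearchResult]) -> Optional[str]:
    """Attach warning labels to results matching debunked topics and return
    a non-endorsement notice when the query is phrased as a question."""
    for result in results:
        if any(topic in DEBUNKED_TOPICS for topic in result.topics):
            result.labels.append(
                "Warning: this page promotes claims that have been "
                "definitively established as false."
            )
    if query.strip().endswith("?"):
        return ("These links are ranked by relevance algorithms; they are "
                "not endorsed as correct answers to your question.")
    return None
```

The point of the sketch is simply that this kind of labeling — per-result warnings drawn from curated fact-checking data, plus a general notice for question-style queries — is the sort of thing that could be applied automatically and at scale, rather than requiring manual intervention on each individual page.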

As emotionally upsetting as the fake news and fake information situation has become, especially given the prominent rise of violent, racist, often politically motivated lies in this context, there are definitely ways forward out of this current set of dilemmas, so long as both we and the firms involved acknowledge that serious actions are needed and that the status quo is definitely no longer acceptable.

–Lauren–
I have consulted to Google, but I am not currently doing so — my opinions expressed here are mine alone.
– – –
The correct term is “Internet” NOT “internet” — please don’t fall into the trap of using the latter. It’s just plain wrong!