What Google Needs to Do About YouTube Hate Speech

In the four days since I wrote How Google’s YouTube Spreads Hate, in which I discussed both how much I enjoy and respect YouTube and how unacceptable its handling of hate speech has become, an advertiser boycott of YouTube and the Google ad networks has been spreading rapidly. Some of the biggest advertisers on the planet have pulled their ads over concerns about being associated with videos containing hateful, extremist, or related content.

It’s turned into a big news story around the globe, and has certainly gotten Google’s attention.

Google has announced some changes, and apparently more are in the pipeline, so far relating mostly to making it easier for advertisers to avoid having their ads appear alongside such content.

But let’s be very clear about this. Most of that content, much of which resides on long-established YouTube channels with sometimes vast view counts, shouldn’t be eligible for monetization at all, and in many cases shouldn’t be permitted on YouTube at all. (By the way, it’s a common ploy for YT uploaders to solicit support via third-party sites as a mechanism to evade YT monetization disablement.)

The YouTube page regarding hate speech is utterly explicit:

We encourage free speech and try to defend your right to express unpopular points of view, but we don’t permit hate speech.

Hate speech refers to content that promotes violence or hatred against individuals or groups based on certain attributes, such as:

race or ethnic origin
religion
disability
gender
age
veteran status
sexual orientation/gender identity

There is a fine line between what is and what is not considered to be hate speech. For instance, it is generally okay to criticize a nation-state, but not okay to post malicious hateful comments about a group of people solely based on their ethnicity.

Seems pretty clear. But in fact, YouTube is awash with racist, antisemitic, and a vast array of other videos that unquestionably violate these terms, many of them on established, cross-linked YouTube channels containing nothing but such material.

How easy is it to stumble into such garbage?

Well, for me here in the USA, the top organic (non-ad) YouTube search result for “blacks” is a video showing a car being wrecked with the title: “How Savage Are Blacks In America & Why Is Everyone Afraid To Discuss It?” — including the description “ban niggaz not guns” — and also featuring a plea to donate to a racist external site.

This video has been on YouTube for over a year and has accumulated over 1.5 million views. Hardly hiding.

While it can certainly be legitimately argued that there are many gray areas when it comes to speech, on YouTube there are seemingly endless lists of videos that are trivially located and clearly racist, antisemitic, or otherwise in violation of YouTube’s hate speech terms.

And YouTube helps you find even more of them! On the right-hand suggestion panel right now for the video I mentioned above, there’s a whole list of additional racist videos, including titles like: “Why Are So Many Of U Broke, Black, B!tches [sic] Begging If They Are So Strong & Independent?” — and much worse.

Google’s proper course is clear. They must strongly enforce their own Terms of Service. It’s not enough to give advertisers control over where their ads appear, or even to end those ads entirely. Videos and channels that are in obvious violation of the YT TOS must be removed.

We have crossed the Rubicon in terms of the Internet’s impact on society, and laissez-faire attitudes toward hate speech content are now intolerable. The world is becoming saturated in escalating hate speech and related attacks, and even tacit acceptance of these horrors — whether spread on YouTube or by the Trump White House — must be roundly and soundly condemned.

Google is a great company with great people. Now they need to grasp the nettle and do the right thing.

–Lauren–

How Google’s YouTube Spreads Hate

I am one of YouTube’s biggest fans. Seriously. It’s painful for me to imagine a world now without YouTube, without the ability to both purposely find and serendipitously discover all manner of contemporary and historical video gems. I subscribe to YouTube Red because I want to help support great YT creators (it’s an excellent value, by the way).

YouTube is perhaps the quintessential example of a nexus where virtually the entire gamut of Internet policy issues meet and mix: content creation, copyrights, fair use, government censorship, and a vast number of other issues are all in play.

The scale and technology of YouTube are nothing short of staggering, and the work required to keep it all running, in terms of both infrastructure and evolving policies, is immense. When I was consulting to Google several years ago, I saw much of this firsthand, and had the opportunity to meet many of the excellent people behind the scenes.

Does YouTube have problems? Of course. It would be impossible for an operation of such scope to exist without problems. What we really care about in the long run is how those problems are dealt with.

There is a continual tension between entities claiming copyrights on material and YouTube uploaders. I’ve discussed this in considerable detail in the past, so I won’t get into it again here, other than to note that it’s very easy for relatively minor claimed violations (whether actually accurate or not) to result in ordinary YouTube users having their YouTube accounts forcibly closed, without effective recourse in many cases. And while YouTube has indeed improved their appeal mechanisms in this regard over time, they still have a long way to go in terms of overall fairness.

But a far more serious problem area with YouTube has been in the news repeatedly lately — the extent to which hate speech has permeated the YouTube ecosystem, even though hate speech on YouTube is explicitly banned by Google in the terms of use on this YouTube help page.

Before proceeding, let’s set down some hopefully useful parameters to help explain what I’m talking about here.

One issue needs clarifying at the outset: the First Amendment to the United States Constitution does not require that YouTube or any other business provide a platform for the dissemination, monetization, or spread of any particular form of speech. The First Amendment applies only to governmental restrictions on speech, which is the true meaning of the term censorship. This is why concepts such as the horrific “Right To Be Forgotten” are utterly unacceptable, as they impose governmentally enforced third-party censorship onto search results.

It’s also often suggested that it’s impossible to really identify hate speech because — some observers argue — everyone’s idea of hate speech is different. Yet from the standpoint of civilized society, we can see that this argument is largely a subterfuge.

For while there are indeed gray areas of speech where even attempting to assign such a label would be foolhardy, there are also areas of discourse where not assigning the hate speech label would require inane and utterly unjustifiable contortions of reality.

Videos from terrorist groups explicitly promoting violence are an obvious example. These are universally viewed as hate speech by all civilized people, and to their credit the major platforms like YouTube, Facebook, et al. have been increasingly leveraging advanced technology to block them, even at the enormous “whack-a-mole” scales at which they’re uploaded.
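Neither this post nor the platforms themselves spell out how that blocking technology works, but the general shape of re-upload blocking is easy to sketch. Below is a deliberately minimal, hypothetical illustration: a blocklist of cryptographic digests of known-banned files, checked on every upload. Real systems rely on robust perceptual fingerprints rather than exact hashes, precisely because trivial re-encoding defeats the naive version shown here.

    import hashlib

    # Hypothetical blocklist of digests of previously removed videos.
    banned_digests = set()

    def digest(video_bytes):
        # Exact-match fingerprint. Real systems use perceptual hashing,
        # which survives re-encoding, cropping, and watermarking.
        return hashlib.sha256(video_bytes).hexdigest()

    def ban(video_bytes):
        banned_digests.add(digest(video_bytes))

    def admit_upload(video_bytes):
        # Reject any byte-identical re-upload of banned material.
        return digest(video_bytes) not in banned_digests

    clip = b"...video payload..."
    ban(clip)
    assert not admit_upload(clip)       # exact re-upload is blocked
    assert admit_upload(clip + b"x")    # one changed byte evades naive hashing

The gap demonstrated by that final line is exactly why the production versions of these systems are hard to build, and why “whack-a-mole” is the right metaphor.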

But now we move on to other varieties of hate speech that have contaminated YouTube and other platforms. And while they’re not usually as explicitly violent as terrorist videos, they’re likely even more destructive to society in the long run, with their pervasive nature now penetrating even to the depths of the White House.

Before the rise of video and social media platforms on the Internet, we all knew that vile racists and antisemites existed, but without effective means to organize they tended to be restricted to their caves in Idaho or their Klan clubhouses in the Deep South. With only mimeograph and copy machines available to propagate their raving-infested, postally distributed newsletters, their influence was mercifully limited.

The Internet changed all that by creating wholly new communications channels that permit these depraved personalities to coordinate and disseminate their messages in ways that are orders of magnitude more effective, vastly increasing the dangers they represent to decent human beings.

Books could be written about the entire scope of this contamination, but this post is about YouTube’s role, so let’s return to that now.

In recent weeks the global media spotlight has repeatedly shone on Google’s direct financial involvement with established hate speech channels on YouTube.

First came the PewDiePie controversy. The continued dabbling in antisemitic videos by YouTube’s most-subscribed star, videos he insists are just “jokes” even as his Hitler-worship continues, exposed YouTube’s intertwining with such behavior to such an extent that Google found itself in a significant public relations mess. This forced Google to take some limited enforcement actions against his YouTube channel. Yet the channel is still up on YouTube. And still monetizing.

Google is in something of a bind here. Having created this jerk, who now represents a significant income stream for both himself and the company, Google would find it difficult to publicly admit that his style of hate is still exceedingly dangerous, since it helps to normalize such sickening concepts. This is true even if we accept, for the sake of argument, that he actually means it all in a purely “joking” way (I don’t personally believe that to be the case, however). For historical precedent, one need only look at how the antisemitic “jokes” of 1930s Germany became a springboard to global horror.

But let’s face it, Google really doesn’t want to give up that income stream by completely demonetizing PewDiePie or removing his channels completely, nor do they want to trigger his army of obscene and juvenile moronic trolls and a possible backlash against YouTube or Google more broadly.

Yet from an ethical standpoint these are precisely the sorts of actions that Google should be taking, since, as I mentioned above, “ordinary” YouTube users can routinely lose their monetization privileges, or be thrown off of YouTube completely, over even relatively minor alleged violations of the YouTube or Google Terms of Service.

There’s worse, of course. If we term PewDiePie’s trash as relatively “soft” hate speech, we must then look to the even more serious hate speech that also consumes significant portions of YouTube.

I’m not going to give any of these fiends any “link juice” by naming them here. But it’s trivial to find nearly limitless arrays of horrible hate speech videos on YouTube under the names of both major and minor figures in the historical and contemporary racist/antisemitic/alt-right movements.

A truly disturbing aspect is that once you find your way into this depraved area of YouTube, you discover that many of these videos are fully monetized, meaning that Google is actually helping to fund this evil — and is profiting from it.

Perhaps equally awful, if you hit one of these videos’ watch pages, YouTube’s highly capable suggestion engine will offer you a continuous recommended stream of similar hate videos over on the right-hand side of the page — even helpfully surfacing additional hate speech channels for your enjoyment. I assume that if you watched enough of these, the suggestion panels on the YouTube home page would also feature these videos for you.
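YouTube’s actual recommendation system is proprietary, so I can’t show you its internals, but the clustering effect I’m describing falls naturally out of any similarity-driven recommender. Here’s a deliberately toy sketch, with invented video names and tags and plain cosine similarity standing in for whatever signals YouTube really uses, of why entering one tightly cross-linked cluster keeps surfacing more of the same:

    from math import sqrt

    # Toy catalog mapping video -> tag set. All names and tags here are
    # invented for illustration; YouTube's real features are unknown.
    videos = {
        "cooking_101":  {"food", "howto", "kitchen"},
        "knife_skills": {"food", "howto", "technique"},
        "hate_clip_a":  {"conspiracy", "race", "rant"},
        "hate_clip_b":  {"conspiracy", "race", "channelX"},
        "hate_clip_c":  {"race", "rant", "channelX"},
    }

    def similarity(a, b):
        # Cosine similarity between two tag sets treated as 0/1 vectors.
        overlap = len(videos[a] & videos[b])
        return overlap / sqrt(len(videos[a]) * len(videos[b]))

    def recommend(current, n=3):
        # Rank every other video purely by similarity to the current one.
        others = [v for v in videos if v != current]
        return sorted(others, key=lambda v: similarity(current, v), reverse=True)[:n]

    # Watch one video in the cluster, and the top suggestions are more of it:
    print(recommend("hate_clip_a"))  # ['hate_clip_b', 'hate_clip_c', ...]

Add a feedback loop in which watching a suggestion pulls your profile toward that cluster’s tags, and the effect only gets stronger. The engine isn’t malicious; it’s doing exactly what it was built to do.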

Google’s involvement with such YouTube channels became significant news over the last couple of weeks, as major entities in the United Kingdom angrily pulled their advertising after finding it featured on the channels of these depraved hatemongers. Google quickly announced that they’d provide advertisers with more controls to help avoid this in the future, but this implicitly suggests that Google doesn’t plan actions against the channels themselves, and Google’s “we don’t always get it right” excuse is wearing very, very thin given the seriousness of the situation.

Even if we were (completely inappropriately) to consider such hate speech to fall under the umbrella of acceptable speech, what we see on YouTube today is not merely a “simple” platform for hate speech. It is the provision of financial resources to hate speech organizations, and direct help in spreading their messages of hate.

I explicitly assume that this has not been Google’s intention per se. Google has tried to take a “hands off” attitude toward “judging” YouTube videos as much as possible. But the massive rise in hate-based speech and attacks around the world, extending (at least tacitly) to the highest levels of the U.S. federal government under the Trump administration, is a clear and decisive signal that this is no longer a viable course for an ethical and great company like Google.

It’s time for Google to extricate YouTube from its role as a partner in hate. That this won’t come without significant pain and costs is a given.

But it’s absolutely the correct path for Google to take — and we expect no less from Google.

–Lauren–

Google and Older Users

Alphabet/Google needs at least one employee dedicated to vetting its products on a continuing basis for usability by older users, an important and rapidly growing demographic that is increasingly dependent on Google services in daily life.

I’m not talking here about accessibility in general; I’m talking about someone whose job is specifically to make sure that Google’s services don’t leave older users behind due to user interface and/or other associated issues. Otherwise, Google is essentially behaving in a discriminatory manner, and the last thing that I or they should want to see is the government stepping in (via the ADA or other routes) to mandate changes.

–Lauren–

“Google Experiences” Submission Page Now Available

Recently in Please Tell Me Your Google Experiences For “Google 2017” Report, I solicited experiences with Google — positive, negative, neutral, or whatever — for my upcoming “Google 2017” white paper report.

The response level has been very high and has led me to create a shared, public Google Doc to help organize such submissions.

Please visit the Google Experiences Suggestions Page to access that document, through which you may submit suggested text and/or other information. You do not need to be logged into a Google account to do this.

Thanks again very much for your participation in this effort!

–Lauren–

Simple Solutions to “Smart TVs” as CIA Spies

I’m being bombarded with queries about Samsung “Smart TVs” being used as bugs by the CIA, as discussed in the new WikiLeaks data dump.

I’m not in a position to write up anything lengthy about this right now, but there is a simple solution to the entire “smart TV as bug” category of concerns — don’t buy those TVs, and if you have one, don’t connect it to the Internet directly.

Don’t associate it with your Wi-Fi network — don’t plug it into your Ethernet.

Buy a Chromecast or Roku or similar dongle that will provide your Internet programming connectivity via HDMI to that television — these dongles don’t include microphones and are dirt cheap compared to the price of the TV itself.

In general, so-called smart TVs are not a good buy even when they’re not acting as bugs.
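One addition of my own, beyond the advice above: if a TV was already joined to your network before you read this, you can roughly verify that it’s now isolated by sweeping your LAN for its hardware address. A minimal sketch using the Python scapy library, assuming a hypothetical 192.168.1.0/24 home subnet and a MAC address copied from the TV’s network settings menu:

    # Requires the scapy package (pip install scapy) and root privileges.
    from scapy.all import ARP, Ether, srp

    # Hypothetical values; substitute your own subnet and the MAC address
    # shown in the TV's network settings menu.
    HOME_SUBNET = "192.168.1.0/24"
    TV_MAC = "aa:bb:cc:dd:ee:ff"

    # Broadcast an ARP "who-has" for every address on the subnet; each
    # reply reveals a live device's MAC address.
    answered, _ = srp(
        Ether(dst="ff:ff:ff:ff:ff:ff") / ARP(pdst=HOME_SUBNET),
        timeout=3,
        verbose=False,
    )
    live_macs = {reply.hwsrc.lower() for _, reply in answered}

    if TV_MAC.lower() in live_macs:
        print("TV is still answering on the network -- NOT isolated.")
    else:
        print("TV did not answer the ARP sweep (a good sign, not a guarantee).")

A sufficiently subverted TV could of course stay silent to ARP while still transmitting, which is why never joining it to the network in the first place remains the real fix.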

Now, seriously paranoid readers might ask “Well, what if the spooks are subverting both my smart TV and my external dongle? Couldn’t they somehow route the audio from the TV microphone back out to the Internet through hacked firmware in the dongles?”

The answer is theoretically yes, but it’s a significantly tougher lift for a number of technical reasons. Even for that scenario, though, the solution is simple: kill the power to the dongle when you’re not using it.

Unplug it from the TV’s USB jack if you’re powering it that way. (I mean, if you’re paranoid, you might consider the possibility that the hacked TV firmware is still supplying power to the dongle even when it’s supposed to be off, and that the dongle has been hacked to not light its power LED in that situation, eh?)

But if you’re powering the dongle from a wall adapter, and you unplug that, you’ve pretty much ended that ballgame.

–Lauren–