How Google’s YouTube Spreads Hate

I am one of YouTube’s biggest fans. Seriously. It’s painful for me to imagine a world now without YouTube, without the ability to both purposely find and serendipitously discover all manner of contemporary and historical video gems. I subscribe to YouTube Red because I want to help support great YT creators (it’s an excellent value, by the way).

YouTube is perhaps the quintessential example of a nexus where virtually the entire gamut of Internet policy issues meet and mix — content creation, copyrights, fair use, government censorship, and a vast number more are in play.

The scale and technology of YouTube are nothing short of staggering, and the work required to keep it all running — in terms of both infrastructure and evolving policies — is immense. When I was consulting to Google several years ago, I saw much of this firsthand and had the opportunity to meet many of the excellent people behind the scenes.

Does YouTube have problems? Of course. It would be impossible for an operation of such scope to exist without problems. What we really care about in the long run is how those problems are dealt with.

There is a continual tension between entities claiming copyrights on material and YouTube uploaders. I’ve discussed this in considerable detail in the past, so I won’t get into it again here, other than to note that it’s very easy for relatively minor claimed violations (whether actually accurate or not) to result in ordinary YouTube users having their YouTube accounts forcibly closed, without effective recourse in many cases. And while YouTube has indeed improved their appeal mechanisms in this regard over time, they still have a long way to go in terms of overall fairness.

But a far more serious problem area with YouTube has been in the news repeatedly lately — the extent to which hate speech has permeated the YouTube ecosystem, even though hate speech is explicitly banned by YouTube’s own terms of use.

Before proceeding, let’s set down some hopefully useful parameters to help explain what I’m talking about here.

One issue needs to be clarified at the outset. The First Amendment to the United States Constitution does not require that YouTube or any other business provide a platform for the dissemination, monetization, or spread of any particular form of speech. The First Amendment applies only to governmental restrictions on speech — which is the true meaning of the term censorship. This is why concepts such as the horrific “Right To Be Forgotten” are utterly unacceptable, as they impose governmentally enforced third-party censorship onto search results.

It’s also often suggested that it’s impossible to really identify hate speech because — some observers argue — everyone’s idea of hate speech is different. Yet from the standpoint of civilized society, we can see that this argument is largely a subterfuge.

For while there are indeed gray areas of speech where even attempting to assign such a label would be foolhardy, there are also areas of discourse where not assigning the hate speech label would require inane and utterly unjustifiable contortions of reality.

Videos from terrorist groups explicitly promoting violence are an obvious example. These are universally viewed as hate speech by all civilized people, and to their credit the major platforms like YouTube, Facebook, et al. have been increasingly leveraging advanced technology to block them, even at the enormous “whack-a-mole” scales at which they’re uploaded.

But now we move on to other varieties of hate speech that have contaminated YouTube and other platforms. And while they’re not usually as explicitly violent as terrorist videos, they’re likely even more destructive to society in the long run, with their pervasive nature now penetrating even to the depths of the White House.

Before the rise of video and social media platforms on the Internet, we all knew that vile racists and antisemites existed, but without effective means to organize they tended to be restricted to their caves in Idaho or their Klan clubhouses in the Deep South. With only mimeograph and copy machines available to perpetuate their postal-distributed raving-infested newsletters, their influence was mercifully limited.

The Internet changed all that by creating wholly new communications channels that permit these depraved personalities to coordinate and disseminate in ways that are orders of magnitude more effective, vastly increasing the dangers that they represent to decent human beings.

Books could be written about the entire scope of this contamination, but this post is about YouTube’s role, so let’s return to that now.

In recent weeks the global media spotlight has repeatedly shined on Google’s direct financial involvement with established hate speech channels on YouTube.

First came the PewDiePie controversy. The continuing dabbling of YouTube’s most-subscribed star in antisemitic videos — which he insists are just “jokes” even as his Hitler-worship continues — exposed YouTube’s intertwining with such behavior to such an extent that Google found itself in a significant public relations mess. This forced Google to take some limited enforcement actions against his YouTube channel. Yet the channel is still up on YouTube. And still monetizing.

Google is in something of a bind here. Having created this jerk, who now represents a significant income stream to both himself and the company, Google would find it difficult to publicly admit that his style of hate is still exceedingly dangerous, as it helps to normalize such sickening concepts. This is true even if we accept, for the sake of argument, that he actually means it in a purely “joking” way (I don’t personally believe that this is the case, however). For historical precedent, one need only look at how the antisemitic “jokes” of 1930s Germany became a springboard to global horror.

But let’s face it, Google really doesn’t want to give up that income stream by completely demonetizing PewDiePie or removing his channels completely, nor do they want to trigger his army of obscene and juvenile moronic trolls and a possible backlash against YouTube or Google more broadly.

Yet from an ethical standpoint these are precisely the sorts of actions that Google should be taking, since — as I mentioned above — “ordinary” YouTube users can routinely lose their monetization privileges, or be thrown off of YouTube completely, for even relatively minor accused violations of the YouTube or Google Terms of Service.

There’s worse of course. If we term PewDiePie’s trash as relatively “soft” hate speech, we then must look to the even more serious hate speech that also consumes significant portions of YouTube.

I’m not going to give any of these fiends any “link juice” by naming them here. But it’s trivial to find nearly limitless arrays of horrible hate speech videos on YouTube under the names of both major and minor figures in the historical and contemporary racist/antisemitic/alt-right movements.

A truly disturbing aspect is that once you find your way into this depraved area of YouTube, you discover that many of these videos are fully monetized, meaning that Google is actually helping to fund this evil — and is profiting from it.

Perhaps equally awful, if you hit one of these videos’ watch pages, YouTube’s highly capable suggestion engine will offer you a continuous recommended stream of similar hate videos over on the right-hand side of the page — even helpfully surfacing additional hate speech channels for your enjoyment. I assume that if you watched enough of these, the suggestion panels on the YouTube home page would also feature these videos for you.

Google’s involvement with such YouTube channels became significant news over the last couple of weeks, as major entities in the United Kingdom angrily pulled their advertising after finding it featured on the channels of these depraved hatemongers. Google quickly announced that they’d provide advertisers with more controls to help avoid this in the future, but this implicitly suggests that Google doesn’t plan actions against the channels themselves, and Google’s “we don’t always get it right” excuse is wearing very, very thin given the seriousness of the situation.

Even if we were (entirely inappropriately) to consider such hate speech to fall under the umbrella of acceptable speech, what we see on YouTube today in this context is not merely the provision of a “simple” platform for hate speech: it is the provision of financial resources to hate speech organizations, and direct assistance in spreading their messages of hate.

I explicitly assume that this has not been Google’s intention per se. Google has tried to take a “hands off” attitude toward “judging” YouTube videos as much as possible. But the massive rise in hate-based speech and attacks around the world, now extending (at least tacitly) to the highest levels of the U.S. federal government under the Trump administration, is a clear and decisive signal that this is no longer a viable course for an ethical and great company like Google.

It’s time for Google to extricate YouTube from its role as a partner in hate. That this won’t come without significant pain and costs is a given.

But it’s absolutely the correct path for Google to take — and we expect no less from Google.


Google and Older Users

Alphabet/Google needs at least one employee dedicated to vetting their products on a continuing basis for usability by older users — an important and rapidly growing demographic of users who are increasingly dependent on Google services in their daily lives.

I’m not talking here about accessibility in general; I’m talking about someone whose job is specifically to make sure that Google’s services don’t leave older users behind due to user interface and/or other associated issues. Otherwise, Google is essentially behaving in a discriminatory manner, and the last thing that I or they should want to see is the government stepping in (via the ADA or other routes) to mandate changes.


“Google Experiences” Submission Page Now Available

Recently in Please Tell Me Your Google Experiences For “Google 2017” Report, I solicited experiences with Google — positive, negative, neutral, or whatever — for my upcoming “Google 2017” white paper report.

The response level has been very high and has led me to create a shared, public Google Doc to help organize such submissions.

Please visit the Google Experiences Suggestions Page to access that document, through which you may submit suggested text and/or other information. You do not need to be logged into a Google account to do this.

Thanks again very much for your participation in this effort!


Simple Solutions to “Smart TVs” as CIA Spies

I’m being bombarded with queries about Samsung “Smart TVs” being used as bugs by the CIA, as discussed in the new WikiLeaks data dump.

I’m not in a position to write up anything lengthy about this right now, but there is a simple solution to the entire “smart TV as bug” category of concerns — don’t buy those TVs, and if you have one, don’t connect it to the Internet directly.

Don’t associate it with your Wi-Fi network — don’t plug it into your Ethernet.
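For readers who want to enforce that isolation at the network level as well, one approach — a hypothetical sketch, assuming a Linux-based home router with iptables, and using a placeholder MAC address — is to block the TV at the router so that even an accidental network association goes nowhere:

```shell
# Hypothetical sketch: block a smart TV's Internet access at a
# Linux-based home router using iptables. The MAC address below is a
# placeholder -- substitute the one printed on the label on your set.
TV_MAC="AA:BB:CC:DD:EE:FF"

# Drop any traffic the router would otherwise forward from the TV.
iptables -I FORWARD -m mac --mac-source "$TV_MAC" -j DROP
```

Many consumer routers expose the same capability through a “MAC filtering” or “access control” menu, so no command line is required.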

Buy a Chromecast or Roku or similar dongle that will provide your Internet programming connectivity via HDMI to that television — these dongles don’t include microphones and are dirt cheap compared to the price of the TV itself.

In general, so-called smart TVs are not a good buy even when they’re not acting as bugs.

Now, seriously paranoid readers might ask “Well, what if the spooks are subverting both my smart TV and my external dongle? Couldn’t they somehow route the audio from the TV microphone back out to the Internet through hacked firmware in the dongles?”

The answer is theoretically yes, but it’s a significantly tougher lift for a number of technical reasons. The solution even for that scenario, though, is simple — kill the power to the dongle when you’re not using it.

Unplug it from the TV’s USB jack if you’re powering it that way. (If you’re truly paranoid, you might consider the possibility that the hacked TV firmware is still supplying power to the dongle even when the set is supposedly off, and that the dongle has been hacked not to light its power LED in that situation, eh?)

But if you’re powering the dongle from a wall adapter, and you unplug that, you’ve pretty much ended that ballgame.


Google’s New “YouTube TV” Is a Gift to Donald Trump

As if it weren’t bad enough that so many high-ranking Google search results were hijacked by criminals monetizing false news stories toward getting Donald Trump elected, it appears that (for the moment at least) Google’s new “YouTube TV” offering is a gift package for serial lying sociopath Donald Trump and his vile supporters.

YouTube TV is Google’s newly announced attempt to push cable “cord cutting” — that is, encouraging people to drop their conventional cable or satellite TV subscriptions and switch to viewing Internet-delivered streams.

The YouTube TV offering seems fairly conventional at first glance, and Google has tossed in useful features like multiple user profiles and free time-shifting/DVR capabilities.

But a glaring omission from the channel lineup makes YouTube TV a massive prize package for Donald Trump and his fascist agenda — FOX “News” is included in the lineup, but CNN is nowhere to be found. Go ahead, try to find it. I sure can’t.

It appears that Google is hoping that viewers will accept MSNBC as a substitute for CNN — but that’s ridiculous in the extreme. Not including CNN is giving FOX “News” an enormous boost, and those right-wing News Corp. bastards have already done enough damage to this country without Google giving FOX and Trump this additional big wet kiss squarely on their rotting lips.

No doubt Google will say that they couldn’t reach a licensing agreement with CNN/Time Warner, and golly gee, we hope to add them to the lineup soon.

To hell with that. How long will it be before FOX and Trump are ranting claims that Google chose FOX “News” because Google doesn’t trust CNN? Launching this service with FOX “News” included but CNN absent is the height of irresponsibility, especially in today’s political environment.

Shame on you, Google. Shame on you.