YouTube’s Excellent New Moves Against Hate Speech — But There’s More Work for Google to Do

In my March blog posts — “How YouTube’s User Interface Helps Perpetuate Hate Speech” (https://lauren.vortex.com/2017/03/26/how-youtubes-user-interface-helps-perpetuate-hate-speech) and “What Google Needs to Do About YouTube Hate Speech” (https://lauren.vortex.com/2017/03/23/what-google-needs-to-do-about-youtube-hate-speech) — I was quite critical of how Google was handling certain aspects of its own Terms of Service enforcement on YouTube.

In “Four steps we’re taking today to fight online terror” (https://blog.google/topics/google-europe/four-steps-were-taking-today-fight-online-terror/), Google’s General Counsel Kent Walker (a straight-arrow guy whom it’s been my pleasure to meet) announced YouTube changes aimed at dealing more effectively with extremist videos and hate speech more broadly.

Key aspects of these changes appear to be in line with my public suggestions — in particular, faster takedowns for extremist content, and disqualification of hate speech videos from monetization and “suggested video” systems, are excellent steps forward.

I would prefer that hate speech videos not only be demonetized and hidden from suggestions, but removed from the YouTube platform entirely. I am not at this point fully convinced that sweeping that kind of rot “under the carpet” — where it can continue to fester — is a practical long-term solution. However, we shall see. I will be watching with interest to determine how these policies play out in practice.

As I’ve noted in earlier posts, I also feel strongly that Google needs to make it more “in your face” obvious to YouTube users that they can report offending videos. I had previously mentioned that the YouTube “Report” link — which years ago appeared on the top-level YouTube user interface — seemed to have returned to that position (at least for YouTube Red subscribers) after a long period of being buried under the top-level “More” link. At the time, I speculated that this might be only an ephemeral user-facing experiment, and in fact, for me at least, the “Report” link is once again hiding under the “More” link.

I’ve discussed this problem before. Users who might otherwise report an offending video are much less likely to do so if a “Report” link isn’t readily visible. I acknowledge that one possible reason for “hiding” the “Report” link is concern about false positives. Indeed, in Kent’s post today, he mentions the high accuracy of YouTube “Trusted Flaggers” — which suggests that my speculation in this regard (about the reliability of reports from users overall) was likely correct. In any case, I still feel that a top-level user interface “Report” link is a very important element for consideration.

While I do feel that there’s more Google needs to do on various of these fronts concerning extremist content and hate speech, I am indeed cognizant of their understandable desire to move in carefully calibrated steps, given the impact of any changes at Google scale. And yeah, I’m pleased to see Google moving on these issues in the overall direction that I’ve been publicly urging.

My kudos to the associated Google/YouTube teams — and we’ll all be watching to see how these changes play out in the fullness of time.

Be seeing you.

–Lauren–