A Proposal to Google: How to Stop Evil from Trending on YouTube


Late last year, in the wake of the Las Vegas shooting tragedy (I know, keeping track of USA mass shootings is increasingly difficult when they're increasingly common), I suggested in general terms some ways that YouTube could avoid spreading disinformation and false conspiracy theories after these kinds of events:

“Vegas Shooting Horror: Fixing YouTube’s Continuing Fake News Problem” – https://lauren.vortex.com/2017/10/05/vegas-horror-fixing-youtube-fake-news

I’ve also expressed concerns that YouTube’s current general user interface does not encourage reporting of hate or other abusive videos:

“How YouTube’s User Interface Helps Perpetuate Hate Speech” – https://lauren.vortex.com/2017/03/26/how-youtubes-user-interface-helps-perpetuate-hate-speech

Now, here we are again. Another mass shooting. Another spew of abusive, dangerous, hateful, and false conspiracy videos on YouTube that clearly violate YouTube's Terms of Use, yet still managed to push high onto YouTube's trending lists — this time aimed at vulnerable student survivors of the Florida high school tragedy of about a week ago.

Google has stated that one of the worst of these reached top trending status because of an automated misclassification: an embedded news report "tricked" YouTube's algorithms into treating the entire video as legitimate.

No algorithms are perfect, and YouTube's scale is immense. But this all raises the question — would a trained human observer have made the same mistake?

No. It’s very unlikely that a human who had been trained to evaluate video content would have been fooled by such an embedding technique.

Of course, as soon as anyone mentions "humans" in relation to analysis of YouTube videos, various questions of scale pop immediately into focus. Hundreds of hours of content are uploaded to YouTube every minute. YouTube's scope is global, so this data firehose includes videos concerning pretty much any conceivable topic in a vast array of languages.

Yet Google is not without major resources in this regard. They've publicly noted that they have significantly sized teams to review videos that have been flagged by users as potentially abusive, and have announced that they are in the process of expanding those teams.

Still, the emphasis to date has seemed to be on removing abusive videos “after the fact” — often after they’ve already quickly achieved enormous view counts and done significant damage to victims.

A more proactive approach is called for.

One factor to keep in mind is that while very large numbers of videos are continuously pouring into YouTube, the vast majority of these will never rapidly accumulate large view counts. These make up the massive "long tail" of YouTube videos.

Conversely, at any given time only a relative handful of videos are trending “viral” and accumulating large numbers of views in very short periods of time.

While any and all abusive videos are of concern, as a practical matter we need to direct most of our attention to those trending videos that can do the most damage the most quickly.  We must not permit the long tail of less viewed videos to distract us from promptly dealing with abusive videos that are currently being seen by huge and rapidly escalating numbers of viewers.
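The prioritization argument above — focus human attention on the fast-rising handful, not the long tail — can be sketched as a simple triage queue. This is purely illustrative; the threshold, sampling interval, and all function names (`view_velocity`, `build_review_queue`) are my own invented stand-ins, not anything YouTube actually uses:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ReviewCandidate:
    # Store negated velocity so the fastest-rising video pops first
    # from Python's min-heap.
    neg_velocity: float
    video_id: str = field(compare=False)

def view_velocity(views_now: int, views_then: int, interval_secs: float) -> float:
    """Views gained per minute over the sampling interval."""
    return (views_now - views_then) / (interval_secs / 60.0)

def build_review_queue(samples):
    """samples: iterable of (video_id, views_then, views_now, interval_secs)."""
    heap = []
    for video_id, then, now, secs in samples:
        heapq.heappush(heap, ReviewCandidate(-view_velocity(now, then, secs), video_id))
    return heap

# The long tail (slow-growing videos) sinks; the viral handful surfaces first.
queue = build_review_queue([
    ("cat-video", 1_000, 1_050, 600),     # ~5 views/min
    ("conspiracy", 2_000, 152_000, 600),  # ~15,000 views/min
    ("vlog", 300, 320, 600),              # ~2 views/min
])
first = heapq.heappop(queue)
print(first.video_id)  # → conspiracy
```

The point of the sketch is simply that the review workload is dominated by a tiny, identifiable subset: ordering by view velocity keeps human reviewers pointed at the videos doing damage right now.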

YouTube employees need to be more deeply “in the loop” to curate trending lists much earlier in the process.

As soon as a video goes sufficiently viral to technically “qualify” for a trending list, it should be immediately pushed to humans — to the YouTube abuse team — for analysis before the video is permitted to actually “surface” on any of those lists.

If the video isn't abusive or otherwise in violation of YouTube's rules, onto the trending list it goes, and it likely won't need further attention from the team. But if it is in violation, the YouTube team would proactively block it from ever reaching trending, and would take other actions as appropriate (which could include removal from YouTube entirely, strikes or other actions against the uploading account, and so on).
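The hold-then-review gate described above reduces to a small decision function. Again, this is a sketch under assumptions: the qualification threshold is invented, and `human_review` stands in for whatever interface the abuse team would actually use:

```python
from enum import Enum, auto

class Verdict(Enum):
    APPROVE = auto()
    BLOCK = auto()

TRENDING_THRESHOLD = 10_000  # hypothetical views/hour needed to qualify

def gate_for_trending(video, views_per_hour, human_review, trending_list):
    """Hold any trending-qualified video until a human reviewer clears it."""
    if views_per_hour < TRENDING_THRESHOLD:
        return "not-trending"          # long tail: no gating needed
    verdict = human_review(video)      # blocking call to the abuse team
    if verdict is Verdict.APPROVE:
        trending_list.append(video)    # surfaces only after human sign-off
        return "trending"
    return "blocked"                   # never appears on any trending list

# Usage with a stand-in reviewer that blocks anything flagged "conspiracy":
trending = []
reviewer = lambda v: Verdict.BLOCK if "conspiracy" in v else Verdict.APPROVE
print(gate_for_trending("cute-cat-clip", 50_000, reviewer, trending))  # → trending
print(gate_for_trending("conspiracy-01", 80_000, reviewer, trending))  # → blocked
print(trending)  # → ['cute-cat-clip']
```

The key design point is that approval is the gate, not removal after the fact: an abusive video in this flow never reaches the trending list at all, rather than being pulled down hours later.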

There is simply no good reason today for horrifically abusive videos to appear on YouTube trending lists, much less persist on those lists for hours or even rise to top positions — giving them enormous audiences and potentially doing serious harm.

Yes, fixing this will be significant work.

Yes, this won’t be cheap to do.

And yes, I believe that Google has the capabilities to accomplish this task.

The dismal alternative is the specter of knee-jerk, politically motivated censorship of YouTube by governments, actions that could effectively destroy much of what makes YouTube a true wonder of the world, and one of my all-time favorite sites on the Internet.

–Lauren–
