Why YouTube’s New Plan to Debunk Conspiracy Videos Won’t Work


YouTube continues to try to figure out ways to battle false conspiracy videos that rank highly on the site, sometimes even reaching the top trending lists, and that can spread to ever more viewers via YouTube's own "recommended videos" system. I've offered a number of suggestions for dealing with these issues, most recently in "Solving YouTube's Abusive Content Problems — via Crowdsourcing" (https://lauren.vortex.com/2018/03/11/solving-youtubes-abusive-content-problems-via-crowdsourcing).

YouTube has now announced a new initiative that they're calling "information cues," which they hope will address some of these problems.

Unfortunately, this particular effort (at least as it's being reported today) appears doomed to be almost entirely ineffective.

The idea of "information cues" is to accompany false conspiracy videos on YouTube with links to Wikipedia pages that "debunk" those conspiracies. So, for example, a video claiming that the Florida school shooting victims were actually "crisis actors" would presumably display a link to a Wikipedia page explaining why this wasn't actually the case.

You probably already see the problems with this approach.

We'll start with the obvious elephant in the room. The kinds of viewers who are inclined to believe these false conspiracy videos are almost certainly going to insist that the associated Wikipedia articles are wrong, that they're planted lies. FAKE NEWS!

Do we really believe that anyone who would give such videos even an inch of credibility is going to be convinced otherwise by Wikipedia pages? C'mon! If anything, such Wikipedia pages may actually serve to reinforce these viewers' beliefs in the original false conspiracy videos!

Not helping matters at all is the fact that Wikipedia's reputation for accuracy, never all that good to begin with, has been plunging in recent years, sometimes resulting in embarrassing Knowledge Panel errors in Google's search results.

Any Wikipedia page that is not "protected" (that is, any page where the ordinary editing process has not been locked out) is subject to endlessly mutating edit wars, and you can bet that any editable Wikipedia page linked by YouTube from a false conspiracy video would immediately become a high-visibility target for such attacks.

If there's one thing that research in this area has already shown quite conclusively, it's that the people who believe these kinds of garbage conspiracy theories are almost entirely unmoved by factual information that conflicts with their preexisting points of view.

The key to avoiding the contamination caused by these vile, lying, false conspiracy videos is to minimize their visibility in the YouTube/Google ecosystem in the first place.

Not only should they be prevented from ever reaching the trending lists, but they should also be deranked, demonetized, and excised from YouTube's recommended video system. They should be immediately removed from YouTube entirely if they contain specific attacks against individuals or other violations of the YouTube Terms of Service and/or Community Guidelines. These actions must be taken as rapidly as possible, with appropriate due diligence, before these videos are able to do even more damage to innocent parties.

Nothing less can keep such disgusting poison from spreading.

–Lauren–