September 23, 2012

How to Destroy YouTube

World events are continuing to keep Google's YouTube in the news, as a maelstrom of arguments about free speech vs. censorship spins around the vortex of continuing Mideast unrest.

I have noted previously that I am a tremendous fan of YouTube, even while I've expressed concerns about various complex issues that Google faces in keeping YouTube running as a global service.

Recently, in Saving YouTube from the Choking Web of Rights, I discussed YouTube matters associated with copyright and Content ID, and in YouTube Blocking the Anti-Islamic Video: Censorship or Responsible Stewardship?, I explicitly endorsed YouTube's selective and limited blocking of a notorious anti-Islamic video.

Controversies surrounding that video and YouTube are continuing unabated, however.

The New York Times yesterday, in Free Speech in the Age of YouTube, discussed Columbia professor Tim Wu's recent proposal for a sort of "external oversight board" to make tough decisions about controversial YouTube video takedown situations.

I have great respect for Tim, and we both agree that Google was correct in their targeted blocking of the anti-Islamic video in the current case.

And to his credit, Tim lays out in his proposal some of the reasons why such a plan might not succeed.

Unfortunately, I feel forced to go further down this latter path, and suggest that the proposal -- if implemented -- would not only fail, but also could make the situation regarding YouTube and censorship questions far more complex and problematic, potentially leaving us in a much worse place than where we started.

One reason for this should be pretty obvious. An external group making such decisions likely wouldn't have any significant "skin in the game" in a legal sense. They could make whatever decisions they wanted, with deference to free speech and high ideals, yet Google would ultimately be the party legally vulnerable to any negative results of those decisions.

It's difficult in the extreme to imagine Google being willing to cede such decision-making authority to an outside group. I'd certainly be unwilling to do so if I were in Google's position.

Even if such a group operated only in an advisory capacity, so long as its decisions were officially made public (or leaked publicly), there would be enormous pressure on Google to conform to such decisions, even when internal Google data and knowledge indicated that this would be unwise for any number of reasons not publicly known.

The logistics of Tim's proposal also seem largely unworkable. He acknowledges that (for example) trying to herd YouTube users into some sort of advisory capacity could be difficult, but invokes Wikipedia as an example of a possible model.

Though I admire much of what Wikipedia has accomplished, it seems like a very poor model for a YouTube advisory system. Wikipedia is often accused -- with considerable merit -- of not taking responsibility for inaccuracies in its materials. Battles over edits and changes are legendary, even while all manner of known errors persist throughout its corpus. Political and commercial battles are fought by proxy among Wikipedia editors, usually by anonymous parties of unknown expertise operating under pseudonyms.

This is not the kind of structure appropriate for making decisions about YouTube videos in the sort of charged atmosphere we're dealing with today, with potential immediate and long-term real-world consequences for lives, property, and more.

The proper venue for these decisions is internally at Google. They are in the best position to determine how to react to these situations, including demands from governments, other organizations, and individuals. Google ultimately is the party with any legal exposure from these decisions, so the decisions must be theirs.

If Google feels it appropriate to privately reach out to external experts and observers for input relating to these situations, all well and good. Perhaps they're already doing that. But this can't reasonably take place in the hothouse of the public sphere, where every comment or speculation will likely trigger endless arguing, exploitation, and trolling that will only inflame passions, not lead toward reasoned decisions.

Personally, I have faith that Google will do their best to weigh the many involved factors honestly, and make the best possible decisions in an area where we are faced almost entirely with shades of gray, rarely with black and white certainties. Not only does YouTube's overall trajectory over time suggest this path, but it's also in Google's own best interests to navigate this course with all possible diligence and care.

And frankly, I believe that any moves toward external decision-making in these regards would inevitably open the door to a slippery slope of outside pressures that would ultimately take aim at effective control of more than just YouTube, attempting to gain strangleholds on search results and other affiliated services as well -- a nightmare scenario we must avoid at all costs.

That's my opinion, anyway.


Posted by Lauren at September 23, 2012 11:14 AM | Permalink
Twitter: @laurenweinstein
Google+: Lauren Weinstein