Greetings. Here we go. The FCC is about to release a report to Congress suggesting that lawmakers enact legislation controlling "violent content" -- not only on broadcast television but apparently on "basic cable" channels as well. (See this Washington Post article.)
The legal battles that would certainly result will be breathtaking, and it seems rather likely that calls to extend such regulations to the Internet will follow. If you think that defining obscenity is tough, just wait until we see people of all stripes arguing about the relative violence of a stick of TNT blowing Daffy Duck's bill around his head.
That's Not All, Folks! More after the report is released ...
Greetings. As search engines, ISPs, and other Internet-connected enterprises collect and archive increasing amounts of data on our activities, it is inevitable that governments and other entities will come to view that data as an increasingly valuable aid in accomplishing their particular agendas, for good or ill.
In most cases this data is kept voluntarily by the various services, though governments around the world are rapidly pushing moves toward mandated data retention, mainly in the name of law enforcement.
Often people don't have a clue as to how much information is being kept regarding their search and browsing activities. Google's announcement of a new "Web History" feature -- allowing Google users to track not only their Google searches but also most of their other Web browsing activities -- will likely have the benefit of clearly demonstrating to people just how much data really is being collected, bringing this out into the open, as it were.
A quick digression regarding Google Web History. I've received some alarmed queries about it, but in my view it doesn't actually change the level of information being collected by Google, rather it makes some of that data available to users for the first time.
Note that non-Google Web browsing data would only be sent to Google in the case of Google Toolbar users with the PageRank feature turned on. This has been a standard (documented by Google in their privacy statement) feature of Toolbar for quite some time -- the PageRank toolbar feature couldn't work without it.
We're seeing ever more cases of governments demanding access to the data collected by search engines and ISPs. DOJ vs. Google was one celebrated case, and while Google fought on various grounds, it ended up having to turn over considerable data. Google, Yahoo, and others have all been in the spotlight for providing foreign governments with data on particular users. In the last few days, Yahoo has been sued by the family of a Chinese man who has been imprisoned for years based on data that Yahoo turned over. The suit asserts that Yahoo is complicit in his arrest and alleged torture.
Ultimately, all of these organizations make a demonstrably true statement to explain their actions -- "We must obey the laws in the countries where we operate." Absolutely correct. No question about it. You want to play ball with somebody else's ball, you play by their rules.
But in the Yahoo case, an additional comment by their spokesman caught my attention. He noted that they simply hand over data when ordered, they don't know what it's used for, and usually never hear about it directly again. In other words, they simply obey orders.
That statement might have perhaps slid by if I hadn't recently been watching the fine old 1961 film "Judgment at Nuremberg" -- where characters on trial used almost the same words to describe their actions and rationalizations in a different context involving arrests, torture, and worse.
This is not to directly compare the current situation with corporate complicity in Nazi Germany, but only to point out that actions have consequences, and those consequences can result in suffering, pain, and even death in some parts of the world, a universe away from our glowing screens and merrily typing fingers.
Which leads us inevitably to The Questions. When do we cross the threshold beyond which it is ethically inappropriate to "play ball" in certain locales by rules that can have repugnant effects on individual lives, despite our services bringing very significant benefits to large populations in those areas? At what stage should "business as usual" take a back seat to these ethical concerns?
These are not simple questions; the calculus of ethics is not always straightforward in the modern world, as much as we'd like to think otherwise.
But I believe that we're rapidly reaching a point in the development of the Internet where such questions must be addressed, and it's inevitable that they will be -- either by the involved firms themselves, by government legislative and other actions, or both.
When it comes to collecting and turning over data that can result in real harm to real people, "We were just following orders" -- even as the admitted cost of doing business -- seems unlikely to be a tenable response for much longer.
I invite broad discussion and dialogue on these questions. Humanity -- and the continued flowering of the Internet and its wonders -- will depend on the answers.
Greetings. In the wake of the tragedy at Virginia Tech, I'm seeing numerous speculative news reports suggesting that mass cell phone text messaging should have been used to warn and inform the VT population about what was going on.
As I've discussed before in other venues, the current text messaging (SMS) infrastructure has relatively low capacity, and can be easily overloaded in ways that could not only delay the delivery of such immediate action warnings, but also could disrupt cellular voice communications as well, just when voice calling capability may be most needed.
An important technical paper that discusses SMS overloading issues in detail is very much worth reading.
Greetings. I've never been a fan of John Donald "Don" Imus, Jr., nor of most other "shock jocks" who have paralleled his radio career in various ways -- some of whom have taken their own (not always permanent) falls from the airwaves. I personally find the sort of baiting that is so frequently the staple of their programs to be distasteful, a coarsening of dialogue that pushes society ever farther into yelling at each other rather than having reasonable discussions.
Be that as it may, I find Imus' own sudden fall to be particularly disturbing as a possible harbinger of major Internet risks to come, particularly affecting free speech.
Broadcasting giants like NBC and CBS are of course free to make whatever legal business decisions they see fit, however hypocritical they may seem in this case, given the continuing flow of hateful music lyrics that bring in the big bucks and cause Imus' now notorious comment to pale by comparison.
So was it really concern about ethics that drove the networks' decisions, or rather a much more straightforward business calculation about threatened boycotts driven by solemn-sounding speeches by the self-proclaimed "masters of morality" -- loudly demanding advertiser and even FCC action? That some of the most visible personages calling for Imus' blood have themselves closets full of hateful skeletons didn't seem to matter as the 24-hour news cycle ramped up the volume.
But this is all but a single note in a much more nightmarish tune that is starting to take shape. The morality guardians are pushing the envelope in every direction, and even well-meaning beginnings could easily turn toward theocratic oppression, particularly where the Internet is concerned.
Farfetched? Let's keep recent history in mind. We've seen forces inside and outside of government pushing DOJ and the FCC for ever broader regulation of "dirty words" and "wardrobe malfunctions" -- working hard to "protect" us from the evils of sexual thoughts, while the blood continues to flow freely in Iraq. Indeed, there are many ways to define obscenity.
But we're also hearing calls to expand regulation beyond obscenity to include "racist," "hateful," or "sexist" speech, and not just on the broadcast airwaves but on cable and satellite as well.
Once we've made that leap -- or even before -- the Internet will be even more directly in the crosshairs, and the most obvious targets will be the biggest ones -- the video sites like Google's YouTube, and even the search engine and caching functions of Google itself and similar competing operations. Even arguing that one is merely organizing information in the case of search engines will never satisfy those who would attempt to impose ever broader censorship in the name of popular morality.
Keep in mind that the Child Online Protection Act (COPA) is still bouncing around in the courts, and may someday bounce out into the real world with a devastating dumbing down and eviscerating of many U.S. Internet sites, at least those operating openly.
This goes far beyond sex and morality wars. We're seeing increasing cases of government-imposed censorship attempts, sometimes of an international reach. Thailand and Turkey are two very recent examples, and "offensive" YouTube materials are now becoming a popular target of removal demands from national governments, based on their local sensibilities. Attempts to block Internet materials of course involve other countries as well, most notably China, despite the inefficacy of such efforts in the long run, given the mutability of the Internet and the ease with which underground sites can be established.
But that's not to say that this rush toward censorship won't be incredibly disruptive even as it leaks profusely, as various targets will be chosen for prosecutorial or other damaging attention to "be made examples of" for the education of the masses.
Perhaps we've seen the first clear inkling of how that might come to pass as we review the case of Imus, for in his fall we clearly see the forces of censorship girding their loins for action, and nailing Imus, for all his faults and distasteful comments over the years, to a cross that was hastily erected in opportunistic glee.
We can be sure that there's plenty more wood and nails being collected and made ready, so long as we allow ourselves to be bullied by those who would disintegrate ever more aspects of our precious free speech, in the name of their own perceived righteousness.
Greetings. I've been getting a bunch of requests for opinions on the "Blogger's Code of Conduct" being discussed over on Tim O'Reilly's blog.
I don't consider myself to be a high-power blogger. I started my blog back in late 2003 at the urging of Joi Ito, and I post to it sporadically -- only when I feel that I have something hopefully useful to say. I've never allowed comments on my blog, and I disabled trackbacks as soon as they began to be abused. My discussion mailing lists and forums have always been moderated, since day one.
Old-timers who have long known me may recall my predictions decades ago regarding unmoderated communications media, originally in the context of the Usenet network. In a nutshell (no pun intended, Tim!) I've long expected that as access to unmoderated "broadcasting" technologies became available to the public at large, we'd see that the fraction of a percent of "bad actors" would wield asymmetric power to damage and disrupt the good intentions of everyone else.
That's not rocket science -- it's just human nature and the power of these technologies at work.
Actually, it's a twin-pronged issue that also relates to (inevitably doomed) attempts to censor Internet content. This is the case whether we're talking about U.S. laws ostensibly meant to protect children against "objectionable" material, or countries like China, Turkey, and Thailand (the latter two especially in the news right now in this context) whose governments believe -- in error -- that they can control what their populations will access on the Net.
But as far as blogs and other public Internet forums are concerned, as long as unmoderated submissions are permitted -- even with promises of pulling down objectionable submissions as rapidly as possible -- the asymmetric power to damage those forums and related persons/organizations will remain.
Of course, if folks want to run totally unmoderated environments I believe that's their choice, but I do feel that they should be held responsible for the results, and the operators of such systems should usually not be able to hide behind a cloak of anonymity in cases of abuse. I say "usually" because there will always be exceptional cases, especially in critical human rights arenas. But in general, if you create the forum, my view is that simply declaring it unmoderated doesn't buy you some sort of exemption from responsibility.
It's not clear to me that requiring submitters of blog comments to include even "valid" e-mail addresses would accomplish much, given the ease of creating and disposing of throwaway addresses. In many contexts, an anonymous comment (if approved by a moderator) could be just as valuable as a "fully attributed" one -- though this will vary with the type of material under discussion.
Likewise, I'm not enthusiastic about "badge" systems to identify sites (feel free to insert obligatory line from "Treasure of the Sierra Madre" in this space). Most such badges are too easy to abuse or misuse, and in some cases are an invitation to ultimately failed (but still a hassle) imposed content filtering.
My bottom line on all this is actually pretty much what it was way back in Usenet days. In the long run, human moderation systems represent the best approach I know of for avoiding the sorts of problems under discussion. To the extent that they can be successfully scaled, such moderation systems are also likely to be among the key solutions to a range of intellectual property abuse dilemmas on the Internet.
If human nature were different, we might not be facing such choices. But human nature ain't about to change anytime soon, that's for sure.