Social Media Is Probably Doomed

UPDATE (31 December 2022): 2023 and Social Media’s Winds of Change

– – – – – –

Social media as we’ve known it is probably doomed. Whether a decline in social media would on balance be good or bad for society I’ll leave to another discussion, but the handwriting is on the wall for a major decline overall.

As with most predictions, the timing and other details will surface in coming months and years, but the overall shape of things to come is not terribly difficult to visualize.

The fundamental problem is also clear enough. A vast range of entities at state, federal, and international levels are in the process of enacting, invoking, or otherwise planning a range of regulatory and other legal mandates that would apply to social media firms — with many of these requirements being in direct and total opposition to each other.

The most likely outcome of putting these firms “between a rock and a hard place” will be a drastic reduction in the social media services provided, resulting in a massive decrease in ordinary persons’ ability to communicate publicly, rather than the increase that various social media critics have been anticipating.

Let’s very briefly review just some of the factors in the mix:

The political Right in the U.S. generally wants public postings to stay up, even if they contain racist or other hate speech or misinformation/disinformation. This is the thrust of the push from states like Texas and Florida. Meanwhile, the Left, and states like California, want more of the same sort of postings taken down even faster than they are now. Unless you can somehow provide different feeds on a posting-by-posting basis to users in different states (and what of VPN usage from other areas?), this creates an impossible situation.

Both the Left and Right hate Section 230, but for opposite reasons, relating to my point just above. Even the Biden White House has this wrong, arguing that cutting back 230 protections would force social media firms to more tightly moderate content, when in reality tampering with 230, which shields platforms from liability for what their users post, would make hosting most UGC (User Generated Content) far too legally risky.

Elon Musk has proposed that Twitter carry any postings that aren’t explicitly illegal and don’t condone violence. This suggests an increase in the kind of hate speech and disinformation that not only drives away many users, but also tends to cause enormous problems for potential advertisers and network infrastructure providers, who usually do not want to be associated with such materials. And then of course there’s the EU — which has its own requirements (much more robust than in the U.S.) for dealing with hate speech and misinformation/disinformation.

There are calls to strip Internet users of all anonymity and to require the use of real names (tied to official IDs, perhaps through some third-party mechanisms), based on the theory that this would reduce hate speech and other attack speech. Yet studies have shown that such abhorrent speech continues to flourish even when real names are used, while forcing real names causes already marginalized persons and groups to be even further disadvantaged, often in dangerous ways. Is there a middle ground on this? Perhaps requiring that identities be known to a third party (in case of abuse) before posting to large numbers of persons is permitted, while still allowing the use of pseudonyms for those postings? Maybe, but it seems like a long shot.

Concerns over posting of terrorist content, live streaming of shootings, and other nightmarish postings have increased calls for pre-moderation of content before it goes public. But at the massive scale of the large social media firms, it’s impossible to see how this could be practical, for a whole range of reasons, unless the amount of content permitted from the public were drastically reduced.

And this is just a partial list. 

Social media can’t retain any real value or practicality when every state and every country may demand a different and conflicting set of rules. While there are certainly some politicians and leaders who do understand these issues in considerable depth, many others don’t worry about whether their technical demands are practical or what the collateral damage would be, only whether they’re good for votes come the next election.

And now we reach that part of this little essay where I’m expected to announce my preferred solution to this set of problems. Well, dear readers, I’ve got nothing for you. I don’t see any practical solutions for these dilemmas. The issues are in direct conflict and opposition, and there is no obvious route toward their reconciliation or harmonization.

So I can do little more here than push the needle into the red zone, sound the storm warnings, and try to point out that the paths we’re taking — absent some almost unimaginable changes in the current patterns — are rocketing us rapidly toward a world of social media that will likely briefly flare brightly and then go dark, like an incandescent light bulb at the end of its life, turned on just one too many times.

This analogy isn’t perfect of course, and there will continue to be some forms of social media under any circumstances. But the expected experience seems most likely to become increasingly constrained over time, along with all other aspects of publicly accessible user-provided materials — the incredible shrinking content.

As I said earlier, nobody knows how long this process will take. It won’t happen overnight. But we’ll have taken the path into this wilderness of our own free will, eyes wide open.

Please don’t forget to turn off the lights on your way out.

–Lauren–