Apple Backdoors Itself

UPDATE (September 3, 2021): Apple has now announced that “based on feedback” they are delaying the launch of this project to “collect input and make improvements” before release.

– – –

Apple’s newly revealed plan to scan users’ Apple devices for photos and messages related to child abuse is actually fairly easy to explain from a high-level technical standpoint.

Apple has abandoned their “end-to-end” encrypted messaging promises. They’re gone. Poof! Flushed down the john. Because a communication system that is supposedly end-to-end encrypted, but has a backdoor built into user devices, is like being sold a beautiful car and discovering after the fact that it doesn’t have an engine. It’s fraudulent.
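To see why device-side scanning hollows out the “end-to-end” promise, consider this deliberately simplified sketch. It is not Apple’s actual design (the real system uses perceptual “NeuralHash” values and a private set-intersection protocol, and the blocklist, cipher, and function names here are all hypothetical, illustrative stand-ins): the point is only that the plaintext is inspected on the device *before* encryption, so the strength of the cipher afterward no longer matters.

```python
import hashlib

# Hypothetical blocklist of known-bad content hashes (illustrative only).
BLOCKLIST = {hashlib.sha256(b"known-bad-image").hexdigest()}

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for the real end-to-end encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(content: bytes, key: bytes) -> tuple[bytes, bool]:
    # The crux: the device examines the plaintext BEFORE it is encrypted.
    # A match can be reported to a third party regardless of how strong
    # the encryption applied afterward is.
    flagged = hashlib.sha256(content).hexdigest() in BLOCKLIST
    return toy_encrypt(content, key), flagged

ciphertext, flagged = send_message(b"known-bad-image", b"secret-key")
print(flagged)  # → True: the scan fired even though the ciphertext is opaque
```

The ciphertext itself remains unreadable to outsiders, which is exactly what makes the arrangement misleading: the encryption is real, but it is applied only after a reporting mechanism has already looked inside.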

The depth of Apple’s betrayal of its users lies not specifically in the context of dealing with child abuse (which we all agree is a very important issue indeed) but in the fact that by building any kind of backdoor mechanism into their devices, they’ve opened the legal door for courts and other government entities around the world to make ever broader demands for secret, remote access to the data on your Apple phones and other devices. And even if you trust your government with such power today, imagine what a future government in which you have less faith might do.

In essence, Apple has given away the game. It’s as if you went into a hospital to have your appendix removed, and when you awoke you learned that they also removed one of your kidneys and an eye. Surprise!

There is no general requirement that Apple (or other firms) provide end-to-end crypto in their products. But Apple has routinely proclaimed itself to be a bastion of users’ privacy, while simultaneously being highly critical of various other major firms’ privacy practices. 

That’s all just history now, a popped balloon. Apple hasn’t only jumped the shark; they’ve fallen into the water and are sinking like a stone to the bottom.

–Lauren–