Link
An extraordinarily clear and understandable post by Chris Coyne that explains exactly what's wrong with the idea that by protecting our data, Apple (and Google, and other service providers) are only serving to protect the guilty. In fact, they're protecting us all, and in many ways.
Beyond all the technical considerations, there is a sea change in what we are digitizing.
We whisper “I love you” through the cloud. We have pictures of our kids in the bathtub. Our teens are sexting. We fight with our friends. We talk shit about the government. We embarrass ourselves. We watch our babies on cloud cameras. We take pictures of our funny moles. We ask Google things we might not even ask our doctor.
Even our passing thoughts and fears are going onto our devices.
Time was, all these things we said in passing were ephemeral. We could conveniently pretend to forget. Or actually forget. Thanks to the way our lives have changed, we no longer have that option.
This phenomenon is accelerating. In 10 years, our glasses may see what we see, hear what we hear. Our watches and implants and security systems of tomorrow may know when we have fevers, when we're stressed out, when our hearts are pounding, when we have sex and - wow - who's in the room with us, and who's on top and what direction they're facing*. Google and Apple and their successors will host all this data.
We're not talking about documents anymore: we're talking about everything.
You should be allowed to forget some of it. And to protect it from all the dangers mentioned above.
As I increasingly use my various devices as an outboard brain (which I do, a lot), I need things to be ephemeral. I need to be able to tell my outboard brain to forget stuff with only slightly more difficulty than my real brain forgets stuff. And I want to know that, e.g., the NSA isn't creeping on stuff I've already forgotten.