Apple takes a (cautious) stand against opening a killer’s iPhones

Inflammatory headline aside, this New York Times piece is chock-full of interesting quotes:

Executives at Apple have been surprised by the case’s quick escalation, said people familiar with the company who were not authorized to speak publicly. And there is frustration and skepticism among some on the Apple team working on the issue that the Justice Department hasn’t spent enough time trying to get into the iPhones with third-party tools, said one person with knowledge of the matter.

And:

The stakes are high for Mr. Cook, who has built an unusual alliance with President Trump that has helped Apple largely avoid damaging tariffs in the trade war with China. That relationship will now be tested as Mr. Cook confronts Mr. Barr, one of the president’s closest allies.

And:

At the heart of the tussle is a debate between Apple and the government over whether security or privacy trumps the other. Apple has said it chooses not to build a “backdoor” way for governments to get into iPhones and to bypass encryption because that would create a slippery slope that could damage people’s privacy.

And:

Bruce Sewell, Apple’s former general counsel who helped lead the company’s response in the San Bernardino case, said in an interview last year that Mr. Cook had staked his reputation on the stance. Had Apple’s board not agreed with the position, Mr. Cook was prepared to resign, Mr. Sewell said.

And:

Mr. Cook has made privacy one of Apple’s core values. That has set Apple apart from tech giants like Facebook and Google, which have faced scrutiny for vacuuming up people’s data to sell ads.

“It’s brilliant marketing,” Scott Galloway, a New York University marketing professor who has written a book on the tech giants, said of Apple. “They’re so concerned with your privacy that they’re willing to wave the finger at the F.B.I.”

And:

A Justice Department spokeswoman said in an email: “Apple designed these phones and implemented their encryption. It’s a simple, ‘front-door’ request: Will Apple help us get into the shooter’s phones or not?”

This is a giant issue. I don’t think there’s any way to create a master encryption key that won’t eventually leak or be stolen.

If such a key were created, is there any case so important that it would justify the risk of putting that key in the hands of the world at large? To me, that’s the heart of the dilemma.
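
To make the risk concrete, here’s a minimal sketch (purely hypothetical, and nothing to do with Apple’s actual design) of how a “master key” escrow scheme works in the abstract: every device’s data key is also wrapped under a single escrow key, so whoever holds that one private key can unlock every device.

```python
# Hypothetical key-escrow sketch -- NOT Apple's design. Each device's
# random data key is also encrypted ("escrowed") under one master
# public key. A single leak or theft of the matching private key
# unlocks every device ever provisioned.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# One master key pair for the entire fleet: the single point of failure.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

def provision_device():
    """Generate a per-device data key and an escrowed copy of it."""
    data_key = os.urandom(32)  # key that actually encrypts the device's data
    escrowed_copy = escrow_public.encrypt(data_key, oaep)
    return data_key, escrowed_copy

# Each device gets its own data key...
devices = [provision_device() for _ in range(3)]

# ...but the one escrow private key recovers all of them.
for data_key, escrowed_copy in devices:
    assert escrow_private.decrypt(escrowed_copy, oaep) == data_key
```

The sketch is deliberately simplified, but the structural flaw is the same at any scale: one secret, everyone’s data.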