Apple reaffirms there’s no government agency backdoor

Last week, security consultant and former iOS jailbreaker Jonathan Zdziarski made headlines with his talk, “Identifying Back Doors, Attack Points, and Surveillance Mechanisms in iOS Devices”. Here’s a link to a PDF of the slides.

Zdziarski:

Before the journalists blow this way out of proportion, this was a talk I gave to a room full of hackers explaining that while we were sleeping, this is how some features in iOS have evolved over the PAST FEW YEARS, and of course a number of companies have taken advantage of some of the capabilities. I have NOT accused Apple of working with NSA, however I suspect (based on released documents) that some of these services MAY have been used by NSA to collect data on potential targets. I am not suggesting some grand conspiracy; there are, however, some services running in iOS that shouldn’t be there, that were intentionally added by Apple as part of the firmware, and that bypass backup encryption while copying more of your personal data than ever should come off the phone for the average consumer. I think at the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices. At the same time, this is NOT a zero day and NOT some widespread security emergency. My paranoia level is tweaked, but not going crazy. My hope is that Apple will correct the problem. Nothing less, nothing more. I want these services off my phone. They don’t belong there.

Apple responded to Zdziarski’s comments and presentation with this statement, posted on Twitter by the Financial Times’ Tim Bradshaw:

“We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers and Apple for troubleshooting technical issues. A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent.”

“As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.”

Rene Ritchie at iMore created a nice summary of Zdziarski’s concerns:

When you connect your iPhone or iPad to iTunes on Mac or Windows — and choose to trust that computer — a pairing record is created that maintains that trust for future connections. Zdziarski claims that if someone takes physical possession of that computer, they can steal those pairing records, connect to your device, and retrieve your personal information and/or enable remote logging. If they don’t have your computer, Zdziarski claims they can try and generate a pairing record by tricking you into connecting to a compromised accessory, like a dock (juice jacking), and/or by using mobile device management (MDM) tools intended for enterprise to get around safeguards like Apple’s Trusted Device requestor.
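To make the risk concrete: the trust that Zdziarski describes lives in a small pairing-record file on the trusted computer (on OS X, lockdown stores these under /var/db/lockdown, keyed by device UDID). Here’s a minimal Python sketch of what such a file looks like and why copying it matters. The key names follow the format documented by the open-source libimobiledevice project; treat the exact schema as an assumption, not something Apple has published.

```python
import io
import plistlib

# A toy pairing record modeled on what lockdownd stores on a trusted
# computer. Field names (HostID, SystemBUID, certificate/key entries)
# are taken from libimobiledevice's description of the format; the
# values below are obviously placeholders.
toy_record = {
    "HostID": "8F1A2B3C-0000-0000-0000-000000000000",  # hypothetical
    "SystemBUID": "0000000000000000000000000000000000000000",
    "HostCertificate": b"-----BEGIN CERTIFICATE-----...",
    "HostPrivateKey": b"-----BEGIN RSA PRIVATE KEY-----...",
    "DeviceCertificate": b"-----BEGIN CERTIFICATE-----...",
}

# Serialize and re-parse the record, which is all an attacker with
# file access to the trusted computer would need to do.
buf = io.BytesIO()
plistlib.dump(toy_record, buf)
buf.seek(0)
stolen = plistlib.load(buf)

# Possession of the host identity and private key is what would let
# another machine pose as the already-trusted computer, without the
# phone ever showing a new trust prompt.
print(sorted(stolen.keys()))
```

The point of the sketch: the record is just a file, so its security is only as good as the physical and disk security of the computer it sits on.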

From an article we posted last year on juice jacking:

When you plug your smartphone into a USB cable, your device will try to pair with the device on the other end of the cable. If the only thing on the other end of the line is your personally owned USB charger, no worries. But if you plug into a public charging station or a stranger’s USB charger, you are opening yourself up to malware. The device on the other end can pair with your phone and cause all sorts of mischief.

This is all about trusted pairing. Apple is making the point that they’ve bottlenecked trusted pairing so that a user needs to agree to the pairing before data access is allowed.
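That bottleneck is easy to model. Here’s a small Python sketch (the class and method names are mine, not Apple’s) of the behavior Apple’s statement describes: pairing requires an unlocked device plus an explicit user decision, the decision persists for future connections, and nothing is shared with an untrusted host.

```python
class TrustStore:
    """Toy model of per-host trust: pairing succeeds only after an
    explicit user decision on an unlocked device, and that decision
    persists for later connections by the same host."""

    def __init__(self):
        self._trusted_hosts = set()

    def request_pairing(self, host_id, device_unlocked, user_accepts):
        # The trust prompt can only be answered on an unlocked device;
        # a locked phone never grants pairing to a new host.
        if not device_unlocked:
            return False
        if user_accepts:
            self._trusted_hosts.add(host_id)
        return host_id in self._trusted_hosts

    def data_access_allowed(self, host_id):
        # Diagnostic data flows only to hosts the user already trusts.
        return host_id in self._trusted_hosts

    def revoke(self, host_id):
        # The kind of per-host "forget" control critics have asked for.
        self._trusted_hosts.discard(host_id)


store = TrustStore()
# A locked device, or a declined prompt, yields nothing:
assert not store.request_pairing("laptop", device_unlocked=False, user_accepts=True)
# Unlocked device + user consent establishes lasting trust:
assert store.request_pairing("laptop", device_unlocked=True, user_accepts=True)
assert store.data_access_allowed("laptop")
store.revoke("laptop")
assert not store.data_access_allowed("laptop")
```

Zdziarski’s objection isn’t to this gate itself, but to how much data the diagnostic services hand over once a host is on the trusted side of it.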



  • http://tewha.net/ Steven Fisher

    I think Apple’s security measures here are adequate.

However, without a way to delete pairing records on the phone and a manageable “never allow” list, they’re only adequate.

  • matthewmaurice

    When I heard this story on Ken Ray’s podcast I couldn’t help but think, “does this guy know what the words he’s using mean?” He says he doesn’t think there’s a grand conspiracy, but he DOES think the NSA used extra services that Apple didn’t NEED to put in but DID anyway? Hello, that’s the definition of conspiracy.

    Of course my first warning should have been this jewel of a sentence; “…this was a talk I gave to a room full of hackers explaining that while we were sleeping, this is how some features in iOS have evolved over the PAST FEW YEARS…” Wait, so hackers were sleeping, for years?

    http://youtu.be/G2y8Sx4B2Sk

  • SockRolid

The greater danger to us boring civilians isn’t the NSA snooping on us. It’s jailbreaker “hackers.” For every white hat hacker like Jonathan Zdziarski, I’m sure there are 10 black hat hackers who are actively trying to exploit any possible security loophole in jailbroken iOS devices. Because there’s no curation on jailbroken devices.

  • http://ferebee.net Chris Ferebee

    Apple can and should do more. Forensically relevant processes that serve no useful purpose for the end user should be optional. Enterprise IT could always enable them through a deployment profile if required.

    And if I take my iPhone to the Genius Bar, and they need diagnostic information, and I find I can’t enable forensics through the Settings app because something is stuck: time for Wipe and Restore.

  • JohnDoey

    There are iPhone/iPad charging cables that don’t have data lines in them, so that you can charge your device in a public charger without making a data connection.

    • http://www.laugh-eat.com/ kyron

      can the average consumer readily discern this?