A lot of people don’t understand the concept of cryptographic backdoors. This isn’t surprising because cryptography and security are very complex fields of study. But it does lead to a great deal of misunderstanding, especially amongst those who tend to trust what government agents say.
I’ve been asked by quite a few people why Apple doesn’t comply with the demands of the Federal Bureau of Investigation (FBI). They’ve fallen for the FBI’s claim that the compromised firmware would only be used on that single iPhone and that Apple would be allowed to maintain total control over the firmware at all times. However, as Jonathan Zdziarski explained, the burden of forensic methodology would require the firmware to change hands several times:
Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify.
[…]
If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.
If Apple creates what the FBI is demanding, the firmware would almost certainly end up in the hands of NIST, the defense attorney, and another third party hired by the defense attorney to verify the firmware. As Benjamin Franklin said, “Three can keep a secret, if two of them are dead.” With the firmware changing hands so many times, it will almost certainly end up leaked to the public.
After I point this out, a common follow-up question is, “So what? How much damage could this firmware cause?” To illustrate the damage, I will use an example from the physical world.
The Transportation Security Administration (TSA) worked with several lock manufacturers to create TSA-recognized locks. These are special locks that TSA agents can bypass using master keys. To many people this doesn’t sound bad. After all, the TSA tightly guards these master keys, right? Although I’m not familiar with the TSA’s internal policies for managing its master keys, I do know the key patterns were leaked to the Internet and 3D-printable models were created shortly thereafter. And those models produce keys that work.
The keys were leaked, likely unintentionally, by a TSA agent posting a photograph of them online. With that single leak every TSA-recognized lock was rendered entirely useless. Now anybody can obtain the keys to open any TSA-recognized lock.
It only takes one person leaking a master key, intentionally or unintentionally, to render every lock that key opens entirely useless. A compromised version of iOS could leak in many ways. The defense attorney, who may not be well versed in proper security practices, could accidentally transfer the firmware to a third party in an unsecured manner; if that transfer were being monitored, the person monitoring it would then have a copy of the firmware. An employee of NIST could accidentally insert a USB drive containing the firmware into an infected computer and unknowingly hand it to a malicious actor. Somebody working for the defense’s third-party verifier could intentionally leak a copy. There are so many ways the firmware could make its way to the Internet that the question isn’t really if it will leak, but when.
Once the firmware leaks to the Internet, it will be available to anybody. Apple could design the firmware to check the phone’s hardware identifiers so that it only runs on the device the FBI wants unlocked, but it may be possible to spoof those identifiers and make any other iPhone 5C look like that device. It’s also possible that a method of disabling signature verification on a fully updated iPhone 5C will be found. If that happens, a modified version of the compromised firmware with the identifier check stripped out, which would no longer carry a valid signature, could be installed anyway.
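To make that concrete, here is a minimal sketch, in Python rather than the C a real firmware would be written in, of the kind of per-device check described above. Everything in it is hypothetical: the identifier value, the function names, and the check itself are mine, not Apple’s. The point is only that such a check is as trustworthy as the source of the identifier it reads, so anything that can feed it a spoofed value defeats it.

```python
# Hypothetical illustration only -- not Apple's code. The unlock firmware is
# imagined to be "locked" to one phone by comparing a hardware identifier
# (something like the chip's unique ID) against a hardcoded target value.

TARGET_DEVICE_ID = "0x00001A2B3C4D5E6F"  # made-up identifier of the one target phone


def read_device_id() -> str:
    """Return the device's unique hardware identifier.

    Real firmware would query the chip directly. If an attacker controls
    this layer, or can otherwise make the hardware report a chosen value,
    this function simply returns whatever they want it to.
    """
    return "0x00001A2B3C4D5E6F"  # a spoofed device just reports the target's ID


def unlock_features_enabled() -> bool:
    """Enable the weakened passcode limits only on the intended phone."""
    return read_device_id() == TARGET_DEVICE_ID


if __name__ == "__main__":
    if unlock_features_enabled():
        print("Device check passed: passcode retry limits disabled.")
    else:
        print("Device check failed: firmware refuses to run.")
```

Signature verification is the separate safeguard that keeps someone from simply deleting a comparison like this and reflashing the result, which is why a bypass of signature checking on the 5C would make the leaked firmware dangerous to every 5C, not just the FBI’s target device.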
The bottom line is that the mere existence of such compromised firmware, a master key if you will, puts every iPhone 5C at risk just as the existence of TSA master keys puts everything secured with a TSA-recognized lock at risk.