In the ongoing security arms race, researchers from Johns Hopkins discovered a vulnerability in Apple’s iMessage:
Green suspected there might be a flaw in iMessage last year after he read an Apple security guide describing the encryption process and it struck him as weak. He said he alerted the firm’s engineers to his concern. When a few months passed and the flaw remained, he and his graduate students decided to mount an attack to show that they could pierce the encryption on photos or videos sent through iMessage.
It took a few months, but they succeeded, targeting phones that were not using the latest operating system on iMessage, which launched in 2011.
To intercept a file, the researchers wrote software to mimic an Apple server. The encrypted transmission they targeted contained a link to the photo stored in Apple’s iCloud server as well as a 64-digit key to decrypt the photo.
Although the students could not see the key’s digits, they guessed at them by a repetitive process of changing a digit or a letter in the key and sending it back to the target phone. Each time they guessed a digit correctly, the phone accepted it. They probed the phone in this way thousands of times.
“And we kept doing that,” Green said, “until we had the key.”
A modified version of the attack would also work on later operating systems, Green said, adding that it would likely have taken the hacking skills of a nation-state.
With the key, the team was able to retrieve the photo from Apple’s server. If it had been a true attack, the user would not have known.
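To make the guessing process described above concrete, here is a minimal sketch in Python. This is not the researchers’ actual code; the alphabet, the key length, and the simulated phone that reveals whether a guessed character is accepted are all assumptions for illustration. The point is that any oracle confirming individual characters collapses the search from 36^64 guesses for the whole key to at most 36 × 64 probes.

```python
import secrets
import string

ALPHABET = string.digits + string.ascii_lowercase  # assumed key alphabet
KEY_LENGTH = 64                                     # "64-digit key" per the article


class SimulatedPhone:
    """Stand-in for the target device. In the real attack the researchers spoofed
    an Apple server and the iPhone's responses leaked whether a guessed character
    was correct; here that feedback is modeled as a simple accept/reject oracle."""

    def __init__(self):
        self._key = "".join(secrets.choice(ALPHABET) for _ in range(KEY_LENGTH))

    def accepts(self, position: int, guess: str) -> bool:
        return self._key[position] == guess

    def real_key(self) -> str:
        # Only used to verify the demo recovered the right key.
        return self._key


def recover_key(phone: SimulatedPhone) -> str:
    """Brute-force the key one position at a time, ~36 tries per character."""
    recovered = []
    for position in range(KEY_LENGTH):
        for candidate in ALPHABET:
            if phone.accepts(position, candidate):
                recovered.append(candidate)
                break
    return "".join(recovered)


if __name__ == "__main__":
    phone = SimulatedPhone()
    key = recover_key(phone)
    assert key == phone.real_key()
    print(f"Recovered key after at most {KEY_LENGTH * len(ALPHABET)} probes: {key}")
```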
There are several things to note about this vulnerability. First, Apple did respond quickly, including a fix for it in iOS 9.3. Second, security is very difficult to get right, so it often turns into an arms race. Third, designing secure software is hard, even if you’re a large company with a lot of talented employees.
Christopher Soghoian also made a good point in the article:
Christopher Soghoian, principal technologist at the American Civil Liberties Union, said that Green’s attack highlights the danger of companies building their own encryption without independent review. “The cryptographic history books are filled with examples of crypto-algorithms designed behind closed doors that failed spectacularly,” he said.
The better approach, he said, is open design. He pointed to encryption protocols created by researchers at Open Whisper Systems, who developed Signal, an instant message platform. They publish their code and their designs, but the keys, which are generated by the sender and user, remain secret.
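To illustrate the open-design approach Soghoian describes, here is a minimal sketch, not Signal’s actual protocol, using Python’s third-party cryptography package: the algorithm (an X25519 Diffie-Hellman exchange here) and the code are public, but each party’s private key is generated locally and never transmitted.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)

# Each party generates a private key locally, on their own device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Only the public halves are ever shared over the network.
alice_public = alice_private.public_key().public_bytes(
    encoding=serialization.Encoding.Raw, format=serialization.PublicFormat.Raw
)
bob_public = bob_private.public_key().public_bytes(
    encoding=serialization.Encoding.Raw, format=serialization.PublicFormat.Raw
)

# Both sides derive the same shared secret from their own private key and the
# other party's public key; the private keys never leave the devices.
alice_shared = alice_private.exchange(X25519PublicKey.from_public_bytes(bob_public))
bob_shared = bob_private.exchange(X25519PublicKey.from_public_bytes(alice_public))
assert alice_shared == bob_shared
```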
Open source isn’t a magic bullet, but it does allow independent third-party verification of your code. This advantage often goes unrealized; even very popular open source projects like OpenSSL have contained notable security vulnerabilities for years without anybody being the wiser. But it’s unlikely that something like iMessage would have been ignored so thoroughly.
The project would likely have attracted a lot of developers interested in writing iMessage clients for Android, Windows, and Linux. And since iOS, and by extension iMessage, is so prominent in the public eye, it’s likely that many security researchers would have combed through the iMessage code hoping to be the first to find a vulnerability and enjoy the publicity such a discovery would almost certainly bring. So open-sourcing iMessage would likely have gained Apple a lot of third-party verification.
In fact, this is why I recommend applications like Signal over iMessage. Not only is Signal compatible with both Android and iOS, but it’s also open source, so it’s available for third-party verification.