When Idiots Write About Computer Security

The people trying to justify the Federal Bureau of Investigation’s (FBI) demands are possibly the most amusing part of the agency’s recent battle with Apple. Siding with the FBI requires either being completely ignorant of security or being so worshipful of the State that you believe any compromise made in the name of empowering it is justified.

A friend of mine posted an article that tries to justify the FBI’s demands by claiming Apple is spreading fear, uncertainty, and doubt (FUD). Ironically, the article itself is FUD. In fact it’s quite clear that the author has little to no understanding of security:

In its campaign, Apple is mustering all the fear, uncertainty and doubt it can. In an open letter to its customers, it states that “the government would have us write an entirely new operating system for their use. They are asking Apple to remove security features and add a new ability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. … It would be wrong to intentionally weaken our products with a government-ordered backdoor.” The FUD factor in that statement is “weaken our products.” It is grossly misleading, the plural suggesting that the FBI wants Apple to make this back door a standard part of iPhones. That’s flat-out false. What the government has asked is that Apple modify software to remove a feature that was not present in earlier versions of the software, and then install that new software on the single phone used by the terrorist. Apple can then destroy the software.

Apple’s statement is entirely accurate. The FBI is demanding a signed version of iOS that removes security features and includes a mechanism to brute force the passcode used to encrypt the contents of the device. Because the firmware would be signed it could be loaded onto other iPhones. We also know the FBI has about a dozen more phones it wants Apple to unlock, so this case isn’t about a single phone. This case is about setting a precedent that will make it easier for the State to coerce companies into bypassing the security features of their own products.
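
To put numbers on what that brute force mechanism buys: Apple’s iOS security documentation has said the passcode key derivation is deliberately tuned to take roughly 80 milliseconds per attempt on the device itself. A back-of-the-envelope sketch, assuming that figure, shows why a four-digit passcode only survives because of the escalating delays and ten-try wipe the FBI wants removed:

```python
# Back-of-the-envelope brute force math, assuming Apple's published
# figure of roughly 80 ms per passcode attempt (the key derivation is
# deliberately slow and runs on the device itself). With the escalating
# delays and ten-try wipe removed, keyspace is the only defense left.

PER_GUESS_SECONDS = 0.08  # assumed per-attempt key derivation time

def worst_case(keyspace: int) -> str:
    """Time to exhaust the whole keyspace at one guess per 80 ms."""
    seconds = keyspace * PER_GUESS_SECONDS
    if seconds < 3600:
        return f"{seconds / 60:,.1f} minutes"
    return f"{seconds / 3600:,.1f} hours"

for name, keyspace in [
    ("4-digit passcode", 10**4),
    ("6-digit passcode", 10**6),
    ("8-character lowercase password", 26**8),
]:
    print(f"{name}: {worst_case(keyspace)} worst case")
# 4-digit passcode: 13.3 minutes worst case
# 6-digit passcode: 22.2 hours worst case
# 8-character lowercase password: 4,640,601.4 hours worst case
```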

The claim that Apple can destroy the software is also naive. In order to unlock the device the software must be loaded onto the phone. Since the phone is evidence it must be returned to the FBI. That means the FBI will have a signed copy of the custom firmware sitting on an unlocked phone, so it would be feasible for the FBI to extract the firmware. Furthermore, the process of writing software for a court case will likely give several third parties access to the firmware:

Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify.

[…]

During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community, the tool and any evidence gathered by it could be rejected.

[…]

If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.

It will likely be impossible for Apple to maintain exclusive control over the firmware.

Once the genie is out of the bottle it can’t be put back in. This is especially true of software since it can be reproduced almost infinitely at a cost so small it’s practically free. If Apple produces this firmware it will never be able to unmake it. Let’s continue with the article in question:

More contradictory to Apple’s claims is that the FBI has specifically stated that it does not intend to cause a weakening of the consumer product, so this case cannot be used as a precedent. Should the government at any time attempt to do that so that back doors can be embedded in products, its own words would be the most compelling argument to counter that.

The FBI claims a lot of things. That doesn’t make those claims true. By merely existing this firmware would make consumer products less secure. Currently the iPhone’s security is quite strong, as evidenced by the fact that the FBI has been unable to break into the roughly dozen phones in its possession. If Apple releases a firmware that can bypass security features on iPhones it necessarily means the overall security of iPhones, which are consumer products, is weakened. There is no way to logically argue otherwise. When something that couldn’t be broken into can be broken into, it is less secure than it was. The fact that I felt the need to write the previous sentence causes me great pain because it speaks so ill of the author’s education.

The FUD continues, with Apple saying, “Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.” That might very well be the case. But it has zero relevance. Each of those cases could be resolved only with a court order of its own, regardless of what happens with the San Bernardino iPhone. Even if this case were not in front of the court at the moment, any state, local or federal law enforcement agency could bring a similar case forward.

Actually, it’s entirely relevant. The FBI wants the court precedent so prosecutors in other cases can compel companies to bypass security features on their products. Apple isn’t simply fighting the creation of a purposely broken firmware, it’s fighting a precedent that would allow other courts to coerce companies into performing labor against their will. Obviously the author’s understanding of the legal system, specifically how precedent works, is as lacking as his understanding of security.

Gaining access to locked data is a legitimate law enforcement issue, and whatever your personal beliefs, all law enforcement officers have a responsibility to attempt to collect all information that is legally possible to collect.

While law enforcers may have a responsibility to attempt to collect all information within their power to collect, that doesn’t mean they should be able to compel others to assist them at the point of a gun.

In other forums, Apple has been claiming that if the U.S. requires Apple to cooperate in providing access to the phone, all other governments around the world will then expect the same sort of cooperation. It is a bogus claim — more FUD. Do Apple’s lawyers really not know that the law of one country does not apply to another? Apple’s winning its case in the U.S. would do nothing to stop another country from initiating a similar action. Its losing its case should have no influence on whether other countries decide to pursue such matters.

I see the author doesn’t pay attention to world events. Oftentimes when a government sees another government get away with something nasty it decides it can get away with it too. Take Blackberry, for example. India demanded backdoor access from Blackberry and Blackberry complied. Seeing India get what it wanted, the government of Pakistan demanded the same. Monkey see, monkey do. It should be noted that Blackberry ultimately chose to leave Pakistan rather than comply with those demands.

Apple knows that if it rolls over it will encourage other governments to demand the same as the FBI. If, however, it digs in its heels, it will discourage other governments from making the same demands. This is the same principle as not negotiating with terrorists: if you give in once, you encourage others to pull the same shit against you.

But of all of Apple’s arguments, the one that is most ludicrous, or perhaps the most damning of its much-touted security prowess, is revealed in this response to the government’s request for a key that could unlock one phone:

“Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”

First, Apple is already relentlessly attacked by hackers and criminals. I would like to hope that Apple has better security practices than the IRS. But when you unpack this statement, you are left with the impression that we should not trust any of Apple’s software or products. You have to assume that, should Apple write the software that the FBI wants, it would be among the most protected software in the company. If Apple is concerned about this software being compromised, what does that say about all of its other software?

This is another claim that can only be made by somebody who doesn’t understand security. This firmware wouldn’t remain entirely in Apple’s hands. As noted above, the FBI would possess a phone with the firmware installed on it. And anybody who has paid attention to the various congressional hearings on the numerous federal network breaches knows the federal government’s networks are incapable of protecting anything of value.

This firmware isn’t like a private key, which can serve its purpose even if you keep it within your exclusive control. It’s a piece of software that must be loaded onto a device that is evidence in a crime, which necessarily means it must leave your exclusive control. So Apple’s security isn’t the only cause for concern here.

Even assuming that a bad guy gets hold of just the software that law enforcement wants created, it would have to be signed by Apple’s security certificate to load on any phone.

Which the copy on the phone and any copies sent out for independent testing would be.

If the criminal gets a copy of the software and it has already been signed with the certificate, Apple could revoke the certificate.

If the author had read the Electronic Frontier Foundation’s (EFF) excellent technical overview of this case he would know that the public key is built into the hardware of the iPhone. This is actually a smart security practice because it prevents malware from replacing the public key; if the public key could be replaced, malware could load its own code. The downside is that Apple can’t revoke the public key to prevent software signed with the corresponding private key from loading.

But if a bad guy gets hold of Apple’s digital certificate, then the whole Apple software base is at risk, and this feature that the FBI wants bypassed is irrelevant. After all, Apple has stated that it is not immune from attack, and it has implied it is a reasonable concern that its most protected software can be compromised.

I’m going to take this opportunity to explain a specific feature of public key cryptography that is relevant here. Public key cryptography relies on two keys: a private key and a public key. The private key, as the name implies, is kept private. Anything signed with the private key can be verified with the public key. Because of this you only ever need to hand out the public key.

I have a Pretty Good Privacy (PGP) key that I use to encrypt and sign e-mails. Anybody with my public key can validate my signature but they cannot sign an e-mail as me. If, however, they had my private key they could sign e-mails as me. Because of this I keep my private key very secure. Apple likely keeps its software signing key in a vault on storage media that is only ever connected to a secure computer that has no network connectivity. Under such circumstances an attacker with access to Apple’s network would still be unable to access the company’s software signing key. For reasons I stated earlier, that’s not a model Apple can follow with the firmware the FBI is demanding. Apple’s security concerns in this case are entirely unrelated to the security practices of its private key.
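
For the curious, here is a minimal sketch of that asymmetry using Ed25519 via the Python cryptography library (an illustration only, not Apple’s actual signing scheme). Anybody holding the public key can check a signature, but only the holder of the private key can produce one:

```python
# Sign/verify asymmetry: the public key can check signatures but never
# produce them. An illustration, not Apple's actual scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept offline, under guard
public_key = private_key.public_key()       # safe to embed in every device

firmware = b"example firmware image"
signature = private_key.sign(firmware)

# Anyone with the public key can verify a genuine image...
public_key.verify(signature, firmware)  # raises nothing: signature is valid

# ...but a tampered image fails, and nothing short of the private key
# can make it pass.
try:
    public_key.verify(signature, firmware + b" + backdoor")
except InvalidSignature:
    print("tampered firmware rejected")
```

The flip side is the hardware problem noted above: when the verifying public key is burned into the device, it can never be rotated, so any firmware ever signed with the matching private key will verify on those devices forever.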

In addition to his technical incompetence, the author decided to display his argumentative incompetence by closing his article with a pretty pathetic ad hominem:

But Apple, seeming to take a page from Donald Trump’s presidential campaign, is using the situation to promote its brand with free advertising.

If all else fails in your argument just compare your opponent to Trump.

Apple Gives The Feds Another Middle Finger



A lot of people are claiming Apple’s decision to fight the Federal Bureau of Investigation (FBI) is nothing more than a marketing play. But I swear that I can hear Tim Cook yelling, “Fuck the police!” because his company keeps making announcements that it’s going to make its products more secure:

WASHINGTON — Apple engineers have begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.

[…]

The company first raised the prospect of a security update last week in a phone call with reporters, who asked why the company would allow firmware — the software at the heart of the iPhone — to be modified without requiring a user password.

One senior executive, speaking on the condition of anonymity, replied that it was safe to bet that security would continue to improve. Separately, a person close to the company, who also spoke on the condition of anonymity, confirmed this week that Apple engineers had begun work on a solution even before the San Bernardino attack. A company spokeswoman declined to comment on what she called rumors and speculation.

Independent experts say they have held informal conversations with Apple engineers over the last week about the vulnerability. Exactly how Apple will address the issue is unclear. Security experts who have been studying Apple’s phone security say it is technically possible to fix.

In addition to senior executives talking about upcoming security enhancements, Apple has also added an interesting figure to its payroll:

Frederic Jacobs, for those who don’t know, was one of the developers of the iOS version of Signal, the secure messaging application created by Open Whisper Systems that I highly recommend.

It seems to me that Apple is doing more than marketing here. The company seems dedicated to offering a secure product to its customers. My greatest hope is that this encourages other companies to follow suit.

Backup Locally

There is no cloud, there are only other people’s computers. This is a phrase you should have tattooed on the inside of your eyelids so you can contemplate it every night. It seems like every company is pushing people to store their data in “the cloud.” Apple has iCloud, Google has its Cloud Platform, Microsoft has Azure, and so on. While backing up to “the cloud” is convenient, it also means your data is sitting on somebody else’s computer. In all likelihood that data is also stored in a form the service provider can read.

I have good news though! You don’t have to upload your data to somebody else’s computer! If you use an iPhone it’s actually very easy to make local backups:

If you’re looking for comprehensive privacy, including protection from law enforcement entities, there’s still a loophole here: iCloud. Apple encourages the use of this service on every iPhone, iPad, and iPod Touch that it sells, and when you do use the service, it backs up your device every time you plug it into its power adapter within range of a known Wi-Fi network. iCloud backups are comprehensive in a way that Android backups still aren’t, and if you’ve been following the San Bernardino case closely, you know that Apple’s own legal process guidelines (PDF) say that the company can hand iMessages, SMS/MMS messages, photos, app data, and voicemail over to law enforcement in the form of an iOS device backup (though some reports claim that Apple wants to strengthen the encryption on iCloud backups, removing the company’s ability to hand the data over to law enforcement).

For most users, this will never be a problem, and the convenience of iCloud backups and easy preservation of your data far outweigh any risks. For people who prefer full control over their data, the easiest option is to stop using iCloud and use iTunes instead. This, too, is not news, and in some ways is a regression to the days before iOS 5 when you needed to use a computer to activate, update, and back up your phone at all. But there are multiple benefits to doing local backups, so while the topic is on everyone’s mind we’ll show you how to do it (in case you don’t know) and what you get from it (in case you don’t know everything).

I back up my iPhone locally and you should too. My local backups are encrypted by iTunes and stored on fully encrypted hard drives, a strategy I encourage you to follow as well. Besides enhancing privacy by keeping my data out of reach of Apple and any court orders it receives, this setup also prevents my data from being obtained if Apple’s iCloud servers are breached (which has happened).

iPhones aren’t the only devices that can be backed up locally. Most modern operating systems have built-in backup tools that clone data to external hard drives. In my opinion these are far superior to “cloud” backup services. If you back up to fully encrypted hard drives you ensure that your data isn’t easily accessible to unauthorized parties. And you can store some of your encrypted backup drives offsite, say at your parents’ house or your workplace, to ensure everything isn’t lost if your house burns to the ground.
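
If your operating system’s backup tool doesn’t encrypt its output, you can add that layer yourself. Here is a minimal sketch in Python (the paths and file names are placeholders, and Fernet is chosen for brevity rather than as a recommendation): archive a directory and encrypt it before it ever touches the backup drive.

```python
# Minimal encrypted local backup sketch: archive a directory, then
# encrypt it so the backup drive never holds plaintext. All paths are
# placeholders, and reading the whole archive into memory only suits
# modest backups.
import shutil
from pathlib import Path

from cryptography.fernet import Fernet

KEY_FILE = Path("backup.key")        # keep this off the backup drive
SOURCE = Path.home() / "Documents"   # what to back up
DEST = Path("/mnt/backup")           # the external drive

# Generate the key once, reuse it for later backups.
if KEY_FILE.exists():
    key = KEY_FILE.read_bytes()
else:
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)

archive = shutil.make_archive("documents-backup", "gztar", root_dir=SOURCE)
ciphertext = Fernet(key).encrypt(Path(archive).read_bytes())
(DEST / "documents-backup.tar.gz.enc").write_bytes(ciphertext)
Path(archive).unlink()  # don't leave a plaintext copy behind
```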

Don’t rely on other people’s computers.

It’s Not Just One iPhone The FBI Wants Unlocked

There are people siding with the Federal Bureau of Investigation (FBI) in its current court battle with Apple. These misguided souls are claiming, amongst other nonsensical things, that the FBI only wants a single iPhone unlocked. They believe that it’s somehow OK for Apple to open Pandora’s box by releasing a signed firmware with a backdoor in it so long as it’s only used to unlock a single iPhone. Unfortunately, as those of us siding with Apple have been pointing out, this case isn’t about a single iPhone. The FBI wants a court precedent so it can coerce Apple into unlocking other iPhones:

In addition to the iPhone used by one of the San Bernardino shooters, the US government is pursuing court orders to force Apple to help bypass the security passcodes of “about a dozen” other iPhones, the Wall Street Journal reports. The other cases don’t involve terror charges, the Journal’s sources say, but prosecutors involved have also sought to use the same 220-year-old law — the All Writs Act of 1789 — to access the phones in question.

By setting a precedent in the San Bernardino case the FBI would have grounds to coerce Apple, and other device manufacturers, into unlocking other devices. We know the FBI already has a dozen or so phones in the pipeline and it will certainly have more in the coming years.

Besides the precedent there is also the problem of the firmware itself. If Apple creates a signed firmware that disables iOS security features and automates brute forcing the passcode, it could be installed on other iPhones (at least other iPhone 5Cs, but possibly other iPhone models as well). With this firmware in hand the FBI wouldn’t even need to coerce Apple into helping each time; the agency could simply install the firmware on any compatible device itself. This is why Apple believes creating such a firmware is too dangerous.

You can never believe the government when it claims to be taking an exceptional measure just once. Those exceptional measures always become standard practice.

Legalizing Slavery

The United States has a long history of slavery. From the very beginning of this country through the end of the Civil War, black individuals could be owned as slaves in many states. After that the rules were changed. Private ownership of slaves was deemed illegal (a very good thing) but the State gave itself permission to enslave anybody it arbitrarily labeled a criminal (a very bad thing). Eventually the process was streamlined and Federal Prison Industries (UNICOR) was created to manage the federally owned slaves. Individual states used this precedent to establish their own government-owned corporations to manage their slaves.

Now a senator is looking to change the rules yet again by expanding the State’s ability to own slaves. If passed, his bill will allow the State to enslave anybody by issuing a simple court order:

Sen. Richard Burr (R-North Carolina), the chairman of the Senate Intelligence Committee, reportedly will introduce legislation soon to criminalize a company’s refusal to aid decryption efforts as part of a governmental investigation. The news was first reported Thursday afternoon by the Wall Street Journal.

Aiding decryption efforts requires labor. In the San Bernardino case the Federal Bureau of Investigation (FBI) is ordering Apple to create a custom version of iOS that removes several key security features. Apple has refused, and it has every right to do so because nobody should be compelled to perform labor against their will. If the FBI wants the phone unlocked so badly it can either put in the effort itself or hire somebody willing to try.

We’re living in interesting times. The State is seeing less and less reason to conceal its intentions.

The Party Of Fascism

I believe that getting into bed with social conservatives was one of the worst things to happen to libertarianism. Now that election season is upon us I’m reminded of this every day. Self-proclaimed libertarians are openly declaring their support for Republican frontrunners who continue to remind us that their interests align with fascism, not libertarianism.

The recent kerfuffle between Apple and the Federal Bureau of Investigation (FBI) is yet another demonstration of this. Using the All Writs Act, a federal court is trying to make literal slaves out of Apple’s iOS developers. Anybody who subscribes to even the most basic libertarian principles would oppose this order. But a fascist, whose loyalty is to the State above all else, would support it. So where does Donald Trump stand?

GOP presidential front-runner Donald Trump is insisting that Apple unlock the iPhone of one of the shooters in the San Bernardino, Calif., terrorist attack.

[…]

Trump disagreed stridently on Wednesday, calling it a matter of “common sense.”

“I agree 100 percent with the courts,” the business mogul said. “In that case, we should open it up. I think security over all — we have to open it up, and we have to use our heads. We have to use common sense.”

Donald believes Apple’s software developers are property of the State and should be compelled to write software. Let’s look at the current favorite amongst so-called libertarians, Ted Cruz (and we’ll throw in his buddy Carson as an added bonus):

Cruz said, “Apple has a serious argument” in protecting users’ privacy but said resisting the FBI’s request for help amounted to defying a search warrant. Carson said that Apple should find a way to get over mistrust of the government, but then added that might have to wait until President Obama leaves office, allowing for a delay that the FBI would probably oppose.

As if defying a terrible court order were a bad thing. My “libertarian” friends who support Cruz keep telling me he’s for small government and individual liberty, but I can’t fathom how a man who thinks a court has the right to enslave software developers is for small government. Carson also demonstrates his love of government by criticizing Apple for being mistrustful of it.

Finally, just for fun, I’m going to throw in Tom “I Hate Due Process” Cotton for giggles:

“As a society, we don’t allow phone companies to design their systems to avoid lawful, court-ordered searches,” Cotton said in the statement. “If we apply a different legal standard to companies like Apple, Google, and Facebook, we can expect them to become the preferred messaging services of child pornographers, drug traffickers, and terrorists alike — which neither these companies nor law enforcement want.”

Whereas the other Republicans at least tried to sound kind of reasonable, Cotton went straight for the “messaging service of child pornographers, drug traffickers, and terrorists” line.

The Republican Party really is the party of fascism (as opposed to its close rival, the Democratic Party, which prefers its socialism be international). Not only are the policies put forth by Republican lawmakers generally fascist in nature, but its members can’t help themselves when an opportunity to go on television and publicly declare their fascist policies presents itself. How this is supposed to be the party libertarians can prevail with is beyond me.

Apple Tells The Feds To Pound Sand

The technology industry has a long history of being run by antiauthoritarians who bark a lot but roll over as soon as Uncle Sam commands it. This has led to a great deal of disappointment for me. Fortunately, after the Edward Snowden leaks, some technology companies have started developing a bit of a spine.

Yesterday a robed one in a courtroom commanded Apple to produce a custom firmware that would allow the Federal Bureau of Investigation (FBI) to more easily brute force the passcode on a suspect’s iPhone:

On Tuesday, a federal judge in Riverside, California, ordered Apple to help the government unlock and decrypt the iPhone 5C used by Syed Rizwan Farook, who shot up an office party in a terrorist attack in nearby San Bernardino in December 2015.

Specifically, United States Magistrate Judge Sheri Pym mandated that Apple provide the FBI a custom firmware file, known as an IPSW file, that would likely enable investigators to brute force the passcode lockout currently on the phone, which is running iOS 9.

By issuing this order Judge Pym openly stated that she believes Apple is a slave to the federal government and can therefore be forced to perform labor against its will. This is the point where a lot of technology companies would simply roll over and accept their place. Apple has decided it doesn’t want to play ball:

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

[…]

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

It will be interesting to see how far Apple can go in resisting this order, but even if it ends up folding under the threat of government guns I want to give the company a hell of a lot of credit for this.

As Apple’s letter notes, this ruling has consequences far greater than this case alone. First, it would set a precedent that everybody is little more than a slave to the robed overlords of the courtrooms. Second, it would introduce an officially signed firmware that is purposely weakened to allow law enforcers to bypass built-in security mechanisms.

The first consequence isn’t anything new since the State has always viewed the people as slaves. But the second consequence is severe. I’m sure the FBI has pinky sworn that it will never use this firmware again, but anybody familiar with the agency’s history knows such a promise will be broken. And the state of the federal government’s network security means this custom firmware will almost certainly end up online at some point. Then it will be available to nongovernmental terrorists, domestic abusers, and other violent individuals with a vested interest in snooping on their targets.

Whether you like Apple or not, I believe the company deserves a lot of credit for this. I hope it inspires other companies to follow suit.

The iPhone 5S Fingerprint Reader

Yesterday Apple announced its new iPhones. The iPhone 5c, in my opinion, wasn’t at all newsworthy. Apple’s new flagship, the iPhone 5s, wouldn’t be newsworthy either except for its fingerprint reader:

Apple’s brand-new iPhone 5s isn’t dramatically different from last year’s model, but it has at least one major addition: a “Touch ID” sensor. Us human beings are calling it a fingerprint sensor, and it’s built into the phone’s main Home button below the screen. Apple’s Phil Schiller says, “It reads your fingerprint at an entirely new level” — it’s 170 microns in thickness with 500 ppi resolution. According to Cupertino, it “scans sub-epidermal skin layers,” and can read 360 degrees. As expected, the sensor is actually part of the Home button, making it less of a button and more of a…well, sensor. Using Touch ID, users can authorize purchases in iTunes, the App Store, or in iBooks by simply using their thumbprint (starting in iOS 7, of course). Pretty neat / scary!

Honestly, I have mixed feelings about this. It’s certainly a neat piece of technology and I don’t want to decry Apple for trying something new in the smartphone field. Today you can lock your phone with a four-digit passcode or a full password. If I were betting money I would bet that a majority of users use neither option, and that of the people who do put a passcode on their phone a vast majority opt for the four-digit variety. Phones are devices that are accessed frequently, and having to enter a long password every time you want to check your Twitter feed gets annoying quickly, so few people are willing to use a complex password to secure their phones. That leaves most people enabling no security at all, while those who do enable it most likely opt for a relatively insecure four-digit passcode.

Apple has been fairly good about including security features that are relatively easy to use, and this fingerprint reader looks to be another one. Time will tell whether the sensor is easily fooled by other fingerprints, but if it convinces more people to put some kind of security on their phone I’m happy. If the technology is properly implemented it could easily be more secure than the four-digit passcode (admittedly not a high bar to clear).
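
Some rough arithmetic supports that. Apple has quoted a false match rate of about 1 in 50,000 for Touch ID. Taking that figure at face value (it’s Apple’s number, not an independently verified one) and assuming the sensor falls back to the passcode after five failed attempts, a random attacker has worse odds against the sensor than against a four-digit passcode:

```python
# Odds of an attacker getting in within the allowed attempts: Apple's
# quoted ~1/50,000 Touch ID false match rate (their figure, assumed
# accurate here) versus random guesses at a four-digit passcode.

FAR = 1 / 50_000      # assumed false match rate per fingerprint attempt
TOUCH_ID_TRIES = 5    # assumed fallback to the passcode after 5 failures
PASSCODE_TRIES = 10   # guesses before the optional wipe kicks in

touch_id_odds = 1 - (1 - FAR) ** TOUCH_ID_TRIES
passcode_odds = PASSCODE_TRIES / 10_000  # 10,000 possible 4-digit codes

print(f"Touch ID:         ~1 in {1 / touch_id_odds:,.0f}")
print(f"4-digit passcode:  1 in {1 / passcode_odds:,.0f}")
# Touch ID:         ~1 in 10,000
# 4-digit passcode:  1 in 1,000
```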

Then there’s the other side of the coin. My first thought after seeing the announcement of a fingerprint reader was that the police are going to love it. As it currently stands, a police officer wanting immediate access to your phone must obtain a search warrant and gain your cooperation, have a mechanism for exploiting a security hole in the phone on site, or bring force into things either as a threat or as physical harm. With the inclusion of a fingerprint reader a police officer need only force your finger onto the sensor to unlock the phone. That seems to be far less hassle than the other three options.

In light of Edward Snowden’s leaks there is also the concern that your fingerprint will be sent off to the National Security Agency (NSA). While Apple promised that your fingerprint data will only be stored locally, there is no way to verify that claim. Furthermore, if Apple were compelled by a national security letter to include a mechanism allowing the NSA to obtain fingerprint data, the company wouldn’t be legally allowed to tell us. That thought should scare everybody.

Finally, on a more practical note, biometrics have a fatal flaw: the technology is based on sensor data obtained from your body at a single point in time. What happens if you cut your finger? Will the sensor read your altered fingerprint as somebody else’s? What happens if your finger is cut off? Our bodies change over time and those changes are often difficult, if not impossible, for biometric technology to accommodate.
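
The root of the problem is that biometric matching is never exact: the sensor compares a fresh reading against an enrolled template and accepts anything “close enough.” This toy model (the feature vectors and threshold are invented purely for illustration) shows how an injury can push a reading past the acceptance threshold:

```python
# Toy model of biometric matching: a reading matches if it is close
# enough to the enrolled template. The feature vectors and threshold
# here are invented for illustration; real sensors use far richer data.

def distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

THRESHOLD = 1.0  # loosen it and strangers match; tighten it and you don't

enrolled = [0.2, 0.9, 0.4, 0.7]          # template captured at enrollment
normal_scan = [0.25, 0.88, 0.41, 0.69]   # same finger on another day
cut_finger = [0.25, 0.88, 1.60, 0.69]    # one region altered by a cut

for name, scan in [("normal scan", normal_scan), ("cut finger", cut_finger)]:
    verdict = "accepted" if distance(enrolled, scan) < THRESHOLD else "rejected"
    print(f"{name}: {verdict}")
# normal scan: accepted
# cut finger: rejected
```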

As with most security technology there are ups and downs to this fingerprint reader. If it convinces more people to enable security on their phones then I will be content. However, one must realize that there are real downsides to using your fingerprint as a security token.