When The State Isn’t Wrecking The Technology Industry It’s Begging It For Help

Do you know what’s especially funny about the fight between Apple and the Federal Bureau of Investigation (FBI)? While one part of the State is trying to destroy computer security, another part is begging for help:

Carter will visit a Pentagon outpost in the heart of Silicon Valley, speak at a cybersecurity conference in San Francisco and go to Microsoft and Amazon headquarters in Seattle to highlight the risks of cyberattacks and the need for greater digital cooperation with the Pentagon.

His visit to the West Coast — his third in less than a year, more than he’s made to Kabul or Baghdad — marks the latest effort by the Obama administration to recruit telecommunications, social media and other technology companies as partners in national security operations despite deep suspicion in Silicon Valley about government surveillance.

Statism in a nutshell. When computer security stands in the way of the State’s power it attempts to crush it mercilessly. But when it needs computer security to solidify and maintain its power it comes crawling back to the very people it tried to execute only a short while ago.

In the end the State wants the best of both worlds. It wants a world where its networks and devices are secure but nobody else’s are. Why should security professionals provide the State any assistance when it constantly tries to bite their hands?

When Idiots Write About Computer Security

People trying to justify the Federal Bureau of Investigation’s (FBI) demands are possibly the most amusing thing about the agency’s recent battle with Apple. Siding with the FBI requires either being completely ignorant of security or being so worshipful of the State that you believe any compromise made in the name of empowering it is justified.

A friend of mine posted an article that tries to justify the FBI’s demands by claiming Apple is spreading fear, uncertainty, and doubt (FUD). Ironically, the article itself is FUD. In fact it’s quite clear that the author has little to no understanding of security:

In its campaign, Apple is mustering all the fear, uncertainty and doubt it can. In an open letter to its customers, it states that “the government would have us write an entirely new operating system for their use. They are asking Apple to remove security features and add a new ability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. … It would be wrong to intentionally weaken our products with a government-ordered backdoor.” The FUD factor in that statement is “weaken our products.” It is grossly misleading, the plural suggesting that the FBI wants Apple to make this back door a standard part of iPhones. That’s flat-out false. What the government has asked is that Apple modify software to remove a feature that was not present in earlier versions of the software, and then install that new software on the single phone used by the terrorist. Apple can then destroy the software.

Apple’s statement is entirely accurate. The FBI is demanding a signed version of iOS that removes security features and includes a mechanism to brute force the passcode used to encrypt the contents of the device. Because the firmware would be signed it could be loaded onto other iPhones. We also know the FBI has about a dozen more phones it wants Apple to unlock, so this case isn’t about a single phone. This case is about setting a precedent that will make it easier for the State to coerce companies into bypassing the security features of their own products.

The claim that Apple can destroy the software is also naive. In order to unlock the device the software must be loaded onto the phone. Since the phone is evidence it must be returned to the FBI. That means the FBI will have a signed copy of the custom firmware sitting on an unlocked phone, so it would be feasible for the FBI to extract the firmware. Furthermore, the process of writing software for a court case will likely involve several third parties receiving access to the firmware:

Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify.

[…]

During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community, the tool and any evidence gathered by it could be rejected.

[…]

If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.

It will likely be impossible for Apple to maintain exclusive control over the firmware.

Once the genie is out of the bottle it can’t be put back in. This is especially true with software, which can be copied almost infinitely at practically zero cost. If Apple produces this firmware it will not be able to make the firmware cease to exist afterward. Let’s continue with the article in question:

More contradictory to Apple’s claims is that the FBI has specifically stated that it does not intend to cause a weakening of the consumer product, so this case cannot be used as a precedent. Should the government at any time attempt to do that so that back doors could be embedded in products, its own words would be the most compelling argument to counter that.

The FBI claims a lot of things. That doesn’t make those claims true. By merely existing this firmware would make consumer products less secure. Currently the iPhone’s security is quite strong, as evidenced by the fact that the FBI has been unable to break into the dozen or so phones in its possession. If Apple releases a firmware that can bypass security features on iPhones it necessarily means the overall security of iPhones, which are consumer products, is weakened. There is no way to logically argue otherwise. When something that couldn’t be broken into can now be broken into, it is less secure than it was. The fact that I felt the need to write the previous sentence causes me great pain because it speaks so ill of the author’s education.

The FUD continues, with Apple saying, “Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.” That might very well be the case. But it has zero relevance. Each of those cases could be resolved only with a court order of its own, regardless of what happens with the San Bernardino iPhone. Even if this case were not in front of the court at the moment, any state, local or federal law enforcement agency could bring a similar case forward.

Actually, it’s entirely relevant. The FBI wants the court precedent so prosecutors in other cases can compel companies to bypass security features on their products. Apple isn’t simply fighting the creation of a purposely broken firmware, it’s fighting a precedent that would allow other courts to coerce companies into performing labor against their will. Obviously the author’s understanding of the legal system, specifically how precedent works, is as lacking as his understanding of security.

Gaining access to locked data is a legitimate law enforcement issue, and whatever your personal beliefs, all law enforcement officers have a responsibility to attempt to collect all information that is legally possible to collect.

While law enforcers may have a responsibility to attempt to collect all information within their power to collect, that doesn’t mean they should be able to compel others to assist them at the point of a gun.

In other forums, Apple has been claiming that if the U.S. requires Apple to cooperate in providing access to the phone, all other governments around the world will then expect the same sort of cooperation. It is a bogus claim — more FUD. Do Apple’s lawyers really not know that the law of one country does not apply to another? Apple’s winning its case in the U.S. would do nothing to stop another country from initiating a similar action. Its losing its case should have no influence on whether other countries decide to pursue such matters.

I see the author doesn’t pay attention to world events. Oftentimes when a government sees another government get away with something nasty it decides it can get away with it too. Take Blackberry, for example. India demanded that Blackberry give it access to a backdoor and Blackberry complied. Seeing India get what it wanted, the government of Pakistan demanded the same. Monkey see, monkey do. It should be noted that Blackberry ultimately chose to exit Pakistan rather than comply with that demand.

Apple knows that if it rolls over it will encourage other governments to demand the same as the FBI. If, however, it digs its heels in it knows that it will discourage other governments from demanding the same. This is the same principle as not negotiating with terrorists. If you give in once it will encourage others to pull the same shit against you.

But of all of Apple’s arguments, the one that is most ludicrous, or perhaps the most damning of its much-touted security prowess, is revealed in this response to the government’s request for a key that could unlock one phone:

“Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”

First, Apple is already relentlessly attacked by hackers and criminals. I would like to hope that Apple has better security practices than the IRS. But when you unpack this statement, you are left with the impression that we should not trust any of Apple’s software or products. You have to assume that, should Apple write the software that the FBI wants, it would be among the most protected software in the company. If Apple is concerned about this software being compromised, what does that say about all of its other software?

This is another claim that can only be made by somebody who doesn’t understand security. This firmware wouldn’t remain entirely in Apple’s hands. As noted above, the FBI would possess a phone with the firmware installed on it. And anybody who has paid attention to the various congressional hearings on the numerous federal network breaches knows the federal government’s networks are incapable of protecting anything of value.

This firmware isn’t like a private key, which can serve its purpose even if you keep it within your exclusive control. It’s a piece of software that must be loaded onto a device that is evidence in a crime, which necessarily means it must leave your exclusive control. So Apple’s security isn’t the only cause for concern here.

Even assuming that a bad guy gets hold of just the software that law enforcement wants created, it would have to be signed by Apple’s security certificate to load on any phone.

Which the copy on the phone and any copies sent out for independent testing would be.

If the criminal gets a copy of the software and it has already been signed with the certificate, Apple could revoke the certificate.

If the author had read the Electronic Frontier Foundation’s (EFF) excellent technical overview of this case he would know that the public key is built into the hardware of the iPhone. This is actually a smart security practice because it prevents malware from replacing the public key; if the public key could be replaced, malware could load its own code. The downside is that Apple can’t revoke the public key to prevent software signed with the corresponding private key from loading.
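To make that last point concrete, here is a minimal sketch, in Python with the cryptography library, of a boot-time check against a key burned into hardware. Ed25519 and every name in it are illustrative assumptions, not Apple’s actual implementation; the point is that because the public key is an immutable constant, anything ever signed by the matching private key verifies forever:

```python
# Toy model of a signature check against a hardware-embedded key.
# All names and the choice of Ed25519 are illustrative stand-ins,
# not Apple's real scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

_vendor_private = ed25519.Ed25519PrivateKey.generate()  # lives in the vendor's vault
BURNED_IN_PUBLIC_KEY = _vendor_private.public_key()     # immutable, baked into "ROM"

def will_boot(firmware: bytes, signature: bytes) -> bool:
    """The device boots anything the burned-in key verifies."""
    try:
        BURNED_IN_PUBLIC_KEY.verify(signature, firmware)
        return True
    except InvalidSignature:
        return False

backdoored = b"iOS minus its security features"
signature = _vendor_private.sign(backdoored)
print(will_boot(backdoored, signature))            # True, and no revocation step exists
print(will_boot(b"tampered firmware", signature))  # False
```

Since the key can never be swapped out, a leaked signed firmware remains loadable for the life of the hardware.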

But if a bad guy gets hold of Apple’s digital certificate, then the whole Apple software base is at risk, and this feature that the FBI wants bypassed is irrelevant. After all, Apple has stated that it is not immune from attack, and it has implied it is a reasonable concern that its most protected software can be compromised.

I’m going to take this opportunity to explain a specific feature of public key cryptography that is relevant here. Public key cryptography relies on two keys: a private key and a public key. The private key, as the name implies, is kept private. Anything signed with the private key can be verified with the public key. Because of this you only need to hand out the public key.

I have a Pretty Good Privacy (PGP) key that I use to encrypt and sign e-mails. Anybody with my public key can validate my signature but they cannot sign an e-mail as me. If, however, they had my private key they could sign e-mails as me. Because of this I keep my private key very secure. Apple likely keeps its software signing key in a vault on storage media that is only ever connected to a secure computer that has no network connectivity. Under such circumstances an attacker with access to Apple’s network would still be unable to access the company’s software signing key. For reasons I stated earlier, that’s not a model Apple can follow with the firmware the FBI is demanding. Apple’s security concerns in this case are entirely unrelated to the security practices of its private key.
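Here’s that asymmetry in runnable form: a minimal sketch using the Python cryptography library, with Ed25519 standing in for PGP’s actual formats. Anybody holding the public key can check a signature, but the signature is bound to the message it signed, so it can’t be transplanted onto a forged e-mail:

```python
# Sign/verify asymmetry: the public key validates, only the private
# key can create. Ed25519 is an illustrative stand-in for PGP.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()  # kept offline and secure
public_key = private_key.public_key()               # handed out freely

email = b"From: Chris\n\nI actually wrote this."
signature = private_key.sign(email)

public_key.verify(signature, email)  # no exception: the signature is genuine

try:
    public_key.verify(signature, b"From: Chris\n\nI never wrote this.")
except InvalidSignature:
    print("forgery detected")  # the old signature doesn't transfer
```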

In addition to his technical incompetence, the author decided to display his argumentative incompetence by closing his article with a pretty pathetic ad hominem:

But Apple, seeming to take a page from Donald Trump’s presidential campaign, is using the situation to promote its brand with free advertising.

If all else fails in your argument just compare your opponent to Trump.

Apple Gives The Feds Another Middle Finger



A lot of people are claiming Apple’s decision to fight the Federal Bureau of Investigation (FBI) is nothing more than a marketing play. But I swear I can hear Tim Cook yelling, “Fuck the police!” because his company keeps announcing that it’s going to make its products more secure:

WASHINGTON — Apple engineers have begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.

[…]

The company first raised the prospect of a security update last week in a phone call with reporters, who asked why the company would allow firmware — the software at the heart of the iPhone — to be modified without requiring a user password.

One senior executive, speaking on the condition of anonymity, replied that it was safe to bet that security would continue to improve. Separately, a person close to the company, who also spoke on the condition of anonymity, confirmed this week that Apple engineers had begun work on a solution even before the San Bernardino attack. A company spokeswoman declined to comment on what she called rumors and speculation.

Independent experts say they have held informal conversations with Apple engineers over the last week about the vulnerability. Exactly how Apple will address the issue is unclear. Security experts who have been studying Apple’s phone security say it is technically possible to fix.

In addition to senior executives talking about upcoming security enhancements, Apple has also added an interesting figure to its payroll:

Frederic Jacobs, for those who don’t know, was one of the developers of the iOS version of Signal, the secure messaging application created by Open Whisper Systems that I highly recommend.

It seems to me that Apple is doing more than marketing here. The company seems dedicated to offering a secure product to its customers. My greatest hope is that this encourages other companies to follow suit.

Back Up Locally

There is no cloud, there are only other people’s computers. This is a phrase you should have tattooed on the inside of your eyelids so you can contemplate it every night. It seems like every company is pushing people to store their data in “the cloud.” Apple has iCloud, Google has its Cloud Platform, Microsoft has Azure, and so on. While backing up to “the cloud” is convenient, it also means your data is sitting on somebody else’s computer. In all likelihood that data was uploaded without end-to-end encryption as well, so it’s readable by the owner of the server.

I have good news though! You don’t have to upload your data to somebody else’s computer! If you use an iPhone it’s actually very easy to make local backups:

If you’re looking for comprehensive privacy, including protection from law enforcement entities, there’s still a loophole here: iCloud. Apple encourages the use of this service on every iPhone, iPad, and iPod Touch that it sells, and when you do use the service, it backs up your device every time you plug it into its power adapter within range of a known Wi-Fi network. iCloud backups are comprehensive in a way that Android backups still aren’t, and if you’ve been following the San Bernardino case closely, you know that Apple’s own legal process guidelines (PDF) say that the company can hand iMessages, SMS/MMS messages, photos, app data, and voicemail over to law enforcement in the form of an iOS device backup (though some reports claim that Apple wants to strengthen the encryption on iCloud backups, removing the company’s ability to hand the data over to law enforcement).

For most users, this will never be a problem, and the convenience of iCloud backups and easy preservation of your data far outweigh any risks. For people who prefer full control over their data, the easiest option is to stop using iCloud and use iTunes instead. This, too, is not news, and in some ways is a regression to the days before iOS 5 when you needed to use a computer to activate, update, and back up your phone at all. But there are multiple benefits to doing local backups, so while the topic is on everyone’s mind we’ll show you how to do it (in case you don’t know) and what you get from it (in case you don’t know everything).

I back up my iPhone locally and you should too. My local backups are encrypted by iTunes and stored on fully encrypted hard drives, a strategy I also encourage you to follow. Besides enhancing privacy by not making my data available to Apple, and to any court orders Apple receives, this setup also prevents my data from being obtained if Apple’s iCloud servers are breached (which has happened).

iPhones aren’t the only devices that can be backed up locally. Most modern operating systems have built-in backup tools that clone data to external hard drives. In my opinion these are far superior to “cloud” backup services. If you back up to fully encrypted hard drives you ensure that your data isn’t easily accessible to unauthorized parties. And you can store some of your encrypted backup drives offsite, say at your parents’ house or your workplace, to ensure everything isn’t lost if your house burns to the ground.
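If you want to roll something similar by hand, the core idea is just “encrypt before the data leaves your machine.” Here’s a minimal sketch in Python; the paths and the single-key Fernet scheme are assumptions for illustration, not a production backup tool:

```python
# Archive a folder and encrypt it with a locally held key before it
# touches the external drive. Paths and key handling are illustrative.
import io
import pathlib
import tarfile

from cryptography.fernet import Fernet

SOURCE = pathlib.Path.home() / "Documents"      # what to back up
DEST = pathlib.Path("/Volumes/BackupDrive")     # external drive (hypothetical path)
KEY_FILE = pathlib.Path.home() / ".backup.key"  # keep this key off the drive

# Generate the key once; reuse it for every later backup.
if not KEY_FILE.exists():
    KEY_FILE.write_bytes(Fernet.generate_key())
fernet = Fernet(KEY_FILE.read_bytes())

# Build a compressed archive in memory, then encrypt the whole thing.
buffer = io.BytesIO()
with tarfile.open(fileobj=buffer, mode="w:gz") as tar:
    tar.add(SOURCE, arcname=SOURCE.name)
(DEST / "documents.tar.gz.enc").write_bytes(fernet.encrypt(buffer.getvalue()))
```

A real tool would stream instead of buffering everything in memory, but the principle stands: the drive only ever sees ciphertext, so losing the drive doesn’t mean losing the data’s confidentiality.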

Don’t rely on other people’s computers.

When Your Employer Hears About The Concept Of Defense In Depth

What happens when your employer first hears about the concept of defense in depth but knows jack shit about firearms? This:

After each employee at Lance Toland Associates gets their license, Toland presents them with a gun known as the judge. He says it is one of the most effective self-defense weapons and all his aviation insurance agencies carry them openly in the office.

“Everybody has one of these in their drawer or on their person. I would not want to come into one of my facilities,” Toland said. “It’s a 5 shot .410, just like a shotgun and you call it hand cannon.”

Having armed employees is a great way to bolster the physical security of your workplace. But the Taurus Judge is not a good weapon to arm employees with. It is ridiculously large, holds only five shots, and takes much longer to reload than a semiautomatic handgun. “But, Chris,” I hear you saying, “it shoots both .410 shotgun shells and .45 Colt!” To that I will point out that better guns are available for both. On top of that, the Judge has a rifled barrel, which causes shot to fly out in a doughnut pattern.

This is one of those stories where I really want to give the employer credit for thinking about the security of his employees but find myself shaking my head because he chose a firearm based on Hollywood specifications (it looks scary) instead of effective specifications (such as a 9mm semiautomatic handgun). Granted, a Judge is better than nothing, but if you’re going to encourage your employees to carry a firearm you should take the extra step of equipping them with something better than merely better than nothing.

It’s Not Just One iPhone The FBI Wants Unlocked

There are people siding with the Federal Bureau of Investigation (FBI) in its current court battle with Apple. These misguided souls claim, amongst other nonsensical things, that the FBI only wants a single iPhone unlocked. They believe that it’s somehow OK for Apple to open Pandora’s box by releasing a signed firmware with a backdoor in it so long as it’s only used to unlock a single iPhone. Unfortunately, as those of us siding with Apple have been pointing out, this case isn’t about a single iPhone. The FBI wants a court precedent so it can coerce Apple into unlocking other iPhones:

In addition to the iPhone used by one of the San Bernardino shooters, the US government is pursuing court orders to force Apple to help bypass the security passcodes of “about a dozen” other iPhones, the Wall Street Journal reports. The other cases don’t involve terror charges, the Journal’s sources say, but prosecutors involved have also sought to use the same 220-year-old law — the All Writs Act of 1789 — to access the phones in question.

By setting a precedent in the San Bernardino case the FBI would have grounds to coerce Apple, and other device manufacturers, into unlocking other devices. We know the FBI already has a dozen or so phones in the pipeline and it will certainly have more in the coming years.

Besides the precedent there is also the problem of the firmware itself. If Apple creates a signed firmware that disables iOS security features and automates brute forcing passcodes, it could be installed on other iPhones (at least other iPhone 5Cs, but possibly other iPhones as well). With this firmware in hand the FBI wouldn’t even need to coerce Apple into helping each time; the agency could simply install the firmware on any compatible device itself. This is why Apple believes creating such a firmware is too dangerous.
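A bit of arithmetic shows why the disabled security features matter so much. Apple’s iOS security documentation puts the passcode key derivation at roughly 80 milliseconds per guess; treat that exact figure as an assumption of this sketch. With the escalating retry delays and the wipe-after-ten-failures option gone, brute forcing becomes a coffee break:

```python
# Worst-case brute force time once iOS's retry delays and
# wipe-after-ten-failures are disabled. The ~80ms per guess figure
# comes from Apple's published key-derivation cost; treat it as an
# assumption of this sketch.
PER_GUESS_SECONDS = 0.08

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case = combinations * PER_GUESS_SECONDS
    print(f"{digits}-digit passcode: {combinations:,} guesses, "
          f"worst case {worst_case / 60:,.0f} minutes")
# 4 digits: ~13 minutes. 6 digits: ~22 hours. Hence the demand that
# passcodes be attempted electronically rather than by hand.
```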

You can never believe the government when it claims to be taking an exceptional measure just once. Those exceptional measures always become standard practice.

The Abysmal State Of Credit Card Security

Credit card fraud is a major problem. This isn’t surprising since until recently, at least here in the United States, credit cards included no security. Hoping to reduce fraud, the credit card companies developed the Europay, Mastercard, and Visa (EMV) standard. Cards that comply with the EMV standard include a chip, which offers some security. But here in the United States two setbacks have prevented EMV from delivering better credit card security. First, the United States is adopting chip and signature, not chip and PIN. Second, most merchants still aren’t equipped to process EMV credit cards:

This week a management consulting company called The Strawhecker Group (TSG) released the results of a study that found that only 37 percent of US retailers were ready to process chip-embedded credit and debit cards. The slow adoption of chip-embedded cards leaves merchants open to accepting liability for fraud perpetrated with traditional, less-secure magnetic stripe cards.

I attribute this low adoption rate to the credit card companies’ failure to set a hard cutoff date for magnetic stripes. Even if you get an EMV card it will contain an insecure magnetic stripe so it can be used at merchants that aren’t set up to process EMV cards. And since all EMV cards are equipped with magnetic stripes, merchants aren’t motivated to get set up to process EMV cards.
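The security difference between the stripe and the chip is easy to model. A magnetic stripe is static data, so anything a skimmer copies can be replayed verbatim; an EMV chip computes a fresh cryptogram over each transaction with a secret key that never leaves the card. The sketch below uses HMAC and made-up field names as a simplified stand-in for the real EMV cryptogram:

```python
# Simplified model of why chip transactions resist replay and
# stripe transactions don't. HMAC and these fields are illustrative
# stand-ins for the real EMV application cryptogram.
import hashlib
import hmac

CARD_KEY = b"card-unique secret provisioned by the issuer"  # never leaves the chip

def chip_cryptogram(pan: str, amount_cents: int, counter: int) -> str:
    """The chip MACs per-transaction data, including a counter."""
    data = f"{pan}|{amount_cents}|{counter}".encode()
    return hmac.new(CARD_KEY, data, hashlib.sha256).hexdigest()

# A stripe is just static data, copied verbatim by any skimmer:
stripe_track = "4111111111111111|CARDHOLDER/NAME|2003"

# The chip's output changes every transaction, so a captured
# cryptogram is worthless for the next purchase:
print(chip_cryptogram("4111111111111111", 2999, counter=41))
print(chip_cryptogram("4111111111111111", 2999, counter=42))
```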

When it comes to security, hard cutoff dates are necessary. Without them users of the old insecure standard see no reason to upgrade. With them users grumble about having to upgrade but begrudgingly do it out of necessity. Credit card companies need to set a date and tell merchants that after that date magnetic stripe transactions will be declined; otherwise we’ll never get over this financial fraud fuckery.

Bill Gates Sides With The FBI

Microsoft has always enjoyed a cozy relationship with the State. This isn’t surprising to anybody who has paid attention to Bill Gates and his ongoing love affair with the State. It’s also not surprising that he is siding with the Federal Bureau of Investigation (FBI) against Apple:

Technology companies should be forced to cooperate with law enforcement in terrorism investigations, Gates said, according to a Financial Times story posted late Monday.

“This is a specific case where the government is asking for access to information. They are not asking for some general thing, they are asking for a particular case,” he said.

This statement by Gates is laughable. The FBI is demanding that Apple create a custom signed version of iOS that strips several security features and includes built-in software to brute force the passcode chosen by the user. That is not a request limited to a particular case; it’s a general tool that can be used on many iPhones.

What is funny about this, though, is that Bill Gates tried to backpedal but in doing so only said the same thing over again:

In an interview with Bloomberg, Bill Gates says he was “disappointed” by reports that he supported the FBI in its legal battle with Apple, saying “that doesn’t state my view on this.”

Still, Gates took a more moderate stance than some of his counterparts in the tech industry, not fully backing either the FBI or Apple but calling for a broader “discussion” on the issues. “I do believe that with the right safeguards, there are cases where the government, on our behalf — like stopping terrorism, which could get worse in the future — that that is valuable.” But he called for “striking [a] balance” between safeguards against government power and security.

Any “balance” would require Apple to create firmware that includes a backdoor for government use. In other words, it would require exactly what the FBI is demanding of Apple.

Cryptography is math and math belongs to that very small category of things that are either black or white. Either the cryptography you’re using is effective and only allows authorized parties to access the unencrypted content or it is ineffective. There is no middle ground. You cannot break cryptography just a little bit.

Although the existence of a version of iOS with a backdoor is frightening in and of itself, the idea that a single judge can enslave software developers by issuing a writ is terrifying. That’s an aspect of this case that is getting glossed over a lot. Apple has already publicly stated it has no desire to write a weakened version of iOS. If the court sides with the FBI it will try to force Apple to write software against its will. Why should any individual have the power to legally do that?

Google Releases RCS Client. It’s Backdoored.

With the recent kerfuffle between Apple and the Federal Bureau of Investigation (FBI), the debate between secure and insecure devices is in the spotlight. Apple has been marketing itself as a company that defends users’ privacy and this recent court battle gives merit to its claims. Other companies, including Google, have expressed support for Apple’s decision to fight the FBI’s demand. That makes this next twist in the story interesting.

Yesterday Christopher Soghoian posted a Tweet that caught my attention.

His Tweet linked to a comment on a Hacker News thread discussing Google’s new Rich Communication Services (RCS) client, Jibe. What’s especially interesting about RCS is that it appears to include a backdoor, as noted in the Hacker News thread:

When using MSRPoTLS, and with the following two objectives allow compliance with legal interception procedures, the TLS authentication shall be based on self-signed certificates and the MSRP encrypted connection shall be terminated in an element of the Service Provider network providing service to that UE. Mutual authentication shall be applied as defined in [RFC4572].

It’s important to note that this doesn’t really change anything compared with the current Short Message Service (SMS) and cellular voice protocols, which offer no real security. By using this standard Google isn’t introducing a new security hole. However, Google also isn’t fixing a known one.

When Apple created iMessage and FaceTime it made use of strong end-to-end encryption (although that doesn’t protect your messages if you back them up to iCloud). Apple’s replacement for SMS and standard cellular calls addressed a known security hole.
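The difference between the two models comes down to where the encryption terminates. Here’s a deliberately tiny sketch, with Fernet keys standing in for protocols that are far more complex in reality; the point is only who holds a key that can read the message:

```python
# Toy comparison: end-to-end encryption vs. carrier-terminated TLS.
# Fernet keys are illustrative stand-ins for the real protocols.
from cryptography.fernet import Fernet

endpoint_key = Fernet(Fernet.generate_key())  # shared only by sender and recipient
carrier_key = Fernet(Fernet.generate_key())   # held by the service provider

message = b"meet at noon"

# End-to-end (the iMessage model): the provider relays ciphertext
# it cannot read.
e2e_wire = endpoint_key.encrypt(message)

# Carrier-terminated (the RCS model quoted above): the provider
# decrypts mid-path, so it can read, log, or hand over plaintext.
rcs_wire = carrier_key.encrypt(message)
print(carrier_key.decrypt(rcs_wire))  # b'meet at noon', visible to the carrier
```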

Were I Google, especially with the security debate going on, I would have avoided embracing RCS since it’s insecure by default. RCS may be an industry standard, since it’s managed by the same association that manages Global System for Mobile Communications (GSM), but it’s a bad standard that shouldn’t see widespread adoption.

Political Campaigns Suck At Protecting Your Personal Information

I don’t need more reasons to abandon politics but I realize others do. To that end I feel that it’s important to point out the abysmal security record of political campaigns:

Over the last three months, more than 100 million US voters have had their data exposed online. These data breaches weren’t caused by a sophisticated hack or malware. Instead, political campaigns’ abysmal cybersecurity practices are to blame. Although modern campaigns constantly acquire and purchase massive amounts of data, they often neglect to fully beef up security surrounding it, effectively turning the campaigns into sitting ducks — huge operations with databases left open and vulnerable.

[…]

That might be unsettling, but perhaps more troubling is the fact that political campaigns are terrible at cybersecurity. Not only do the organizations have access to more information than ever before, they’re not able to keep it safe. The incentives to do so just don’t exist, and that’s why we’re seeing so much compromised voter data.

In Iowa last month, the state’s Republican party failed to adequately protect a database containing information on 2 million voters, making it readily available through just a basic scan of the website’s source code. In December, an independent security researcher uncovered a publicly available database of 191 million voter records. Included in that trove was each voter’s full name, home address, mailing address, unique voter ID, state voter ID, gender, date of birth, phone number, date of registration, political affiliation, and voter history since 2000.

I’ve mentioned these sorts of issues to friends before but they always hide behind the “I give campaigns a fake phone number” excuse. But the phone number you gave to a campaign isn’t what’s getting out; it’s your real personal information, including your home address.

Politics is continuing to become more polarizing in this country. Both parties have become religions where disagreement with the party is tantamount to heresy. True believers are often willing to shun former friends and family members. Some employers are even willing to refuse to hire, or to fire, employees based on their form of political worship. There are no signs that this trend will cease or reverse, so your voting record could become a major problem in the near future.

The amount of personal information many campaigns have on individuals is rather shocking. It’s often enough information for anybody with access to commit identity theft.

There really isn’t anything to gain from political participation and there’s a lot to lose. Control over your personal information is one of the things you could lose. My advice is to avoid politics since it’s obvious campaigns have no interest in protecting you.