You Ought to Trust the Government with the Master Key

The Federal Bureau of Investigation (FBI) director, James Comey, has been waging a war against effective cryptography. Although he can’t beat math, he’s hellbent on trying. To that end, he and his ilk have proposed schemes that would allow the government to break consumer cryptography. One of those schemes is called key escrow, which requires that anything encrypted on a consumer device be decipherable with a master key held by the government. It’s a terrible scheme because any actor who obtains the government’s master key will also be able to decrypt anything encrypted on a consumer device. The government promises that such a key wouldn’t be compromised, but history shows that there are leaks in every organization:

A FBI electronics technician pleaded guilty on Monday to having illegally acted as an agent of China, admitting that he on several occasions passed sensitive information to a Chinese official.

Kun Shan Chun, also known as Joey Chun, was employed by the Federal Bureau of Investigation since 1997. He pleaded guilty in federal court in Manhattan to one count of having illegally acted as an agent of a foreign government.

Chun, who was arrested in March on a set of charges made public only on Monday, admitted in court that from 2011 to 2016 he acted at the direction of a Chinese official, to whom he passed the sensitive information.

If the FBI can’t even keep moles out of its organization, how are we supposed to trust it to guard a master key that would likely be worth billions of dollars? Hell, the government couldn’t even keep information about the most destructive weapons on Earth from leaking to its opponents. Considering its history, especially stories like this one involving government agents acting as paid informants for other governments, there is no way to reasonably believe that a master key to all consumer encryption wouldn’t be leaked to unauthorized parties.

Americans aren’t Ready for Most Things

One of the worst characteristics of American society, which is probably common in most societies, is the popular attitude of resisting change. Many Americans resist automation because they’re afraid that it will take people’s jobs. Many Americans resist genetically modified crops because they think nature actually gives a shit about them and therefore produces pure, healthy foodstuffs. Many Americans resist wireless communications because their ignorance of how radiation works has convinced them that anything wireless causes cancer.

With such a history of resisting advancement I’m not at all surprised to read that most Americans are resistant to human enhancement:

Around 66 and 63 percent of the respondents even said that they don’t want to go through brain and blood enhancements (respectively) themselves. They were more receptive to the idea of genetically modifying infants, though, with 48 percent saying they’re cool with making sure newly born humans won’t ever be afflicted with cancer and other fatal illnesses. Most participants (73 percent) are also worried about biotech enhancers’ potential to exacerbate inequality. Not to mention, there are those who believe using brain implants and blood transfusions to enhance one’s capabilities isn’t morally acceptable.

The concern about exacerbating inequality really made me guffaw. Few pursuits could reduce inequality as much as biotech. Imagine a world where paralysis could be fixed with a quick spinal implant: suddenly people who were unable to walk are on equal footing with those of us who can. Imagine a world where a brain implant could help people with developmental disabilities function as average adults: suddenly people with severe autism can function at the same level as those of us without the disability. Imagine a world where a brain implant can bypass the effects of epilepsy or narcolepsy: suddenly people who cannot drive due to seizures or uncontrollably falling asleep can drive.

Human enhancement can do more to create equality amongst people than anything else. Physical and mental disparities can be reduced or even eliminated. Anybody who can’t see that is a fool. Likewise, any moral system that declares self-improvement immoral is absurd in my opinion. Fortunately, the future doesn’t give two shits about opinion polls and the technology will advance one way or another.

Not All Full-Disk Encryption is Created Equal

For a while I’ve been guarded when recommending Android devices to friends. The only devices I’ve been willing to recommend are those like the Google Nexus line that receive regular security updates in a timely manner. However, after this little fiasco I don’t know if I’m willing to recommend any Android device anymore:

Privacy advocates take note: Android’s full-disk encryption just got dramatically easier to defeat on devices that use chips from semiconductor maker Qualcomm, thanks to new research that reveals several methods to extract crypto keys off of a locked handset. Those methods include publicly available attack code that works against an estimated 37 percent of enterprise users.

A blog post published Thursday revealed that in stark contrast to the iPhone’s iOS, Qualcomm-powered Android devices store the disk encryption keys in software. That leaves the keys vulnerable to a variety of attacks that can pull a key off a device. From there, the key can be loaded onto a server cluster, field-programmable gate array, or supercomputer that has been optimized for super-fast password cracking.

[…]

Beniamini’s research highlights several other previously overlooked disk-encryption weaknesses in Qualcomm-based Android devices. Since the key resides in software, it likely can be extracted using other vulnerabilities that have yet to be made public. Beyond hacks, Beniamini said the design makes it possible for phone manufacturers to assist law enforcement agencies in unlocking an encrypted device. Since the key is available to TrustZone, the hardware makers can simply create and sign a TrustZone image that extracts what are known as the keymaster keys. Those keys can then be flashed to the target device. (Beniamini’s post originally speculated QualComm also had the ability to create and sign such an image, but the Qualcomm spokeswoman disputed this claim and said only manufacturers have this capability.)

Apple designed its full-disk encryption on iOS very well. Each iOS device has a unique key referred to as the device’s UID that is mixed with whatever password you enter. In order to brute force the encryption key you need both the password and the device’s UID, which is difficult to extract. Qualcomm-based devices rely on a less secure scheme.
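To make the contrast concrete, here’s a rough Python sketch of why mixing a hardware-bound UID into the key derivation matters. The function, the iteration count, and the use of PBKDF2 are my own illustrative assumptions, not Apple’s actual implementation:

```python
import hashlib
import os

# Hypothetical per-device secret fused into the hardware at manufacture.
# On a real iPhone the UID key never leaves the crypto engine, so this
# variable stands in for something software can use but never read out.
DEVICE_UID = os.urandom(32)

def derive_disk_key(password: str, uid: bytes) -> bytes:
    """Stretch the user's password and entangle it with the device UID.

    Because the UID is required as the salt, every password guess must
    be made on the device itself; a dump of the encrypted disk alone is
    useless to an off-device cracking rig.
    """
    return hashlib.pbkdf2_hmac("sha256", password.encode(), uid, 100_000)

key = derive_disk_key("hunter2", DEVICE_UID)
```

A Qualcomm-style scheme that keeps the key material reachable from software gives up exactly this property: once the key is extracted, the guessing can move to a GPU cluster or supercomputer.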

But this problem has two parts. The first part is the vulnerability itself. Full-disk encryption isn’t a novel idea. Schemes for properly implementing full-disk encryption have been around for a while now. Qualcomm not following those schemes calls into question the security of any of its devices. Now recommending a device involves ensuring both that the handset manufacturer releases updates in a timely manner and that the device isn’t using a Qualcomm chipset. The second part is the usual Android problem of security patch availability being hit or miss:

But researchers from two-factor authentication service Duo Security told Ars that an estimated 37 percent of all the Android phones that use the Duo app remain susceptible to the attack because they have yet to receive the patches. The lack of updates is the result of restrictions imposed by manufacturers or carriers that prevent end users from installing updates released by Google.

Apple was smart when it refused to allow the carriers to be involved in the firmware of iOS devices. Since Apple controls iOS with an iron fist it also prevents hardware manufacturers from interfering with the availability of iOS updates. Google wanted a more open platform, which is commendable. However, Google failed to maintain any real control over Android, which has left users at the mercy of the handset manufacturers. Google would have been smart to restrict the availability of its proprietary applications to manufacturers whose handsets pull Android updates directly from Google.

John Brennan is an Idiot

You probably read the title of this post and wondered what Brennan did this time to piss me off. Truthfully, he didn’t really piss me off this time. What he did was make a public statement that only an idiot could make.

Everything old is new again. As before, the United States government is busy debating whether or not mandatory backdoors should be included in civilian encryption. Security experts have pointed out that this is a stupid idea. Crypto-anarchists have pointed out that such a law would be meaningless because the Internet has enabled global communications so finding foreign encryption algorithms that don’t include a United States backdoor would be trivial. Hoping to refute the crypto-anarchists, John Brennan made this statement:

Brennan said this was needed to counter the ability of terrorists to coordinate their actions using encrypted communications. The director denied that forcing American companies to backdoor their security systems would cause any commercial problems.

“US companies dominate the international market as far as encryption technologies that are available through these various apps, and I think we will continue to dominate them,” Brennan said.

“So although you are right that there’s the theoretical ability of foreign companies to have those encryption capabilities available to others, I do believe that this country and its private sector are integral to addressing these issues.”

Theoretical ability? Let’s have a short discussion about the Advanced Encryption Standard (AES). AES is one of the most widely used encryption standards today. Most full disk encryption tools, many Transport Layer Security (TLS) connections, and a load of other security tools rely on AES. Hell, many devices even include hardware acceleration for AES because it’s so heavily used. AES originated in a competition held by the National Institute of Standards and Technology (NIST) to find a modern encryption standard. In the end an algorithm called Rijndael won. Rijndael was created by Joan Daemen and Vincent Rijmen. If those two names sound foreign it’s because they are: Joan and Vincent are Belgian. So one of the most common encryption algorithms in use today, an algorithm chosen by an agency of the United States government no less, was created by two foreigners. I’d say foreign encryption tools are a bit beyond theoretical at this point.

Adding insult to injury, let’s discuss Theo de Raadt. Theo, for those who don’t know, is the creator and lead developer of both OpenBSD and OpenSSH. OpenBSD is an operating system known for its focus on security and OpenSSH is probably the most common secure remote connection tool on the planet. Both are developed in Canada:

It’s perhaps easy to forget, but the cryptographic landscape was quite different in 1999. A lot has changed since then. Cryptographic software was available, but not always widespread, in part due to US export controls. International users either had to smuggle it out printed on dead trees, or reimplement everything, or settle for the 40 bit limited edition of their favorite software. Many operating systems originated in the US, so it was difficult to integrate cryptography top to bottom because there needed a way to build the export version without it. OpenBSD had the advantage of originating in Canada, without such concerns. The goto public key algorithm of choice, RSA, was encumbered by a patent for commercial use. The primary symmetric algorithm was still DES. You could use blowfish, of course, but it wasn’t officially blessed as a standard.

Again, the availability of foreign encryption tools is more than theoretical. I would think the director of the Central Intelligence Agency (CIA), which is supposedly tasked with spying on foreign countries, would be very aware of that. But the CIA has a long history of failure so it being unaware of very real encryption tools originating in foreign countries isn’t really that surprising.

The Phones Have Ears


Smartphones are marvelous devices but they also collect a great deal of personal information about us. Data stored locally can be encrypted but data that is uploaded to third-party servers is at the mercy of the service provider’s security practices. If your mobile phone, for example, uploads precise location information to Google’s servers then Google has that information and can be compelled to provide it to law enforcers:

So investigators tried a new trick: they called Google. In an affidavit filed on February 8th, nearly a year after the initial robbery, the FBI requested location data pulled from Graham’s Samsung Galaxy G5. Investigators had already gone to Graham’s wireless carrier, AT&T, but Google’s data was more precise, potentially placing Graham inside the bank at the time the robbery was taking place. “Based on my training and experience and in consultation with other agents,” an investigator wrote, “I believe it is likely that Google can provide me with GPS data, cell site information and Wi-fi access points for Graham’s phone.”

That data is collected as the result of a little-known feature in Google Maps that builds a comprehensive history of where a user has been — information that’s proved valuable to police and advertisers alike. A Verge investigation found affidavits from two different cases from the last four months in which police have obtained court orders for Google’s location data. (Both are embedded below.) Additional orders may have been filed under seal or through less transparent channels.

This problem isn’t unique to location data on Android devices. Both Android and iOS have the ability to back up data to “the cloud” (Google’s and Apple’s servers respectively). While the data is encrypted in transit, it is not stored in an encrypted format on the servers, at least not in a format that prevents Google or Apple from accessing it. As Apple mentioned in the Farook case, had the Federal Bureau of Investigation (FBI) not fucked up by resetting Farook’s iCloud password, it would have been feasible to get the phone to back up to iCloud and then Apple could have provided the FBI with the backed up data. Since the backed up data contains information such as plain text transcripts of text messages, the feature effectively bypasses the security offered by iMessage. Android behaves the same way when it backs up data to Google’s servers. Because of this, users should be wary of online backup solutions if they want to keep their data private.
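The underlying fix for this class of problem is client-side encryption: if the backup were encrypted on the device under a key derived from a secret only the user knows, the provider would have nothing useful to hand over. Here’s a toy Python sketch of the idea; the XOR stream cipher is a stand-in for a real authenticated cipher like AES-GCM, and none of the names reflect Google’s or Apple’s actual backup code:

```python
import hashlib
import os

def derive_backup_key(passphrase: str, salt: bytes) -> bytes:
    # Stretch the user's passphrase into a key the provider never sees.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream keyed by SHA-256 in counter mode. XOR makes the
    # function its own inverse: call it again to decrypt.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = os.urandom(16)
key = derive_backup_key("correct horse battery staple", salt)
uploaded = toy_encrypt(key, b"plain text transcripts of text messages")
# The server stores only `uploaded` and `salt`; a court order for both
# still yields nothing without the passphrase.
```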

As smartphones continue to proliferate and law enforcers realize how much data the average smartphone actually contains we’re going to see more instances of warrants being used to collect user information stored on third party servers.

The Bill Of Rights Won’t Save You

You really need to use full disk encryption on all of your electronic devices. Modern versions of OS X and Linux make it easy. Windows is a bit hit or miss as BitLocker tries its damnedest to share your key with Microsoft’s servers. iOS has included full disk encryption by default — so long as you set a password — since version 8 and Android also includes support for full disk encryption. Use these tools because the Bill of Rights won’t protect your data from government snoops:

The government can prosecute and imprison people for crimes based on evidence obtained from their computers—even evidence retained for years that was outside the scope of an original probable-cause search warrant, a US federal appeals court has said in a 100-page opinion paired with a blistering dissent.

The 2nd US Circuit Court of Appeals ruled that there was no constitutional violation because the authorities acted in good faith when they initially obtained a search warrant, held on to the files for years, and built a case unrelated to the original search.

The case posed a vexing question—how long may the authorities keep somebody’s computer files that were obtained during a search but were not germane to that search? The convicted accountant said that only the computer files pertaining to his client—who was being investigated as part of an Army overbilling scandal—should have been retained by the government during a 2003 search. All of his personal files, which eventually led to his own tax-evasion conviction, should have been purged, he argued.

From my layman’s understanding, the Fourth Amendment is supposed to protect against exactly this kind of government shenanigans: snooping through data that was obtained under a valid warrant, but was unrelated to the case for which the warrant was issued, in order to build another case against you. Mr. Bush supposedly said, “It’s just a goddamned piece of paper!” in regards to the Constitution. While the quote is most likely apocryphal, the statement itself is not wrong.

The Constitution cannot protect you. It is literally a piece of paper with words written on it. If you want some semblance of protection against the State you have to implement it yourself. Encrypting your devices’ storage would guard against this kind of nonsense, assuming you weren’t foolish enough to decrypt the data for the State at any point. This is where features such as the hidden volume support in VeraCrypt (an actively developed fork of TrueCrypt) are nice, because you can have a sanitized encrypted volume that you’re willing to decrypt and a hidden volume holding your sensitive data. Since the hidden volume isn’t detectable, the State’s agents cannot know whether it exists and therefore cannot compel you to decrypt it.
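A toy Python sketch shows why the hidden volume is deniable: the container is filled with random bytes, and ciphertext is itself indistinguishable from random bytes, so an examiner can’t prove anything is hiding in the free space. This is a conceptual illustration only; VeraCrypt’s real format uses XTS-mode block ciphers and a hidden volume header at a fixed offset, not this scheme:

```python
import hashlib
import os

CONTAINER_SIZE = 4096  # toy container; real volumes are gigabytes

def keystream(password: str, length: int) -> bytes:
    # Toy stream cipher (SHA-256 in counter mode) -- illustrative only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def make_container(hidden_password: str, hidden_data: bytes) -> bytes:
    # Fill the container with random bytes, then overwrite the tail with
    # the encrypted hidden data. The ciphertext looks exactly like the
    # surrounding randomness, so its presence can't be demonstrated.
    container = bytearray(os.urandom(CONTAINER_SIZE))
    stream = keystream(hidden_password, len(hidden_data))
    ciphertext = bytes(a ^ b for a, b in zip(hidden_data, stream))
    container[-len(ciphertext):] = ciphertext
    return bytes(container)

def read_hidden(container: bytes, password: str, length: int) -> bytes:
    # Decrypting with the wrong password just returns more noise --
    # a failed decryption is indistinguishable from empty free space.
    region = container[-length:]
    return bytes(a ^ b for a, b in zip(region, keystream(password, length)))
```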

Utilize the tools available to you to protect yourself. Anybody who has been paying attention to recent American history knows that the supposed legal protections we all enjoy are little more than fiction at this point.

Be Careful When Taking Your Computer In For Servicing

How many of you have taken your computer in to be repaired? How many of you erased all of your data before taking it in? I’m often amazed by the number of people who take their computer in for servicing without either replacing or wiping the hard drive. Whenever I take any electronic device in for servicing I wipe all of the data off of it and install only a fresh operating system with a default user account that the repairer can use to log in. When I get the device back I wipe it again and then restore my data from a backup.

Why am I so paranoid? Because you never know who might be a paid Federal Bureau of Investigation (FBI) snitch:

The doctor’s attorney says the FBI essentially used the employee to perform warrantless searches on electronics that passed through the massive maintenance facility outside Louisville, Ky., where technicians known as Geek Squad agents work on devices from across the country.

Since 2009, “the FBI was dealing with a paid agent inside the Geek Squad who was used for the specific purpose of searching clients’ computers for child pornography and other contraband or evidence of crimes,” defense attorney James Riddet claimed in a court filing last month.

Riddet represents Dr. Mark Albert Rettenmaier, a gynecological oncologist who practiced at Hoag Hospital until his indictment in November 2014 on two felony counts of possession of child pornography. Rettenmaier, who is free on bond, has taken a leave from seeing patients, Riddet said.

Because the case in this story involved child pornography I’m sure somebody will accuse me of trying to protect people who possess child pornography. But data is data when it comes to security. The methods you can use to protect your confidential communications, adult pornography, medical information, financial records, and any other data can also be used to protect illicit, dangerous, and downright distasteful data. Never let somebody make you feel guilty for helping good people protect themselves because the information you’re providing them can also be used by bad people.

Due to the number of laws on the books, the average working professional commits three felonies a day. In all likelihood some data on your device could be used to charge you with a crime. Since the FBI is using computer technicians as paid informants you should practice some healthy paranoia when handing your devices over to them. The technician who works on your computer could also have a side job of feeding the FBI evidence of crimes.

But those aren’t the only threats you have to worry about when taking your electronic devices in for servicing. I mentioned that I also wipe the device when I get it back from the service center. This is because the technician who worked on my device may have also installed malware on the system:

Harwell had been a Macintosh specialist with a Los Angeles-area home computer repair company called Rezitech. That’s how he allegedly had the opportunity to install the spy software, called Camcapture, on computers.

While working on repair assignments, the 20-year-old technician secretly set up a complex system that could notify him whenever it was ready to snap a shot using the computer’s webcam, according to Sergeant Andrew Goodrich, a spokesman with the Fullerton Police Department in California. “It would let his server know that the victim’s machine was on. The server would then notify his smartphone… and then the images were recorded on his home computer,” he said.

When your device is in the hands of an unknown third party there is no telling what they may do with it. But if the data isn’t there then they can’t snoop through it and if you wipe the device when you get it back any installed malware will be wiped as well.
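For the wipe itself, overwriting before deletion is the basic idea. The sketch below is my own minimal illustration for a single file, not a forensic-grade tool; note that on SSDs, wear leveling can strand old copies of data in cells the operating system can no longer address, so encrypting the disk and discarding the key is the more reliable approach there:

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file with random bytes, flush to disk, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite past the OS cache
    os.remove(path)
```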

Be careful when you’re handing your device over to a service center. Make sure the device has been wiped before it goes in and gets wiped when it comes back.

If It Isn’t Broken, Don’t Fix It

When it comes to effective technology the federal government has a dismal record. Recently, news organizations have been flipping out over a report noting that the federal government still uses 8″ floppy disks in its nuclear weapons program:

The U.S. Defense Department is still using — after several decades — 8-inch floppy disks in a computer system that coordinates the operational functions of the nation’s nuclear forces, a jaw-dropping new report reveals.

The Defense Department’s 1970s-era IBM Series/1 Computer and long-outdated floppy disks handle functions related to intercontinental ballistic missiles, nuclear bombers and tanker support aircraft, according to the new Government Accountability Office report.

The department’s outdated “Strategic Automated Command and Control System” is one of the 10 oldest information technology investments or systems detailed in the sobering GAO report, which calls for a number of federal agencies “to address aging legacy systems.”

I’m not sure why that report is “jaw-dropping.” There is wisdom in updating systems incrementally as key components become obsolete. There is also wisdom in not fixing something that isn’t broken.

This reminds me of the number of businesses and banks that still rely on software written in COBOL. A lot of people find it odd that these organizations haven’t upgraded their systems to the latest and greatest. But replacing a working system that has been debugged and fine-tuned for decades is an expensive prospect. All of the work that was done over those decades is effectively thrown out. Whatever new system is developed to replace the old one will have to go through its own painful period of fine-tuning and debugging. Considering that, and considering that the current systems still fulfill their purposes, why would an organization sink a ton of money into replacing them?

The nuclear program strikes me as the same situation. While 8″ floppy disks and IBM Series/1 computers are ancient, they seem to be fulfilling their purpose. More importantly, those systems have gone through decades of fine-tuning and debugging, which means they’re probably more reliable than any replacement would be (and reliability is pretty important when you’re talking about weapons that can wipe out entire cities).

Sometimes old isn’t automatically bad, even when you’re talking about technology.

The FBI Cares More About Maintaining Browser Exploits Than Fighting Child Pornography

Creating and distributing child pornography are two things that most people seem to agree should be ruthlessly pursued by law enforcers. Law enforcers, on the other hand, don’t agree. The Federal Bureau of Investigation (FBI) would rather toss out a child pornography case than reveal one stupid browser exploit:

A judge has thrown out evidence obtained by the FBI via hacking, after the agency refused to provide the full code it used in the hack.

The decision is a symptom of the FBI using investigative techniques that are usually reserved for intelligence agencies, such as the NSA. When those same techniques are used in criminal cases, they have to stack up against the rights of defendants and are subject to court processes.

The evidence that was thrown out includes child pornography allegedly found on devices belonging to Jay Michaud, a Vancouver public schools worker.

Why did the FBI even bring the case against Michaud if it wasn’t willing to reveal the exploit that the defense was guaranteed to demand technical information about?

This isn’t the first case the FBI has allowed to be thrown out due to its desperate desire to keep an exploit secret. In allowing these cases to be thrown out, the FBI has told the country that it isn’t serious about pursuing these crimes and that it would rather leave all of us at the mercy of malicious hackers than reveal the exploits that it, and almost certainly those hackers, rely on.

I guess the only crimes the FBI actually cares to fight are the ones it creates.

Airport Security Isn’t The Only Security The TSA Sucks At

The Transportation Security Administration (TSA) sucks at providing airport security. But the agency isn’t a one trick pony. Demonstrating its commitment to excellence — at sucking — the TSA is working hard to make its computer security just as good as its airport security:

The report centers on the way TSA (mis)handles security around the data management system which connects airport screening equipment to centralized servers. It’s called the Security Technology Integrated Program (STIP), and TSA has been screwing it up security-wise since at least 2012.

In essence, TSA employees haven’t been implementing STIP properly — that is, when they’ve been implementing it at all.

STIP manages data from devices we see while going through security lines at airports, namely explosive detection systems, x-ray and imaging machines, and credential authentication.

[…]

In addition to unpatched software and a lack of physical security that allowed non-TSA airport employees access to IT systems, the auditors found overheated server rooms and computers using unsupported systems — and much more.

The observed “lack of an established disaster recovery capability” noted by the OIG is particularly scary. If a data center was taken out by natural disaster, passenger screening and baggage info would be rendered inaccessible.

Not only that, but there was no security incident report process in place, and there was “little employee oversight in maintaining IT systems.” And, auditors were not pleased at all that non-TSA IT contractors maintained full admin control over STIP servers at airports.

At what point do we write the TSA off as a failed experiment? I know, it’s a government agency, it’ll never go away. But the fact that the TSA continues to fail at everything and is allowed to continue existing really demonstrates why the market is superior to the State. Were the TSA forced to compete in a market environment it would have been bankrupted and its assets would have been sold to entrepreneurs who might be able to put them to use.

It’s time to ask the million dollar question: what will happen now? One of the reasons government agencies fail to improve their practices is that there’s no motivation to do so. A government agency can’t go bankrupt, and failures very rarely lead to disciplinary action. In the few cases where disciplinary action does happen, it’s usually something trivial, such as asking the current head of the agency to retire with full benefits.

Meanwhile air travelers will still be required to submit to the TSA, which not only means going through security theater but now potentially means having their personal information, such as images from the slave scanners, leaked to unauthorized parties.