The FBI’s Performance Issues

When the Federal Bureau of Investigation (FBI) isn’t pursuing terrorists that it created, the agency tends to have a pretty abysmal record. The agency recently announced, most likely as propaganda against effective encryption, that it has failed to obtain the contents of 7,000 encrypted devices:

Agents at the US Federal Bureau of Investigation (FBI) have been unable to extract data from nearly 7,000 mobile devices they have tried to access, the agency’s director has said.

Christopher Wray said encryption on devices was “a huge, huge problem” for FBI investigations.

The agency had failed to access more than half of the devices it targeted in an 11-month period, he said.

The lesson to be learned here is that effective cryptography works. Thanks to effective cryptography, people are able to guarantee their supposed constitutional right to privacy. The restoration of rights should be celebrated, but politicians never celebrate it because our rights are directly opposed to their goals. I guarantee that this announcement will lead to more political debates in Congress and to more bills being introduced to ban the plebs (but not the government, of course) from having effective cryptography. If one of those bills is passed into law, the plebs will have to personally patch their devices to fix the broken cryptography mandated by law (which, contrary to what politicians might believe, is exactly what many of us plebs will do).

If you don’t want government goons violating your privacy, enable the cryptographic features on your devices, such as full disk encryption.

The Future is Bright

A writer at The Guardian, which seems to be primarily known for propagating left-wing statist propaganda, has shown a slight glimmer of understanding. While neoconservatives and neoliberals fight for power over other people, crypto-anarchists have been busy working in the shadows to develop technology that allows individuals to defend themselves from the State:

The rise of crypto-anarchism might be good news for individual users – and there are plenty working on ways of using this technology for decent social purposes – but it’s also bad news for governments. It’s not a direct path, but digital technology tends to empower the individual at the expense of the state. Police forces complain they can’t keep up with new forms of online crime, partly because of the spread of freely available encryption tools. Information of all types – secrets, copyright, creative content, illegal images – is becoming increasingly difficult to contain and control. The rash of ransomware is certainly going to get worse, exposing the fragility of our always connected systems. (It’s easily available to buy on the dark net, a network of hidden websites that are difficult to censor and accessed with an anonymous web browser.) Who knows where this might end. A representative from something called “Bitnation” explained to Parallel Polis how an entire nation could one day be provided online via an uncontrollable, uncensorable digital network, where groups of citizens could club together to privately commission public services. Bitnation’s founder, Susanne Tarkowski Tempelhof, hopes Bitnation could one day replace the nation state and rid us of bureaucrats, creating “a world of a million competing digital nations”, as she later told me.

The biggest threat to statism is individual empowerment. While technology is a double-edged sword, serving both the State and individuals without concern for either’s morality, it is difficult to argue that it hasn’t greatly helped empower individuals.

A combination of Tor hidden services and cryptocurrencies has done a great deal to weaken the State’s drug war by establishing black markets where both buyers and sellers remain anonymous. Weakening the drug war is a significant blow to the State because it deprives it of slave labor (prisoners) and wealth (since the State can’t use civil forfeiture on property it can’t identify).

Tor, Virtual Private Networks (VPN), Hypertext Transfer Protocol Secure (HTTPS), Signal, and many other practical implementations of encryption have marvelously disrupted the State’s surveillance apparatus. This also cuts into the State’s revenue since it cannot issue fines, taxes, or other charges on activities it is unaware of.

3D printers, although still in their infancy, are poised to weaken the State’s ability to restrict objects. For example, the State can’t prohibit the possession of firearms if people are able to print them without the State’s knowledge.

But if the State disables the Internet, all of these technologies fall apart, right? That would be the case if the Internet were a centralized thing that the State could disable. But the Internet is simply the largest network of interconnected networks. Even if the State shut down every Internet Service Provider (ISP) in the world and cut all of the undersea cables, the separated networks would merely have to be reconnected. That is where a technology like mesh networking could come into play. Guifi.net, for example, is a massive mesh network that spans Catalonia. According to the website, there are currently 33,191 operating nodes in the Guifi.net mesh. Shutting down that many nodes isn’t feasible, especially when they can be quickly replaced, since individual nodes are usually cheap off-the-shelf Wi-Fi access points. Without the centralized Internet, a web of interconnected mesh networks could reestablish global communications and there isn’t much the State could do about it.

Statism has waxed and waned throughout human history. I believe we’re at a tipping point where statism is beginning to wane and I believe advances in individual empowering technologies are what’s diminishing it. Voting won’t hinder the State. The Libertarian Party won’t hinder the State. Crypto-anarchists, on the other hand, have a proven track record of hindering the State and all signs point to them continuing to do so.

Keybase Client

Keybase.io started off as a service people could use to prove their identity using Pretty Good Privacy (PGP). I use it to prove that I own various public accounts online as well as this domain. Back in February the Keybase team announced a chat client. I hadn’t gotten around to playing with it until very recently but I’ve been impressed enough by it that I feel the need to post about it.

Keybase’s chat service has a lot of similarities to Signal. Both services provide end-to-end encrypted communications, although in slightly different ways (Keybase, for example, doesn’t utilize forward secrecy except on “self-destructing” messages). However, one issue with Signal is that it relies on your phone number. If you want to chat on Signal with somebody, you have to give them your phone number and they have to give you theirs. This reliance on phone numbers makes Signal undesirable in many cases (such as communicating with people you know online but not offline).

Keybase relies on your proven online identities. If you want to securely talk to me using Keybase you can search for me by using the URL for this website since I’ve proven my ownership of it on Keybase. Likewise, if you want to securely talk to somebody on Reddit or Github you can search for their user names on those sites in Keybase.

Another nice feature Keybase offers is a way to securely share files. Each user of the Keybase client gets 10GB of storage for free. Any data added to your private folder is encrypted in such a way that only you can access the files. If you want to share files amongst a few friends the files can be encrypted in a way that only you and those designated friends can access them.

On the other hand, if you’re into voice and video calls, you’re out of luck. Keybase, unlike Signal, currently supports neither, and I have no idea if there are plans to implement them in the future. It’s also important to note that Keybase, due to how new it is, hasn’t undergone the same level of rigorous testing as Signal, so you probably don’t want to put the same level of trust in it yet.

Private Solutions to Government Created Problems

Earlier this week the United States Congress decided to repeal privacy protection laws that it had previously put into place on Internet Service Providers (ISP). While a lot of people have been wasting their time begging their representatives (read: masters) with phone calls, e-mails, and petitions, private companies have begun announcing methods to actually protect their users’ privacy. In the latest example of this, Pornhub announced that it will turn on HTTPS across its entire site:

On April 4, both Pornhub and its sister site, YouPorn, will turn on HTTPS by default across the entirety of both sites. By doing so, they’ll make not just adult online entertainment more secure, but a sizable chunk of the internet itself.

The Pornhub announcement comes at an auspicious time. Congress this week affirmed the power of cable providers to sell user data, while as of a few weeks ago more than half the web had officially embraced HTTPS. Encryption doesn’t solve your ISP woes altogether—they’ll still know that you were on Pornhub—but it does make it much harder to know what exactly you’re looking at on there.

As the article points out, your ISP will still be able to tell that you accessed Pornhub, since Domain Name System (DNS) lookups are generally not secured, but it won’t be able to see what content you’re accessing. As for DNS lookups, solutions to secure them are already being worked on; DNSCrypt, for example, provides encrypted DNS lookups and is available today.
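
To see why unsecured DNS gives your ISP this visibility, it helps to look at what a standard DNS query is on the wire. The sketch below builds a minimal query packet in the standard RFC 1035 wire format (using pornhub.com purely to match the article); every label of the hostname sits in the packet as plain, readable bytes:

```python
import struct

def build_dns_query(hostname: str, qid: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query in RFC 1035 wire format."""
    # Header: id, flags (recursion desired), 1 question, 0 answer/authority/additional.
    header = struct.pack(">HHHHHH", qid, 0x0100, 1, 0, 0, 0)
    # QNAME: each label prefixed with its length, terminated by a zero byte.
    qname = b"".join(bytes([len(label)]) + label.encode() for label in hostname.split("."))
    qname += b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

packet = build_dns_query("pornhub.com")
# The hostname is cleartext in the query -- anyone on the path can read it:
print(b"pornhub" in packet)  # True
```

Nothing in this packet is encrypted, which is exactly the gap DNSCrypt (and similar efforts) set out to close.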

If you want to protect your privacy you can’t rely on the State’s regulations. First, the State is the worst offender when it comes to surveillance and the consequences of its surveillance are far worse. Sure, your ISP might sell some of your data but the State will send men with guns to your home to kidnap you and probably shoot your dog. Second, as this situation perfectly illustrates, government regulations are temporary. The government implemented the privacy regulations and then took them away. It may restore them again in the future but there’s no guarantee it won’t repeal them again. Any government solution is temporary at best.

Cryptography offers a permanent solution that can protect Internet users from both their snoopy ISP and government. HTTPS and DNSCrypt will continue to work regardless of the state of privacy regulations.

Secure E-Mail is an Impossibility

A while back I wrote a handful of introductory guides on using Pretty Good Privacy (PGP) to encrypt the contents of your e-mails. They were well-intentioned guides. After all, everybody uses e-mail, so we might as well secure it as much as possible, right? What I didn’t stop to consider was the fact that PGP is a dead-end technology for securing e-mail, not because the initial learning curve is steep but because the implementation itself is flawed.

I recently came across a blog post by Filippo Valsorda that sums up the biggest issue with PGP:

But the real issues I realized are more subtle. I never felt confident in the security of my long term keys. The more time passed, the more I would feel uneasy about any specific key. Yubikeys would get exposed to hotel rooms. Offline keys would sit in a far away drawer or safe. Vulnerabilities would be announced. USB devices would get plugged in.

A long term key is as secure as the minimum common denominator of your security practices over its lifetime. It’s the weak link.

Worse, long term keys patterns like collecting signatures and printing fingerprints on business cards discourage practices that would otherwise be obvious hygiene: rotating keys often, having different keys for different devices, compartmentalization. It actually encourages expanding the attack surface by making backups of the key.

PGP, in fact the entire web of trust model, assumes that your private key will be more or less permanent. This assumption leads to a lot of implementation issues. What happens if you lose your private key? If you have an effective backup system you may laugh at this concern, but lost private keys are the most common issue I’ve seen PGP users run into. When you lose your key you have to generate a new one and distribute it to everybody you communicate with. In addition to that, you also have to re-sign people’s existing keys. But worst of all, without your private key you can’t even revoke the corresponding published public key.

Another issue is that you cannot control the security practices of other PGP users. What happens when somebody who signed your key has their private key compromised? Their signature, which is used by others to decide whether or not to trust you, becomes meaningless because their private key is no longer confidential. Do you trust the security practices of your friends enough to make your own security practices reliant on them? I sure don’t.

PGP was a jury-rigged solution to provide some security for e-mail. Because of that it has many limitations. For starters, while PGP can be used to encrypt the contents of a message, it cannot encrypt the e-mail headers or the subject line. That means anybody snooping on the e-mail knows who the communicating parties are, what the subject is, and any other information stored in the headers. As we’ve learned from Edward Snowden’s leaks, metadata is very valuable. E-mail was never designed to be a secure means of communication and can never be made secure. The only viable solution for secure communications is to find an alternative to e-mail.
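
To make the metadata problem concrete, here’s a short sketch using Python’s standard email library (the addresses and subject are made up for illustration). Even when the body is PGP ciphertext, the headers travel in cleartext:

```python
from email.message import EmailMessage

# A hypothetical message whose body has already been PGP-encrypted.
msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Meeting location"
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n"
    "...encryption only covers this part...\n"
    "-----END PGP MESSAGE-----\n"
)

wire = msg.as_string()  # what actually crosses the network
print("alice@example.com" in wire)  # True: the sender is visible
print("Meeting location" in wire)   # True: the subject is visible
```

Anyone who can read the message in transit or at rest learns who talked to whom, when, and about what subject, regardless of how strong the encryption on the body is.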

With that said, PGP itself isn’t a bad technology. It’s still useful for signing binary packages, encrypting files for transferring between parties, and other similar tasks. But for e-mail it’s at best a bandage to a bigger problem and at worst a false sense of security.

A Beginner’s Guide to Privacy and Security

I’m always on the lookout for good privacy and security guides for beginners. Ars Technica posted an excellent one yesterday. It covers the basics, such as installing operating system and browser updates, enabling two-factor authentication, and using a password manager so you can have strong, unique passwords for your accounts, all of which even less computer-savvy users can follow to improve their security.
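
The “strong and unique” part is the whole point of a password manager, and the generation step itself is simple. As a sketch (my own illustration, not code from the Ars guide), Python’s standard secrets module can do it:

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # prints a random 24-character password
```

The hard part a password manager actually solves is remembering a different one of these for every account, which is why the guide recommends one.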

If you’re not sure where to begin when it comes to security and privacy take a look at Ars’ guide.

The Signal Desktop App Now Works with iOS

The developers behind Signal, an application that allows you to send secure text messages and make secure phone calls, released a Chrome app some time ago. The Chrome app allowed you to link your Android device with it so you could use Signal on a desktop or laptop computer. iOS users were left out in the cold, which annoyed me because I spend more time on my laptop than on my phone (also because I hate typing on my phone). Fortunately, Signal for iOS now supports linking with the Chrome app.

It’s simple to set up and works well. If you, like me, don’t use Chrome as your primary browser and don’t want to open it just to use Signal, you can right-click on the Signal app in Chrome and create a shortcut. On macOS the shortcut will be created in your ~/Applications/Chrome Apps/ folder (I have no idea where it puts it on Windows or Linux). Once created, you can drag the Signal shortcut to the dock.

The Bill Of Rights Won’t Save You

You really need to use full disk encryption on all of your electronic devices. Modern versions of OS X and Linux make it easy. Windows is a bit hit or miss as BitLocker tries its damnedest to share your key with Microsoft’s servers. iOS has included full disk encryption by default — so long as you set a password — since version 8 and Android also includes support for full disk encryption. Use these tools because the Bill of Rights won’t protect your data from government snoops:

The government can prosecute and imprison people for crimes based on evidence obtained from their computers—even evidence retained for years that was outside the scope of an original probable-cause search warrant, a US federal appeals court has said in a 100-page opinion paired with a blistering dissent.

The 2nd US Circuit Court of Appeals ruled that there was no constitutional violation because the authorities acted in good faith when they initially obtained a search warrant, held on to the files for years, and built a case unrelated to the original search.

The case posed a vexing question—how long may the authorities keep somebody’s computer files that were obtained during a search but were not germane to that search? The convicted accountant said that only the computer files pertaining to his client—who was being investigated as part of an Army overbilling scandal—should have been retained by the government during a 2003 search. All of his personal files, which eventually led to his own tax-evasion conviction, should have been purged, he argued.

From my layman’s understanding, the Fourth Amendment is supposed to protect against exactly this kind of government shenanigan: snooping through data that was obtained under a valid warrant but was unrelated to the case the warrant was issued for, in order to build another case against you. Mr. Bush supposedly said, “It’s just a goddamned piece of paper!” in regards to the Constitution. While the quote is probably apocryphal, the statement is not.

The Constitution cannot protect you. It is literally a piece of paper with words written on it. If you want some semblance of protection against the State you have to implement it yourself. Encrypting your devices’ storage would guard against this kind of nonsense, assuming you weren’t foolish enough to decrypt the data for the State at any point. This is where VeraCrypt (an actively developed fork of TrueCrypt) is nice, because its hidden volume feature lets you have a sanitized encrypted volume that you can decrypt for the State and a hidden volume holding your sensitive data. Since the hidden volume isn’t detectable, the State’s agents cannot know whether or not it exists and therefore cannot compel you to decrypt it.
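
The hidden volume concept is simple enough to sketch in code. To be clear, the toy below is not how VeraCrypt actually works (VeraCrypt uses XTS-mode AES and encrypted volume headers); it only illustrates the principle: a container’s free space is random noise, and a volume encrypted under a second password is indistinguishable from that noise unless you know the password:

```python
import hashlib
import os

SIZE = 4096  # toy container size in bytes

def keystream(password: bytes, salt: bytes, n: int) -> bytes:
    """Toy keystream (PBKDF2 key + SHA-256 in counter mode). NOT real crypto."""
    key = hashlib.pbkdf2_hmac("sha256", password, salt, 10_000)
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

salt = os.urandom(16)
container = bytearray(os.urandom(SIZE))  # unused space is already random noise

decoy = b"grocery lists and cat pictures"  # the sanitized volume
secret = b"the actually sensitive files"   # the hidden volume

# The decoy volume sits at a known offset (the start of the container).
container[0:len(decoy)] = xor(decoy, keystream(b"decoy-pass", salt, len(decoy)))

# The hidden volume sits at an offset derived from its own password. Without
# that password, its ciphertext cannot be told apart from free space.
off = 2048 + int.from_bytes(hashlib.sha256(b"real-pass" + salt).digest(), "big") % 1024
container[off:off + len(secret)] = xor(secret, keystream(b"real-pass", salt, len(secret)))

# Hand over only the decoy password and only the sanitized volume appears:
print(xor(container[0:len(decoy)], keystream(b"decoy-pass", salt, len(decoy))))
# b'grocery lists and cat pictures'
```

Handing over the decoy password reveals a plausible, sanitized volume while the hidden region remains statistically indistinguishable from unused space, which is what makes compelled decryption ineffective.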

Utilize the tools available to you to protect yourself. Anybody who has been paying attention to recent American history knows that the supposed legal protections we all enjoy are little more than fiction at this point.

An Encrypted Society Is A Polite Society

Playing off of my post from earlier today, I feel that it’s time to update Heinlein’s famous phrase. Not only is an armed society a polite society but an encrypted society is a polite society.

This article in Vice discusses the importance of encryption to the Lesbian, Gay, Bisexual, and Transgender (LGBT) communities but it’s equally applicable to any oppressed segment of a society:

Despite advances over the last few decades, LGBTQ people, particularly transgender folks and people of color, face alarming rates of targeted violence, housing and job discrimination, school and workplace bullying, and mistreatment by law enforcement. In the majority of US states, for example, you can still be legally fired just for being gay.

So while anyone would be terrified about the thought of their phone in the hands of an abusive authority figure or a jealous ex-lover, the potential consequences of a data breach for many LGBTQ people could be far more severe.

[…]

LGBTQ people around the world depend on encryption every day to stay alive and to protect themselves from violence and discrimination, relying on the basic security features of their phones to prevent online bullies, stalkers, and others from prying into their personal lives and using their sexuality or gender identity against them.

In areas where being openly queer is dangerous, queer and trans people would be forced into near complete isolation without the ability to connect safely through apps, online forums, and other venues that are only kept safe and private by encryption technology.

These situations are not just theoretical. Terrifying real life examples abound, like the teacher who was targeted for being gay, and later fired, after his Dropbox account was hacked and a sex video was posted on his school’s website. Or the time a Russian gay dating app was breached, likely by the government, and tens of thousands of users received a message threatening them with arrest under the country’s anti-gay “propaganda” laws.

Systematic oppression requires information. In order to oppress a segment of the population an oppressor must be able to identify members of that segment. A good, albeit terrifying, example of this fact is Nazi Germany. The Nazis actually made heavy use of IBM counting machines to identify and track individuals it declared undesirable.

Today pervasive surveillance is used by state and non-state oppressors to identify those they wish to oppress. Pervasive surveillance is made possible by the lack of effective encryption. Encryption allows individuals to maintain the integrity and confidentiality of information and can be used to anonymize information as well.

For example, without encryption it’s trivial for the State to identify transgender individuals. A simple unencrypted text message, e-mail, or Facebook message containing information that identifies an individual as transgender can either be read by an automated surveillance system or acquired through a court order. Once an individual is identified, agents can be tasked with keeping tabs on them, waiting for them to perform an act that justifies law enforcement involvement. Say, for example, violating North Carolina’s idiotic bathroom law. After the violation occurs, law enforcement agents can be sent in to kidnap the individual so they can be made an example of, which serves to send a message of terror to other transgender individuals.

When data is properly encrypted the effectiveness of surveillance is greatly diminished. That prevents oppressors from identifying targets, which prevents the oppressors from initiating interactions entirely. Manners are good when one may have to back up his acts with his life. Manners are better when one doesn’t have to enter into conflict in the first place.

Apple Gives The Feds Another Middle Finger


Me right now.

A lot of people are claiming Apple’s decision to fight the Federal Bureau of Investigation (FBI) is nothing more than a marketing play. But I swear that I can hear Tim Cook yelling, “Fuck the police!” because his company keeps making announcements that it’s going to make its products more secure:

WASHINGTON — Apple engineers have begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.

[…]

The company first raised the prospect of a security update last week in a phone call with reporters, who asked why the company would allow firmware — the software at the heart of the iPhone — to be modified without requiring a user password.

One senior executive, speaking on the condition of anonymity, replied that it was safe to bet that security would continue to improve. Separately, a person close to the company, who also spoke on the condition of anonymity, confirmed this week that Apple engineers had begun work on a solution even before the San Bernardino attack. A company spokeswoman declined to comment on what she called rumors and speculation.

Independent experts say they have held informal conversations with Apple engineers over the last week about the vulnerability. Exactly how Apple will address the issue is unclear. Security experts who have been studying Apple’s phone security say it is technically possible to fix.

In addition to senior executives talking about upcoming security enhancements, Apple has also added an interesting figure to its payroll:

Frederic Jacobs, for those who don’t know, was one of the developers of the iOS version of Signal, the secure messaging application created by Open Whisper Systems that I highly recommend.

It seems to me that Apple is doing more than marketing here. The company seems dedicated to offering a secure product to its customers. My greatest hope is that this encourages other companies to follow suit.