The Founding Fathers Did Use Encryption

One of the arguments that has been made for prohibiting strong encryption is that the Founding Fathers couldn’t have envisioned a world where law enforcers were unable to read communications. Why the Founding Fathers needed to be clairvoyant to justify something today is beyond me, but the Electronic Frontier Foundation (EFF) had a great rebuttal to the argument. If you head over to the Library of Congress’s website you can read about how James Madison encrypted his messages to prevent law enforcers from reading them:

As a Virginia delegate to the Continental Congress, while secretary of state, and in his personal correspondence with Thomas Jefferson, James Madison feared constantly that unauthorized people would seek to read his private and public correspondence. To deter such intrusions, he resorted to a variety of codes and ciphers.

Most of the early ciphers that Madison used were keyword polyalphabetic code systems involving a complex interaction of a keyword with alphabets and numbers in a preestablished pattern. The codes were designed by James Lovell, a Massachusetts delegate to the Continental Congress and an expert on ciphers. On July 5, 1782, Edmund Randolph wrote to James Madison: “I wish, that on future occasions of speaking of individuals we may use the cypher, which we were taught by Mr. Lovell. Let the keyword be the name of the negro boy, who used to wait on our common friend.” Madison noted at the bottom of Randolph’s letter, “Probably CUPID.” He added, “I have been in some pain from the danger incident to the cypher we now use. The enemy I am told have in some instances published their intercepted cyphers.”
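For the curious, a keyword polyalphabetic cipher of the sort Lovell taught is easy to sketch in modern code. Below is a minimal Vigenère-style illustration in Python using Randolph’s keyword CUPID; the historical system interacted with numbers and multiple alphabets in more elaborate ways, so treat this as a toy model rather than a reconstruction.

```python
# Toy keyword polyalphabetic (Vigenere-style) cipher. The keyword CUPID comes
# from Randolph's letter; Lovell's actual system was more elaborate.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encrypt(plaintext, keyword="CUPID"):
    ciphertext = []
    i = 0
    for ch in plaintext.upper():
        if ch not in ALPHABET:
            continue  # period ciphers typically dropped spaces and punctuation
        shift = ALPHABET.index(keyword[i % len(keyword)])
        ciphertext.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        i += 1
    return "".join(ciphertext)

def decrypt(ciphertext, keyword="CUPID"):
    plaintext = []
    for i, ch in enumerate(ciphertext):
        shift = ALPHABET.index(keyword[i % len(keyword)])
        plaintext.append(ALPHABET[(ALPHABET.index(ch) - shift) % 26])
    return "".join(plaintext)

print(encrypt("MEET AT THE CONGRESS"))   # OYTBDVNWMFQHVZHUM
print(decrypt("OYTBDVNWMFQHVZHUM"))      # MEETATTHECONGRESS
```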

What’s interesting here is that Madison not only encrypted his messages when he was in the Continental Congress but also after he became secretary of state and in his personal correspondence. He wasn’t just hiding his communications from British law enforcers; he continued to hide them even after they had been replaced by United States law enforcers. That only makes sense: if you encrypt only important messages, the mere fact that you used encryption tells spies the message is important and worth the resources to decrypt.

Arguing that the Founding Fathers couldn’t have predicted pervasive encryption is idiotic because they themselves used it. There’s also no evidence that they provided either British or United States law enforcers with any keys to allow them to rapidly decrypt the communications if needed.

CryptoPartyMN Website is Up Again

You probably noticed that posting has been sparse this week. That’s because I’ve been focusing my efforts on setting up the new website for CryptoPartyMN. For those of you who haven’t heard of CryptoPartyMN, it’s a group of us in the Twin Cities region that organizes periodic meetups with the intention of teaching people how to utilize strong crypto to protect online anonymity and secure communications. We hosted a CryptoParty at The Hack Factory on May 9th and another at B-Sides MSP, and we’re planning more in the future.

Admittedly the website is pretty bland right now. Unfortunately the theme we were using was on the old server, which I don’t have access to. It’ll be improved in time. Likewise, now that the site is up and will stay up, we’ll regularly post meetup notifications on it (we usually meet every other Tuesday). Add it to your RSS feed if you want to know when the next CryptoParty event is.

David Cameron Is On A Holy Crusade To End Encryption

When Edward Snowden showed the world that the United States and British governments were spying on the entire world, including their own citizens, a lot of people were pissed. Citizens of those countries were pissed because their governments had promised them for decades that they weren’t going to spy on them. Other countries, especially those who were allied with the United States and Britain, were pissed for the same reason. Both the United States and British governments were pissed because lots of people suddenly started encrypting the lines of communication that were being spied upon.

In addition to becoming pissed off, the people being spied on decided to start making more thorough use of encryption. Seeing this, and noting how it could hurt their spying efforts, the two governments responsible for this entire mess have been working diligently to make criminals out of those who have begun using strong encryption. David Cameron, a British politician, has been beating the criminalize-encryption drum especially hard:

David Cameron has signalled that he intends to ban strong encryption — putting the British government on a collision course with some of the biggest tech companies in the world.

As reported by Politics.co.uk, the British Prime Minister reaffirmed his commitment to tackling strong encryption products in Parliament on Monday in response to a question.

Crypto Wars II is moving into full swing. What I really enjoy about Mr. Cameron’s crusade is how blatantly it demonstrates the true goals of the British state. Like all states, the British state claims to protect the person, property, and rights of the people within its borders. However, banning strong encryption would violate every British citizen’s person, property, and rights.

Without access to strong encryption, Internet users are directly exposed to many threats. The first is that their personal information is up for grabs by anybody with the knowledge to bypass weak crypto systems. That means, for example, abused spouses could have their efforts to contact help discovered and thwarted.

Property is also at great risk if strong crypto isn’t available. If you think the leaking of credit card data is bad now, just imagine what it would be like if anybody snooping communications between a client and a server could break the crypto and nab the card data. Business deals would also be at risk because anybody snooping communications between two businesses could see what deals were being worked on and maneuver to hamper them.

Weak crypto systems also put people’s rights at risk. Due process could go entirely out the window if law enforcement officers are able to extend their “anything you say can and will be used against you” to snooping on every citizen at all hours of the day. On a personal level, the right to privacy is also at risk. Embarrassing communications, such as those between a doctor and their patient, could suddenly find themselves posted on public forums.

There is an upside to all of this. What Mr. Cameron proposes is a pipe dream. Prohibiting strong crypto is impossible because it is nothing more than math and math, being in the realm of ideas, cannot be stopped from spreading. With the widespread use of the Internet we’ve seen how impossible censorship has become and that isn’t going to change.

VPN Isn’t A Magic Bullet

I really like virtual private networks (VPN) and a lot of people utilize them for various reasons, including protecting anonymity, thwarting region locks on services, and bypassing filters put in place by Internet service providers (ISP). However it’s important to note that there are no magic bullets and VPNs are no exception.

We’re in the midst of a transition from IPv4 to IPv6. A lot of software still either doesn’t support IPv6 or isn’t properly configured to handle it yet. In fact my ISP, Comcast, still doesn’t give business customers IPv6 addresses, so I can’t set up my services to properly work with the newfangled Internet addressing scheme (and Comcast happens to be the only option in my area; good thing for Comcast the government exists to protect monopolies). That means my VPN server, like many others, may very well leak personal information through IPv6:

The study of fourteen popular VPN providers found that eleven of them leaked information about the user because of a vulnerability known as ‘IPv6 leakage’. The leaked information ranged from the websites a user is accessing to the actual content of user communications, for example comments being posted on forums. Interactions with websites running HTTPS encryption, which includes financial transactions, were not leaked.

The leakage occurs because network operators are increasingly deploying a new version of the protocol used to run the Internet called IPv6. IPv6 replaces the previous IPv4, but many VPNs only protect user’s IPv4 traffic. The researchers tested their ideas by choosing fourteen of the most famous VPN providers and connecting various devices to a WiFi access point which was designed to mimic the attacks hackers might use.
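If you want to test your own setup, compare the public addresses the outside world sees over IPv4 and IPv6 while the tunnel is up. Here is a minimal sketch in Python; the ipify lookup endpoints are an assumption, and any “what is my IP” service you trust will work just as well:

```python
# Quick IPv6-leak check: run this with the VPN connected. If the IPv4 address
# belongs to your VPN server but the IPv6 address belongs to your ISP, your
# IPv6 traffic is bypassing the tunnel. The lookup endpoints are assumptions;
# substitute whatever address-reporting service you prefer.
import urllib.request

def public_address(url, timeout=5):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.read().decode().strip()
    except OSError:
        return None  # no connectivity over that protocol version

ipv4 = public_address("https://api.ipify.org")    # reachable over IPv4 only
ipv6 = public_address("https://api6.ipify.org")   # reachable over IPv6 only

print("Public IPv4 address:", ipv4)
print("Public IPv6 address:", ipv6 or "none (no leak possible over IPv6)")
```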

This is why I recommend doing things that absolutely need to remain private through a dedicated anonymity tool such as the Tor Browser. VPNs aren’t great for preserving anonymity anyway, since the server administrator knows the IP addresses of connected clients, whereas a Tor exit node only knows the IP address of the relay directly connected to it. The Tor developers also focus on anonymity first, which means they’re far more likely to find and fix leaks that could reveal personally identifiable information. However, VPNs still work well for establishing secure connections to remote networks and still do a good job of bypassing filters and region locks.

It’s also worth noting that as we continue to transition to IPv6 we’re going to keep running into issues like this. Change is never completely smooth, especially when some ISPs, such as Comcast, still don’t provide customers the tools needed to utilize IPv6.

NSA Officially Allowed to Continue Spying Operation

Many people were too euphoric about the expiration of Section 215 of the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (the whole name of the act doesn’t get printed out enough, which is a shame because somebody spent a tremendous amount of time thinking up a backronym for USA PATRIOT) Act to take a moment to consider what the expiration really meant. I noted that it didn’t actually change anything, but governments love their redundancy, so the Foreign Intelligence Surveillance Court ruled that the National Security Agency (NSA) could resume (implying it didn’t simply continue its surveillance program after the expiration) wholesale spying on American citizens:

WASHINGTON — The Foreign Intelligence Surveillance Court ruled late Monday that the National Security Agency may temporarily resume its once-secret program that systematically collects records of Americans’ domestic phone calls in bulk.

[…]

In a 26-page opinion made public on Tuesday, Judge Michael W. Mosman of the surveillance court rejected the challenge by FreedomWorks, which was represented by a former Virginia attorney general, Ken Cuccinelli, a Republican. And Judge Mosman said the Second Circuit was wrong, too.

“Second Circuit rulings are not binding” on the surveillance court, he wrote, “and this court respectfully disagrees with that court’s analysis, especially in view of the intervening enactment of the USA Freedom Act.”

When the Second Circuit issued its ruling that the program was illegal, it did not issue any injunction ordering the program halted, saying it would be prudent to see what Congress did as Section 215 neared its June 1 expiration. Jameel Jaffer, an A.C.L.U. lawyer, said on Tuesday that the group would now ask for one.

Once again I find it necessary to reiterate that politics isn’t going to solve this problem. The government enjoys the ability to spy on the populace too much to give it up. No amount of begging, voting, or completely pointless filibustering by presidential hopefuls who don’t have a chance in Hell of winning the nomination is going to make the NSA’s surveillance apparatus go away.

If you actually oppose this kind of spying then it is up to you to do something about it. Standing by and hoping you can vote somebody into office to deal with the problem for you isn’t going to cut it. You need to learn, encrypt, and decentralize.

The NSA’s program relies on the pervasive use of plaintext communications and centralization. Collecting plaintext, which is a term for any unencrypted data including e-mails and phone calls, costs very little outside of the taps on the lines and storage. Encrypted text is an entirely different beast. When the NSA scoops up encrypted communications it doesn’t know what it has obtained unless it is able to break the encryption. The documents leaked by Snowden showed us that the NSA had problems with numerous encryption tools including Pretty Good Privacy (PGP) and Off-the-Record (OTR) messaging. Even when the NSA is able to break the encryption it’s not a costless endeavor when compared to plaintext.
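To make that cost asymmetry concrete, here is a minimal sketch in Python using the third-party cryptography package. It illustrates ordinary symmetric encryption rather than the PGP or OTR protocols mentioned above: without the key, everything a wiretap scoops up is opaque.

```python
# Illustration of why encrypted data is "an entirely different beast" from
# plaintext. Requires the third-party package: pip install cryptography
# (this is a generic example, not the PGP or OTR protocols discussed above).
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # known only to the communicating parties
cipher = Fernet(key)

message = b"Happy birthday, grandma!"
token = cipher.encrypt(message)   # what a wiretap actually sees on the line

print(token)                      # unreadable gibberish without the key
print(cipher.decrypt(token))      # b'Happy birthday, grandma!'
```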

Another key thing the NSA relies on is centralization. It’s much easier to surveil people when they’re all using a handful of services. Between the popularity of Gmail, the fact that there are only four major cell phone carriers in the country, and how many people use Facebook, a lot of data is stored in a handful of locations, which means the NSA only needs to focus its efforts on a few key spots to spy on a vast majority of Americans. If more people ran their own e-mail, XMPP, and other servers it would increase the NSA’s costs, as it would have to spread out its efforts. Utilizing decentralized networks, such as Wi-Fi mesh networks, instead of centralized Internet Service Providers (ISP) would complicate the NSA’s efforts even further.

Fighting the NSA’s surveillance apparatus requires increasing the agency’s costs. That can only be done through the ubiquitous use of encryption and decentralized infrastructure. Don’t be a lazy libertarian; start learning how to utilize cryptographic tools today. As always, I’m here to help.

OpenVPN

After getting my business Internet account the first thing I did was set up a virtual private network (VPN) server. VPN servers have a million and one uses, but the most important feature they offer me is a secure tunnel when I’m connected to networks that aren’t mine. I settled on L2TP/IPSec since that was the more secure of the two options offered by OS X Server (as you can tell, the running theme with my network has been migrating away from OS X Server).

L2TP/IPSec served its purpose of giving me a secure tunnel to my home network, but there were several notable downsides. The biggest was the way iOS handled it. iOS disconnects from an L2TP/IPSec VPN server when the device is turned off and doesn’t automatically reconnect when it is turned on again. That meant I had to go into the settings and manually turn it on whenever I wanted to use it (which is often). I know, first world problems.

Last week I began setting up a replacement VPN server, this one using OpenVPN. This ended up being a phenomenal leap forward. OpenVPN uses OpenSSL for encryption and authentication. That gives you a lot of options. For my purposes I restricted my OpenVPN server to only use TLSv1.2 (the latest), forward secrecy, and known strong encryption and authentication algorithms. Instead of using a pre-shared key, which is an option, I’m using certificates. Using certificates offers several advantages but the most important one to me is that iOS will automatically reconnect to a VPN server if authentication is performed with certificates. OpenVPN has a great, albeit ugly as sin, client for iOS that can import OpenVPN profiles. Best of all the app doesn’t need to be running for the VPN connection to remain connected (so you don’t have to worry about the tunnel closing after 10 minutes since that’s the longest amount of time an app can run in the background on iOS). Now when I turn my phone on it automatically connects to my VPN server.
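For reference, the relevant server directives look something like the following. This is a sketch rather than my exact configuration: the file paths are placeholders and the cipher selections are just examples of known strong choices.

```
# Sketch of an OpenVPN server configuration using certificates, TLSv1.2, and
# forward secrecy. Paths are placeholders; adjust for your own setup.
port 1194
proto udp
dev tun

# Certificate-based authentication instead of a pre-shared key.
ca   /etc/openvpn/ca.crt
cert /etc/openvpn/server.crt
key  /etc/openvpn/server.key
dh   /etc/openvpn/dh.pem

# Require TLSv1.2 and an ECDHE suite for forward secrecy, plus strong
# data-channel encryption and authentication algorithms.
tls-version-min 1.2
tls-cipher TLS-ECDHE-RSA-WITH-AES-256-GCM-SHA384
cipher AES-256-CBC
auth SHA256
```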

Since OpenVPN utilizes TLS it’s supposedly difficult to distinguish from HTTPS traffic, which means it’s less likely a network filter will block you from connecting to your VPN server. I don’t have access to a network that hostile so I can’t speak to the effectiveness of this, but it’s something to keep in mind if you regularly find yourself connecting devices to a heavily filtered network.

If you’re interested in setting up a VPN server I highly recommend OpenVPN. It’s fairly simple to set up and clients are available for most operating systems.

Why Everybody Should Use Encryption

Using encryption requires individuals to put forth the effort to learn. Because people tend to be lazy, they usually spend more time coming up with excuses for not learning encryption than it would take to learn how to use it. Ultimately the excuse they end up settling on is that they have nothing to hide. This is bullshit, of course. If they truly didn’t have anything to hide they would put Internet-accessible cameras and microphones in every room of their house and allow anybody to check in on what they’re doing at any time. But they don’t.

Besides the fact that we all have something to hide there is another reason why the “nothing to hide” excuse doesn’t work. To quote Bruce Schneier:

Encryption should be enabled for everything by default, not a feature you turn on only if you’re doing something you consider worth protecting.

This is important. If we only use encryption when we’re working with important data, then encryption signals that data’s importance. If only dissidents use encryption in a country, that country’s authorities have an easy way of identifying them. But if everyone uses it all of the time, encryption ceases to be a signal. No one can distinguish simple chatting from deeply private conversation. The government can’t tell the dissidents from the rest of the population. Every time you use encryption, you’re protecting someone who needs to use it to stay alive.

By not using encryption you are putting lives in danger; specifically, the lives of people who need encryption to stay alive. So long as a majority of people utilize unencrypted forms of communication, the presence of encryption becomes a signal indicating to a snoop that the captured data is important. If all data, from e-mails wishing grandma a happy birthday to plans for protesting the latest act of police brutality, is encrypted then spies can’t use the presence of encryption to tell what is and isn’t important. At that point their costs skyrocket because the only way for them to learn what matters is to decrypt everything, which isn’t feasible for any organization.

So stop making excuses and learn how to encrypt your data. There are plenty of people out there, including myself, willing to help you. If you don’t then you’re contributing to a problem that puts real lives in danger.

The Seedier Side of the Internet isn’t as Seedy as You Think

Due to the popularity of Silk Road, the mainstream media has been busily reporting on the “dark” web. If you take the news stories about the “dark” web literally, it is a place where child pornography is readily available, hitmen can be hired for a handful of Bitcoin, and terrorists commonly hold secret meetings to discuss their plans to blow up the next elementary school. Reality, as is often the case with mainstream media portrayals, is quite different:

Read nearly any article about the dark web, and you’ll get the sense that its name connotes not just its secrecy but also the low-down dirty content of its shadowy realms. You’ll be told that it is home to several nefarious things: stolen data, terrorist sites, and child porn. Now while those things may be among what’s available on the dark web, all also are available on the normal web, and are easily accessible to anyone, right now, without the need for any fancy encryption software.

[…]

Despite reports, there are only shreds of evidence that the Islamic State is using the dark web. One apparent fund-raising site highlighted by the Washington Post had managed to garner exactly 0 bitcoins at the time of writing, and this was also the case with another I discovered recently. It’s worth pointing out that both of those sites simply claimed to be funneling the cash to the terrorist group, and could easily have been fakes. The one Islamic extremist dark web site to actually generate any revenue mustered only $1,200 earlier this year. Even it doesn’t explicitly mention the Islamic State.

And yes, child porn is accessible on the normal web. In fact, it is rampant when compared with what’s available from hidden sites. Last year, the Internet Watch Foundation, a charity that collates child sexual abuse websites and works with law enforcement and hosting providers to have the content removed, found 31,266 URLs that contained child porn images. Of those URLs, only 51 of them, or 0.2 percent, were hosted on the dark web.

In other words, the big scary “dark” web is basically a smaller version of the regular Internet. What you find on hidden services, which is the correct term for the “dark” web, is also far more widely available on the regular Internet. Why do sites go through the hassle of requiring visitors to utilize something like the Tor Browser then? Because maintaining anonymity, both for themselves and for their visitors, is valuable.

In the case of Silk Road, for example, it was much easier to build user trust by using a hidden service since there was a barrier between the service and the identity of its users. Not only did that barrier protect users from potentially being revealed to law enforcement agents by the site’s administrators, it also prevented buyers and sellers from being able to identify each other. Silk Road was an example of anonymity making things safer for everybody involved.

If you’re of the opinion that buying and selling drugs should result in men with guns kicking down doors at oh dark thirty, and that therefore what I said above is not a valid justification for hidden services, don’t worry, I have another. Journalists often find themselves in positions where sources demand anonymity before revealing important information. That is why services such as OnionShare were created:

That’s exactly the sort of ordeal Micah Lee, the staff technologist and resident crypto expert at Greenwald’s investigative news site The Intercept, hopes to render obsolete. On Tuesday he released Onionshare—simple, free software designed to let anyone send files securely and anonymously. After reading about Greenwald’s file transfer problem in Greenwald’s new book, Lee created the program as a way of sharing big data dumps via a direct channel encrypted and protected by the anonymity software Tor, making it far more difficult for eavesdroppers to determine who is sending what to whom.

Whistleblowers are an example of individuals who are less likely to talk to journalists, and therefore blow the whistle, unless their identity can be protected. This is especially true when the whistleblower is revealing unlawful government activities. With access to legal coercive powers it is possible for the state to compel a journalist to reveal a source of information damning to it. If the journalist doesn’t know the identity of the whistleblower, as would be the case if the data was sent via a hidden service, they cannot reveal it to the state no matter what court orders it issues or torture it performs. That protection makes a whistleblower far more likely to come forward.

The “dark” web is little more than a layer of anonymity bolted onto the existing Internet. Anything available on the former is available in far larger quantities on the latter. What the “dark” web offers is protection for the people who need it. Like any tool it can be used for both good and bad, but that doesn’t justify attempting to wipe it out. And because much of the world is ruled by even more insane states than the ones that dominate the so-called first world, I would argue the good of protecting people far outweighs the bad that was happening, and still is happening, on the regular Internet.

History of Crypto War I

In its zeal to preserve the power to spy on its citizens, members of the United States government have begun pushing to prohibit civilians from using strong cryptography. While proponents of this prohibition try to scare you with words such as terrorists, drug cartels, and pedophiles, let’s take a moment to remember the last time this war was waged:

Encryption is a method by which two parties can communicate securely. Although it has been used for centuries by the military and intelligence communities to send sensitive messages, the debate over the public’s right to use encryption began after the discovery of “public key cryptography” in 1976. In a seminal paper on the subject, two researchers named Whitfield Diffie and Martin Hellman demonstrated how ordinary individuals and businesses could securely communicate data over modern communications networks, challenging the government’s longstanding domestic monopoly on the use of electronic ciphers and its ability to prevent encryption from spreading around the world. By the late 1970s, individuals within the U.S. government were already discussing how to solve the “problem” of the growing individual and commercial use of strong encryption. War was coming.

The act that truly launched the Crypto Wars was the White House’s introduction of the “Clipper Chip” in 1993. The Clipper Chip was a state-of-the-art microchip developed by government engineers which could be inserted into consumer hardware telephones, providing the public with strong cryptographic tools without sacrificing the ability of law enforcement and intelligence agencies to access unencrypted versions of those communications. The technology relied on a system of “key escrow,” in which a copy of each chip’s unique encryption key would be stored by the government. Although White House officials mobilized both political and technical allies in support of the proposal, it faced immediate backlash from technical experts, privacy advocates, and industry leaders, who were concerned about the security and economic impact of the technology in addition to obvious civil liberties concerns. As the battle wore on throughout 1993 and into 1994, leaders from across the political spectrum joined the fray, supported by a broad coalition that opposed the Clipper Chip. When computer scientist Matt Blaze discovered a flaw in the system in May 1994, it proved to be the final death blow: the Clipper Chip was dead.

The battlefield today reflects the battlefield of Crypto War I. Members of the government are again arguing that all civilian cryptography should be weakened by mandating a key escrow system that allows the government to gain access to any device at any time. As with the last war, where the government’s proposed Clipper Chip was proven to be completely insecure, this war must be looked at through the lens of government security practices or, more specifically, the lack thereof. It was only last week that we learned some of the government’s networks are not secure, which led to the leaking of every federal employee’s personal information. How long do you think it would take before a hack of a government network led to the leaking of every escrow key? I’d imagine less than a week. After that happened, every device would be rendered entirely insecure to anybody who downloaded the leaked escrow keys.

What everybody should take away from this is that the government is willing to put each and every one of us at risk just so it can maintain the power to spy on us with impunity. But its failure to win Crypto War I proved that the world wouldn’t come to an end if the government couldn’t spy on us with impunity. After Crypto War I, the ability of law enforcement agents to acquire evidence of wrongdoing (as defined by the state) didn’t suddenly vanish, terrorist attacks didn’t suddenly become a nightly occurrence, and children being abducted by pedophiles didn’t suddenly become a fact of everyday life.

Crypto War II is likely inevitable but it can be won just as the last one was. The first step to victory is not allowing yourself to be suckered by government lies.