Political Solutions Versus Technical Solutions

When discussing pervasive surveillance I focus exclusively on technical solutions. People involved in political activism often ask me why I don’t also involve myself in political solutions. My reason is that I don’t like investing effort into work that is unlikely to pay off when I can invest it in work that will.

Consider the political solution. Say, in spite of everything we know about the state, Congress decides to ban the National Security Agency (NSA) from spying on American citizens and actually enforces that ban. What then? You’re still vulnerable to spying from the Government Communications Headquarters (GCHQ) as well as the intelligence agency of every other major world government. In addition to that, your Internet service provider (ISP) can still spy on you and inject malicious code into websites you visit. Political solutions are also temporary. Once the Congress that voted to prohibit the NSA from spying is replaced with a new Congress, that ban could be reversed.

Technical solutions avoid those limitations. When you use secure forms of communication that the NSA, GCHQ, and other intelligence agencies can’t crack, they are unable to spy on you regardless of where the political winds blow. Furthermore, ISPs are unable to surveil your traffic or inject malicious code into websites you visit. Technical solutions close the holes needed to spy on you and therefore defend you against all surveillance, not only for temporary stretches of time (assuming the secure communication tools continue to be maintained so any discovered vulnerabilities are fixed).

I, like everybody else, only have a limited amount of time. Why would I invest some of that precious time into something that is, at best, temporary and only guards against a select few bad actors when I can focus on something that is more permanent and works against all bad actors? It just doesn’t make sense.

Another Vulnerability Caused by State Meddling

In March a security vulnerability, given the fancy marketing name FREAK, was discovered. FREAK was notable because it was caused by government meddling in computer security. Due to cryptography export restrictions, quality cryptographic algorithms were not allowed to be put into widespread use, at least legally, and many legacy systems were built around weak algorithms. FREAK may be behind us, but a new vulnerability was just discovered:

Tens of thousands of HTTPS-protected websites, mail servers, and other widely used Internet services are vulnerable to a new attack that lets eavesdroppers read and modify data passing through encrypted connections, a team of computer scientists has found.

The vulnerability affects an estimated 8.4 percent of the top one million websites and a slightly bigger percentage of mail servers populating the IPv4 address space, the researchers said. The threat stems from a flaw in the transport layer security protocol that websites and mail servers use to establish encrypted connections with end users. The new attack, which its creators have dubbed Logjam, can be exploited against a subset of servers that support the widely used Diffie-Hellman key exchange, which allows two parties that have never met before to negotiate a secret key even though they’re communicating over an unsecured, public channel.

The weakness is the result of export restrictions the US government mandated in the 1990s on US developers who wanted their software to be used abroad. The regime was established by the Clinton administration so the FBI and other agencies could break the encryption used by foreign entities. Attackers with the ability to monitor the connection between an end user and a Diffie-Hellman-enabled server that supports the export cipher can inject a special payload into the traffic that downgrades encrypted connections to use extremely weak 512-bit key material. Using precomputed data prepared ahead of time, the attackers can then deduce the encryption key negotiated between the two parties.

We’ll likely be dealing with the consequences of those export restrictions for some time to come. The only upside to this is that it is a reminder of what happens when the government meddles in security for its own purposes. Cryptography export restrictions were put in place because the United States government feared it would be unable to spy on foreign entities (and, as it turns out, domestic entities). Now the government, operating under similar concerns for its ability to spy, is discussing mandating the inclusion of back doors in systems that use strong cryptography. If this happens and developers actually comply, we’ll have a repeat of what we’re dealing with today. Security vulnerabilities will arise from government-mandated cryptography weaknesses that will put the masses at risk.
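As a small practical aside, server operators can at least verify what their own machines negotiate. Here is a minimal sketch, assuming Python and an example host name, that connects to a server and reports the cipher suite a modern client ends up with; it won’t prove the server has dropped export-grade Diffie-Hellman for older clients (that requires a dedicated scanner), but it’s a quick first check.

```python
# Minimal sketch (standard library only, example host name) that reports which
# cipher suite a modern TLS client negotiates with a server. If the suite
# mentions "EXPORT" or 512-bit key material something is badly wrong; note this
# does not reveal what the server would offer an older, weaker client.
import socket
import ssl

host = "example.com"  # hypothetical host; substitute a server you operate

context = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        name, version, bits = tls.cipher()
        print(f"{host}: negotiated {name} ({version}, {bits}-bit)")
```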

Whenever the government wishes to involve itself in something, the only appropriate answer for the people to give is a loud “No!” This is especially true when it comes to security because the government has a direct interest in ensuring that each and every one of us is vulnerable to its surveillance apparatus.

Why Political Activism Won’t Stop Mass Surveillance

Time and again people ask me why I don’t involve myself in political activism to stop mass surveillance. My answer is that doing so is pointless because, no matter how hard you beg, the state will never handicap itself. Case in point: the Uniting and Strengthening America by Fulfilling Rights and Ending Eavesdropping, Dragnet-collection and Online Monitoring (USA FREEDOM) Act (I hope a staffer was paid a nice bonus for coming up with that acronym). It has been hailed as a solution to the National Security Agency’s (NSA) mass surveillance practices. However, the bill, as is so often the case, does the opposite of what its name implies and its advocates claim. Instead of curtailing NSA surveillance, the bill codifies it:

After only one hour of floor debate, and no allowed amendments, the House of Representatives today passed legislation that seeks to address the NSA’s controversial surveillance of American communications. However, opponents believe it may give brand new authorization to the U.S. government to conduct domestic dragnets.

[…]

However, the legislation may not end bulk surveillance and in fact could codify the ability of the government to conduct dragnet data collection.

“We’re taking something that was not permitted under regular section 215 … and now we’re creating a whole apparatus to provide for it,” Rep. Justin Amash, R-Mich., said on Tuesday night during a House Rules Committee proceeding.

“The language does limit the amount of bulk collection, it doesn’t end bulk collection,” Rep. Amash said, arguing that the problematic “specific selection term” allows for “very large data collection, potentially in the hundreds of thousands of people, maybe even millions.”

In a statement posted to Facebook ahead of the vote, Rep. Amash said the legislation “falls woefully short of reining in the mass collection of Americans’ data, and it takes us a step in the wrong direction by specifically authorizing such collection in violation of the Fourth Amendment to the Constitution.”

Political activism can’t solve problems. At most it can be used to convince the state to rewrite its rules, and then only temporarily, so that it can continue doing the same thing while claiming it isn’t. The only way widespread surveillance can be curtailed is if every one of us begins encrypting all of our communications. Even if some of us utilize weak cryptography, it will still increase the overall cost of operating the system. Clear text requires no resources to read. Weak cryptography still requires some resources to identify the algorithm(s) used and to reverse them. Furthermore, the text of any encrypted communication is unknown to the eavesdropper until it’s decrypted. Strong cryptographic tools, on the other hand, are practically (as in the time required is longer than the information’s usefulness) impossible for spies to crack.
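To make the cost argument concrete, here is a minimal sketch of strong, authenticated symmetric encryption using the third-party cryptography package; the message text and key handling are illustrative assumptions, not a complete messaging setup.

```python
# Minimal sketch of strong symmetric encryption with the third-party
# "cryptography" package (pip install cryptography). Encrypting and decrypting
# are cheap for the key holders; recovering the plaintext without the key is
# impractical for an eavesdropper.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # random key; in practice you'd exchange and store this securely
box = Fernet(key)

token = box.encrypt(b"meet at the usual place")  # ciphertext plus authentication data
print(token)                  # unreadable without the key
print(box.decrypt(token))     # b'meet at the usual place'
```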

Stop begging the state to neuter its spying capabilities and take back your privacy. A good place to start is to begin utilizing tools that allow secure communications.

Deprecating Non-Secure HTTP

One of the biggest weaknesses of the Internet, in my opinion, is the fact that secure connections aren’t the default. E-mail servers often don’t transmit messages to other e-mail servers over secure connections. Many Jabber servers don’t utilize secure connections to other servers they’re federated with. Even the protocol most of us deal with multiple times on a daily basis to interact with web servers, the Hypertext Transfer Protocol (HTTP), isn’t secure by default. This lack of security has been a boon for national spy agencies such as the National Security Agency (NSA) and the Government Communications Headquarters (GCHQ). Even private businesses have been exploiting the lack of secure HTTP connections so they can better spy on their customers for advertising purposes. At this point it’s clear that non-secure Internet connections need to die.

To this end Mozilla, the developer of Firefox, has announced its plan to deprecate non-secure HTTP:

Today we are announcing our intent to phase out non-secure HTTP.

There’s pretty broad agreement that HTTPS is the way forward for the web.  In recent months, there have been statements from IETF, IAB (even the other IAB), W3C, and the US Government calling for universal use of encryption by Internet applications, which in the case of the web means HTTPS.

After a robust discussion on our community mailing list, Mozilla is committing to focus new development efforts on the secure web, and start removing capabilities from the non-secure web.

This could be a huge move in the right direction. If every major browser deprecated non-secure HTTP it would force web servers to make secure connections available by default or lose users. More importantly, in my opinion, getting rid of non-secure HTTP would also eliminate the “what’s encrypted?” guessing game. Many websites only utilize a secure connection for specific actions such as logging into an account or sending credit card data. Other interactions with the web server are done over a non-secure connection. That guessing game can make users believe that their connection is secure even though it isn’t.

Deprecating non-secure HTTP isn’t a straightforward move. Enabling transport layer security (TLS) isn’t as simple as flipping a switch. You need to obtain a key pair and a certificate signed by an authority that major browsers trust, load them on the web server, and ensure the private key isn’t compromised. Administrators also have to keep up on recent security news so they can reconfigure their servers when new exploits are discovered. Managing certificates could become much easier if Let’s Encrypt gains traction. Ensuring broken TLS protocols and features aren’t being used is a more difficult task but one that will likely be made easier as more sites move towards TLS. With that said, deprecating non-secure HTTP must be done regardless of the challenges involved.
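To make the “load them on the web server” step concrete, here is a minimal sketch of serving HTTPS with nothing but Python’s standard library; the certificate and key file names are assumptions standing in for whatever your certificate authority (or Let’s Encrypt) issued.

```python
# Minimal HTTPS server sketch using only the standard library. The paths
# fullchain.pem and privkey.pem are assumptions; substitute the certificate
# chain and private key issued for your domain. Binding port 443 typically
# requires elevated privileges.
import ssl
from http.server import HTTPServer, SimpleHTTPRequestHandler

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="fullchain.pem", keyfile="privkey.pem")

httpd = HTTPServer(("0.0.0.0", 443), SimpleHTTPRequestHandler)
httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
print("Serving HTTPS on port 443")
httpd.serve_forever()
```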

CryptoParty in Minneapolis on May 9th

Do you want to learn how to communicate securely but don’t want to spend any money? Join CryptoPartyMN at The Hack Factory this Saturday between 13:00 and 17:00. We’ll teach you how to secure your stuff and won’t even hit you up for loose change!

This event will serve as a dry run before our main CryptoParty at Security B-Sides MSP on June 13th and 14th. Some mistakes will likely be made but I think we’ll be able to help you secure your life with a decent amount of competency.

If you’re interested in attending please RSVP here.

You Can’t Stop the Signal

Two days ago a giant bust of Edward Snowden was found perched atop the Prison Ship Martyrs’ Monument in Fort Greene Park. The sculpture was of the best sort: illicit. It didn’t take long for the authorities to cover up and then remove Snowden from the park, which sent a more prominent message than anything else they could have done. But the signal can’t be stopped. Yesterday a different group of artists created a hologram of Snowden at the site of the previous sculpture:

NEW YORK — Hours after police removed an illicit bust of Edward Snowden from its perch in a Brooklyn park on Monday, artists replaced it with a hologram.

The group of artists — who collectively call themselves “The Illuminator” and are not related to the trio behind the original sculpture — used laptops and projection equipment to cast an image of Snowden in a haze of smoke at the spot where the sculpture once stood.

They say the action was a message of defiance aimed at the authorities who “censored” the piece, according to a tumblr post.

I believe if anybody is deserving of a monument it’s Snowden. He belongs to that rare breed of people willing to risk it all to bring our rulers’ dirty laundry to light. Someday I hope a monument to him and Chelsea Manning is erected in dedication to the idea that breaking the law is sometimes the most heroic thing one can do.

Yet Another Reason to Use HTTPS On Your Site

Transport Layer Security (TLS), often referred to by the name of its predecessor, SSL, helps protect the privacy of your users and prevents malicious actors from altering the content being sent between them and your servers. Since it’s such a powerful tool you would think every site would enable it by default, but they don’t. If the privacy of your users and the integrity of your data isn’t enough to convince you to enable TLS, maybe this will:

With CloudFlare, websites can afford extra security to users with Full SSL (Strict) encryption. Long story short, this strips certain identifiers from the traffic data ISPs use to block websites like TPB; since the information is routed through CloudFlare, website IP addresses are also hidden behind the delivery network. In the UK, where all major ISPs were strong-armed into blocking TPB in 2012, this has all but turned back time, with thepiratebay.se now accessible for Virgin, EE, BT and TalkTalk customers. Sky is the only popular provider still managing to block the site; you aren’t notified, as such, but the page won’t load anyhow.

TLS makes blocking access to websites more difficult (although not entirely impossible). Many web filters rely on identifiable information viewable in plaintext streams. When you encrypt those streams with TLS those filters are no longer able to see the identifiable information and therefore can’t block access.

Avoiding censorship is just another reason why you should not only enable TLS on your site but make its use mandatory by disabling unsecured connections (or redirecting them to secured connections as I do with this blog).
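For anyone curious what that redirect looks like in practice, here is a minimal sketch of an HTTP listener whose only job is to bounce visitors to the HTTPS version of the same URL; running it on port 80 alongside a real HTTPS server on port 443, and the fallback host name, are assumptions about your setup.

```python
# Minimal HTTP-to-HTTPS redirector sketch using only the standard library.
# Every plain HTTP request receives a permanent redirect to the same path over
# HTTPS. The fallback host name is a placeholder; binding port 80 typically
# requires elevated privileges.
from http.server import HTTPServer, BaseHTTPRequestHandler

class RedirectToHTTPS(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "example.com").split(":")[0]
        self.send_response(301)
        self.send_header("Location", f"https://{host}{self.path}")
        self.end_headers()

    do_HEAD = do_GET  # answer HEAD requests the same way

HTTPServer(("0.0.0.0", 80), RedirectToHTTPS).serve_forever()
```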

Today’s Browser Vulnerability is Brought to You By the State and the Letters F, R, E, A, and K

People often mock libertarians by claiming they blame everything on the state. But the recently revealed Factoring Attack on RSA-EXPORT Keys (FREAK) that leaves Android and Apple users vulnerable was actually the fault of the state. How so? Because of its futile attempts in the 1990s to control the export of strong encryption technology:

The weak 512-bit keys are a vestige of the 1990s, when the Clinton administration required weak keys to be used in any software or hardware that was exported out of the US. To satisfy the requirement, many manufacturers designed products that offered commercial-grade keys when used in the US and export-grade keys when used elsewhere. Many engineers abandoned the regimen once the export restrictions were dropped, but somehow the ciphers have managed to live on a select but significant number of end-user devices and servers. A list of vulnerable websites is here. Matthew Green, an encryption expert at Johns Hopkins University, told Ars the vulnerable devices included virtually all Android devices, as well as iPhones and Macs.

This is yet another example of how state regulations make us all vulnerable. In the state’s lust to control everything it often puts regulations in place that prevent its subjects from utilizing the best available defensive technologies. From restrictions on encryption technology to body armor, the state’s vested interest in spying on you and killing you far outweighs whatever concerns it may have about your safety.

We’re in the midst of a second crypto war, but the state isn’t using its failed regulatory red tape this time. Instead it is trying to convince companies to implement back doors, actively exploiting encryption technology without disclosing the vulnerabilities to developers, and surveilling whatever data connections it can get its taps into. Even though the strategy has changed, the end goal remains the same: leave the people vulnerable to malicious actors so the state can ensure its capability to spy on us and kill us remains intact.
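As an aside, one quick way to sanity-check a piece of a server’s key material is to pull its certificate and look at the public key size. This doesn’t exercise the ephemeral export-grade keys FREAK actually abuses, but a 512-bit certificate key would be an immediate red flag. The sketch below assumes the third-party cryptography package and an example host name.

```python
# Sketch: fetch a server's certificate and report its public key size.
# Export-grade RSA was 512 bits; modern sites should present 2048-bit RSA or
# stronger (or an elliptic-curve key). Requires the third-party "cryptography"
# package; the host below is just an example.
import ssl
from cryptography import x509

host = "example.com"  # hypothetical host

pem = ssl.get_server_certificate((host, 443))
cert = x509.load_pem_x509_certificate(pem.encode())
print(f"{host} presents a {cert.public_key().key_size}-bit public key")
```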

Google Backs Away from Encrypting Android 5.0 Devices By Default

When Snowden leaked the National Security Agency’s (NSA) dirty laundry a lot of companies’ faces were red. The leaks showed that they were either complicit in the NSA’s surveillance apparatus or helpless to stop the agency from exploiting their systems. In an attempt to rebuild customer confidence many technology companies scrambled to improve the security on their devices. Apple, being the manufacturer of very popular handsets, announced several major security improvements in iOS 8, including disabling its ability to bypass a user’s set passcode. Much to the approval of Android users, Google announced that Android 5.0, also known as Lollipop, would ship with device encryption enabled by default.

But some bad news appeared yesterday. Google has backed down from enabling encryption by default in Lollipop:

Last year, Google made headlines when it revealed that its next version of Android would require full-disk encryption on all new phones. Older versions of Android had supported optional disk encryption, but Android 5.0 Lollipop would make it a standard feature.

But we’re starting to see new Lollipop phones from Google’s partners, and they aren’t encrypted by default, contradicting Google’s previous statements. At some point between the original announcement in September of 2014 and the publication of the Android 5.0 hardware requirements in January of 2015, Google apparently decided to relax the requirement, pushing it off to some future version of Android. Here’s the timeline of events.

This, in my seldom humble opinion, is a very bad idea. The justification appears to be performance related. Namely, many Android devices without hardware cryptographic acceleration take a huge performance dive when device encryption is enabled.

If a user wants to disable device encryption that’s their choice, but I firmly believe that this option should be enabled by default even if performance noticeably suffers on some devices. We’ve seen too many stories where abusive spouses, police officers, and federal agents have retrieved data from unencrypted devices without the consent of the owner or, in the case of law enforcement, a warrant. With the amount of personal data people store on their mobile devices it’s far too risky to leave that data unprotected from prying eyes. Especially when we live in a surveillance state.