Signal for iOS Now Supports Secure Text Messaging

One of the things I try to do is find tools that enable secure communications without requiring a degree in computer science to learn. OK, few of the tools I’ve seen actually require a computer science degree, but most people are notoriously lazy, so any barrier to entry is too much. I’ve been using and recommending Wickr for a few months now because of its relative ease of use. It’s a good tool but it has two major flaws in my opinion. First, it’s not open source. Second, it requires a separate user name and password, which is a surprisingly high barrier to entry for some (I’m talking about people with little security knowledge).

For a while Android users have enjoyed RedPhone for secure phone calls and TextSecure for secure text messages. Some time ago an app called Signal was released that gave iOS users the ability to call RedPhone users, but there was no iOS app compatible with TextSecure. Since some of the people I talk to use Android and others use iOS, I really needed a cross-platform solution. Fortunately the developers of Signal, RedPhone, and TextSecure just released an update to Signal that enables secure text messaging.

It’s a very slick application. First of all, it, along with every other project developed by Open Whisper Systems, is open source. While being open source isn’t a magic bullet it certainly does make verifying the code easier (and by easier I mean possible). The other thing I like is that it uses your phone number to register your app with Open Whisper Systems’ servers. That means people can see whether you have the app installed by looking up your number, which is magically pulled from your contacts list, in the app. If it’s installed on your end the app will let them send you text messages or call you. There are no user names or passwords to fiddle with, so the barrier to entry is about as low as you can go.

Signal isn’t a magic bullet (no secure communication tool is). For example, since it’s tied to your phone number it doesn’t preserve your anonymity. Wickr, by allowing you to use a separate user name, does a better job in that department, although it’s still not as good as it could be since it doesn’t attempt to anonymize traffic through something like Tor. Messages also can’t be set to self-destruct after a period of time the way Wickr’s messages can. But Signal certainly fulfills some of my requirements when talking with people who aren’t technically knowledgeable or are just plain lazy.

David Cameron Joins the Legion of Naive People Who Think They Can Stop the Progress of Technology

David Cameron, the fascist prime minister of the United Kingdom, has decided that we serfs have no need for secure communications. He has expressed a desire to make the use of end-to-end encrypted communications illegal:

The prime minister has pledged anti-terror laws to give the security services the ability to read encrypted communications in extreme circumstances. But experts say such access would mean changing the way internet-based messaging services such as Apple’s iMessage or Facebook’s WhatsApp work.

This is just another battle in the crypto wars that have been waged between the state and the people. Needless to say the state hasn’t been faring so well. Nobody should be surprised by this though. History is littered with examples of power-hungry despots trying to control commonly available technology and failing miserably. For example, the Inquisition was very interested in controlling access to printing presses in order to prevent the spread of anti-Church literature. It didn’t end well for them.

Today states are interested in restricting our access to secure communications. We’re told that these restrictions are necessary for the state to keep us safe but history has shown that such restrictions are put into place to bolster the state’s power. History has also shown us that any restrictions unpopular with the people fail in time.

Secure communication tools are now so pervasive that they cannot help but hold popular support. Nobody wants to transmit their authentication credentials in a way that anybody can intercept them (and if the state can intercept them then anybody can). People suffering from embarrassing medical conditions don’t want the world to know about it when they’re searching for related material online. And few people want others to know what kind of porn they watch.

We have need for secure communications and the tools to enable them are widely available. That means Cameron’s desires cannot be realized. Even if he passes a law making end-to-end encryption illegal, people will use it coupled with anonymity tools to protect themselves from prosecution. You can’t put the djinn back in the bottle once it’s out, no matter how many laws you pass. The fact that Cameron doesn’t realize this shows how deluded about his own power he truly is.

Touch ID

When I was young I was an early adopter. I had to have every new gadget as soon as it was released. Because of that I was also a beta tester. Now that I’m older and don’t have the time to dick around with buggy products I wait until early adopters have played with a device for a while before purchasing it. The beta testers for the iPhone 6 have done a fantastic job as far as I can see so I finally upgraded to one.

I’m not too thrilled about the increased size but it’s not so big as to be difficult to use (unlike the iPhone 6 Plus, which combines all of the worst features of a phone and a tablet into one big mistake). Other than the size it’s basically like previous iPhones but with added processing power and storage. Since I was upgrading from an iPhone 5 I also gained access to Touch ID, Apple’s fingerprint authentication system.

Let me preface what I’m about to say with an acknowledgement of how poor fingerprints are as a security token. When you use your fingerprint for authentication you are literally leaving your authentication token on everything you touch. That means a threat can not only get your authentication token but can do so at their leisure. Once a threat has your fingerprint there’s nothing you can do to change it.

With that disclaimer out of the way I must admit that I really like Touch ID. Fingerprints may not be the best authentication method in existence but all of us make security tradeoffs of some sort every day (since the only truly secure computer is one that cannot be used). Security and convenience are usually at odds, which is probably the biggest reason so many people are apathetic about computer security. But I think Touch ID does a good job of striking a balance between the two.

Until Apple implemented Touch ID the only two options you had for securing your iPhone were a four digit PIN or a more complex password. A phone is a device you pull out and check numerous times throughout the day, and usually those checks are a quick attempt to find some small bit of information. That makes complex passwords, especially on a touchscreen keyboard, a pain in the ass. Most people, if they have any form of security on their phone at all, opt for a four digit PIN. Four digit PINs keep out only the most apathetic attackers. If you want to be secure against a threat that is willing to put some work into cracking your device you need something stronger.
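To put rough numbers on that gap, here’s a quick keyspace comparison between a four digit PIN and a modest eight character password (the 72-character password alphabet is an illustrative assumption):

```python
# A four digit PIN versus an eight character password drawn from
# upper- and lowercase letters, digits, and roughly ten symbols.
pin_space = 10 ** 4
alphabet = 26 + 26 + 10 + 10          # 72 possible characters
password_space = alphabet ** 8

print(pin_space)                      # 10000
print(password_space)                 # 722204136308736
print(password_space // pin_space)    # 72220413630 (~72 billion times larger)
```

Even a short password dwarfs the PIN’s ten thousand possibilities, which is why a PIN only deters attackers unwilling to put in any effort.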

Touch ID works as a secondary method of authentication. You still need to have a four digit PIN or a password on the device. That, in my opinion, is the trick to Touch ID being useful. If you reboot your phone you will need to authenticate with your four digit PIN or password. Until that first authentication after boot up Touch ID is not available. Another way to make Touch ID unavailable is not to log into your phone for 48 hours.

The Fifth Amendment does not protect you from surrendering your fingerprint to the police. That means law enforcers can compel you to give your fingerprint so they can unlock your phone. Whether passwords are protected by the Fifth Amendment is a question still being fought over in the courts. If you’re arrested, a password is going to be a better method of securing your device from the state than your fingerprint. But because of how Touch ID works you can thwart law enforcement’s ability to unlock your phone with your fingerprint by simply powering it off.

Only you can decide if Touch ID is an appropriate security mechanism for you. I’m really enjoying it because now I can have a complex password on my phone without having to type it in every time I pull it out of my pocket. But I also admit that fingerprints are poor authentication mechanisms. Tradeoffs are a pain in the ass but they’re the only things that make our electronic devices usable.

Encryption Works Except When It Doesn’t

People are still debating whether Edward Snowden is a traitor deserving a cage next to Chelsea Manning or a hero deserving praise (hint: unless you believe the latter you’re wrong). But a benefit nobody can deny is the overall improvement in computer security his actions have led to. In addition to more people using cryptographic tools, we are also getting a better idea of which tools work and which don’t:

The NSA also has “major” problems with Truecrypt, a program for encrypting files on computers. Truecrypt’s developers stopped their work on the program last May, prompting speculation about pressures from government agencies. A protocol called Off-the-Record (OTR) for encrypting instant messaging in an end-to-end encryption process also seems to cause the NSA major problems. Both are programs whose source code can be viewed, modified, shared and used by anyone. Experts agree it is far more difficult for intelligence agencies to manipulate open source software programs than many of the closed systems developed by companies like Apple and Microsoft. Since anyone can view free and open source software, it becomes difficult to insert secret back doors without it being noticed. Transcripts of intercepted chats using OTR encryption handed over to the intelligence agency by a partner in Prism — an NSA program that accesses data from at least nine American internet companies such as Google, Facebook and Apple — show that the NSA’s efforts appear to have been thwarted in these cases: “No decrypt available for this OTR message.” This shows that OTR at least sometimes makes communications impossible to read for the NSA.

Things become “catastrophic” for the NSA at level five – when, for example, a subject uses a combination of Tor, another anonymization service, the instant messaging system CSpace and a system for Internet telephony (voice over IP) called ZRTP. This type of combination results in a “near-total loss/lack of insight to target communications, presence,” the NSA document states.

[…]

Also, the “Z” in ZRTP stands for one of its developers, Phil Zimmermann, the same man who created Pretty Good Privacy, which is still the most common encryption program for emails and documents in use today. PGP is more than 20 years old, but apparently it remains too robust for the NSA spies to crack. “No decrypt available for this PGP encrypted message,” a further document viewed by SPIEGEL states of emails the NSA obtained from Yahoo.

So TrueCrypt, OTR, PGP, and ZRTP are all solid tools to utilize if you want to make the National Security Agency’s (NSA) job of spying on you more difficult. It’s actually fascinating to see that PGP has held up for so long. The fact that TrueCrypt is giving the NSA trouble makes the insecurity warning issued by its developers more questionable. And people can finally stop claiming that Tor isn’t secure simply because it started off as a government project. But all is not well in the world of security. There are some things the NSA has little trouble bypassing:

Even more vulnerable than VPN systems are the supposedly secure connections ordinary Internet users must rely on all the time for Web applications like financial services, e-commerce or accessing webmail accounts. A lay user can recognize these allegedly secure connections by looking at the address bar in his or her Web browser: With these connections, the first letters of the address there are not just http — for Hypertext Transfer Protocol — but https. The “s” stands for “secure”. The problem is that there isn’t really anything secure about them.

[…]

One example is virtual private networks (VPN), which are often used by companies and institutions operating from multiple offices and locations. A VPN theoretically creates a secure tunnel between two points on the Internet. All data is channeled through that tunnel, protected by cryptography. When it comes to the level of privacy offered here, virtual is the right word, too. This is because the NSA operates a large-scale VPN exploitation project to crack large numbers of connections, allowing it to intercept the data exchanged inside the VPN — including, for example, the Greek government’s use of VPNs. The team responsible for the exploitation of those Greek VPN communications consisted of 12 people, according to an NSA document SPIEGEL has seen.

How the NSA is able to bypass VPN and HTTPS is still in question. I’m guessing the NSA’s ability to break HTTPS depends on how it’s implemented. Many sites, including ones as large as PayPal, fail to implement HTTPS in a secure manner. This may be an attempt to maintain backwards compatibility with older systems or it may be incompetence. Either way such sites certainly make the NSA’s job easier. VPNs, likewise, may be implementation dependent. Most VPN software is fairly complex, which makes configuring it in a secure manner difficult. Like HTTPS, it’s easy to put up a VPN server that isn’t secure.

The ultimate result of this information is that the tools we rely on will become more secure as people address the weaknesses being exploited by the NSA. Tools that cannot be improved will be replaced. Regardless of your personal feelings about Edward Snowden’s actions you must admit that they are making the Internet more secure.

Abusers Installing Spyware on Their Victims’ Computers

Last month I briefly mentioned the importance of full disk encryption. Namely, it prevents the contents of the hard drive from being altered unless one knows the decryption key. I had to deal with a friend’s significant other installing spyware on her system in order to keep tabs on who she was talking to and what she was doing. Her significant other didn’t know her login credentials, but since her hard drive wasn’t encrypted he was able to install the spyware with a boot disk. This threat model isn’t out of the ordinary. In fact it is becoming worryingly common:

Helplines and women’s refuge charities have reported a dramatic rise in the use of spyware apps to eavesdrop on the victims of domestic violence via their mobiles and other electronic devices, enabling abusers clandestinely to read texts, record calls and view or listen in on victims in real time without their knowledge.

The Independent has established that one device offering the ability to spy on phones is being sold by a major British high-street retailer via its website. The proliferation of software packages, many of which are openly marketed as tools for covertly tracking a “cheating wife or girlfriend” and cost less than £50, has prompted concern that police and the criminal justice system in Britain are failing to understand the extent of the problem and tackle offenders.

A survey by Women’s Aid, the domestic violence charity, found that 41 per cent of domestic violence victims it helped had been tracked or harassed using electronic devices. A second study this year by the Digital Trust, which helps victims of online stalking, found that more than 50 per cent of abusive partners used spyware or some other form of electronic surveillance to stalk their victims.

As a general rule security is assumed to be broken when an adversary has physical access. But that isn’t always the case. It really depends on how technically capable the threat is. Oftentimes in cases of domestic abuse the abuser is not technically savvy and relies on monitoring tools that are easy to procure and use.

Full disk encryption, while not a magic bullet, is pretty effective at keeping less technically capable threats from altering a drive’s contents without the owner’s knowledge. When encrypting the contents of a hard drive is not possible, either due to technical limitations or the threat of physical violence, the Tails Linux live distribution is a good tool. Tails is developed to maintain user anonymity and leave as few traces as possible that it was used. All Internet traffic on Tails is pumped through Tor, which prevents a threat monitoring your network from seeing what you’re looking at or who you’re talking to (but does not disguise the fact that you’re using Tor). That can enable a victim to communicate securely with an individual or group that can help. Since Tails boots from a USB stick or CD it can be easily removed and concealed.

As monitoring tools become easier to use, cheaper, and more readily available the need to learn computer security will become even greater. After all, the National Security Agency (NSA) isn’t the only threat your computing environment may face. Domestic abusers, corrupt (or “legitimate”) law enforcers, landlords, bosses, and any number of other people may wish to spy on you for various reasons.

PGP On the iPhone

I’m a big fan of OpenPGP. Not only do I use it to sign and encrypt e-mails but I also use it to sign and encrypt files that I upload to services such as Dropbox and Amazon S3. But, mostly due to a lack of time, I didn’t have much luck finding a decent iPhone app for OpenPGP. The main problem is that none of the OpenPGP apps are free and I don’t like spending money unless I know I’m getting a good product. I finally decided to drop a whopping $1.99 and try the app with the best reviews, iPGMail.
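For anyone wanting to replicate the desktop half of that workflow, the usual approach is to sign and encrypt with GnuPG before uploading. A minimal sketch (the recipient address and filenames are placeholders, and it assumes GnuPG is installed):

```python
import subprocess

# Sign and encrypt a file with GnuPG before uploading it to Dropbox
# or S3. "alice@example.com" stands in for a key in your own keyring.
cmd = [
    "gpg", "--sign", "--encrypt",
    "--recipient", "alice@example.com",
    "--output", "backup.tar.gpg",
    "backup.tar",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually invoke gpg
```

The resulting backup.tar.gpg is what gets uploaded; the cloud provider never sees the plaintext.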

Due to the limitations of the iPhone, namely that it doesn’t let you write plugins for other apps, iPGMail and other iOS OpenPGP solutions aren’t as slick as something like GPGTools. But iPGMail is as easy to use as you’re going to get. You can either copy the encrypted body of an e-mail and paste it into iPGMail to decrypt it or, if the encrypted e-mail came in as an attachment (which is what I always do), you can tap and hold the attachment icon and the option of opening it with iPGMail will appear. Additionally you can encrypt and upload or download and decrypt files from Dropbox, which is a feature I appreciate.

The app allows you to generate 4096-bit keypairs or, more importantly to me, import an already existing keypair. Because my e-mail server lives in my apartment I just e-mailed my keypair to myself (in an encrypted format, of course). When I opened the e-mail on the iPhone and tapped and held the attachment icon I was able to open it in iPGMail and import it.

I’m not saying that this app is the best thing since sliced bread because I haven’t had a lot of time to play with it. But so far I like what I see and it has done everything I’ve wanted in an OpenPGP app on my iPhone.

The Impact of Edward Snowden

As if anybody still had questions about whether Edward Snowden’s actions resulted in a safer Internet, we now have a survey with some interesting results:

There’s a new international survey on Internet security and trust, of “23,376 Internet users in 24 countries,” including “Australia, Brazil, Canada, China, Egypt, France, Germany, Great Britain, Hong Kong, India, Indonesia, Italy, Japan, Kenya, Mexico, Nigeria, Pakistan, Poland, South Africa, South Korea, Sweden, Tunisia, Turkey and the United States.” Amongst the findings, 60% of Internet users have heard of Edward Snowden, and 39% of those “have taken steps to protect their online privacy and security as a result of his revelations.”

[…]

I ran the actual numbers country by country, combining data on Internet penetration with data from this survey. Multiplying everything out, I calculate that 706 million people have changed their behavior on the Internet because of what the NSA and GCHQ are doing. (For example, 17% of Indonesians use the Internet, 64% of them have heard of Snowden and 62% of them have taken steps to protect their privacy, which equals 17 million people out of its total 250-million population.)
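The quoted back-of-the-envelope calculation for Indonesia checks out; here it is replicated with the figures given in the quote:

```python
population = 250_000_000   # Indonesia's population, per the quote
internet_users = 0.17      # 17% of the population is online
heard_of_snowden = 0.64    # 64% of those have heard of Snowden
took_steps = 0.62          # 62% of those changed their behavior

people = population * internet_users * heard_of_snowden * took_steps
print(f"{people:,.0f}")    # prints 16,864,000 — roughly the quoted 17 million
```

Repeating the same multiplication per country, weighted by each country’s Internet penetration, is how the 706 million figure above was reached.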

After we learned about the National Security Agency’s (NSA) massive domestic spying program a lot of people who previously didn’t care about security suddenly began showing an interest. I saw this firsthand when participating in several local CryptoParties. Past attempts to get enough people interested to even bother throwing one failed miserably, but after Snowden let us all in on the game interest spiked. I’m still busy assisting people interested in computer security because of Snowden. And that’s just individuals who developed a personal interest. Many companies, including Google, Apple, and Microsoft, have greatly increased their security as well.

In addition to better security Snowden’s leaks have also been good for agorism.

So I think it’s pretty clear that Snowden’s actions ended up benefiting us all greatly.

At Least It’ll Be a Legal Surveillance State Now

A lot of people arguing against the National Security Agency’s (NSA) mass surveillance apparatus are doing so by pointing out its illegal nature. The Fourth Amendment and a bunch of other words on pieces of paper have been cited. It looks like our overlords in Washington DC have finally tired of hearing these arguments. They’re now using their monopoly on issuing decrees to make state spying totally legal in every regard:

Last night, the Senate passed an amended version of the intelligence reauthorization bill with a new Sec. 309—one the House never has considered. Sec. 309 authorizes “the acquisition, retention, and dissemination” of nonpublic communications, including those to and from U.S. persons. The section contemplates that those private communications of Americans, obtained without a court order, may be transferred to domestic law enforcement for criminal investigations.

To be clear, Sec. 309 provides the first statutory authority for the acquisition, retention, and dissemination of U.S. persons’ private communications obtained without legal process such as a court order or a subpoena. The administration currently may conduct such surveillance under a claim of executive authority, such as E.O. 12333. However, Congress never has approved of using executive authority in that way to capture and use Americans’ private telephone records, electronic communications, or cloud data.

There you have it, all those arguments about NSA spying being illegal can finally be put to rest!

This is why I don’t hold out any hope for political solutions. So long as you rely on your rulers to define what is and isn’t legal you are forever at their mercy. And they are very interested in keeping you under their boots. But technical solutions exist that can render widespread spying, if not entirely impotent, prohibitively expensive. Many have pointed out to me that if you are targeted by the government you’re fucked no matter what. That is true. If the government wants you dead it’s well within its power to kill you. But the task is not to save yourself once you are already a target. What cryptography tools do is keep you from becoming a target in the first place and raise the costs involved in pursuing you if you do become one.

It costs very little for agencies such as the NSA to slurp up and comb through unencrypted data. Encrypted data is another story. Even if the NSA has the ability to break the encryption it has no way of knowing which encrypted data is useful and which is useless without breaking it first. And breaking encryption isn’t a zero-cost game. Most people arguing that the NSA can break encryption use supercomputers as their plot device. Supercomputers aren’t cheap to operate; they consume a lot of electricity. There are also the costs involved in hiring cryptanalysts capable of providing the knowledge necessary to break encryption. People with such a knowledge base aren’t cheap and you need them on hand at all times because encryption is constantly improving. The bottom line is that the more encrypted data there is the more resources the state has to invest in breaking it. Anonymity tools add another layer of difficulty because even if you decrypt anonymous data you can’t tie it to anybody.

Widespread use of cryptography makes widespread surveillance expensive because the only way to find anything is to crack everything. Political solutions are irrelevant because even if the rules of today make widespread surveillance illegal the rulers of tomorrow can reverse that decision.

Maintaining Backwards Compatibility Isn’t Doing Users Any Favors

I’m kind of at a loss as to why so many major websites have failed to disable legacy protocols and ciphers. The only two reasons I can think of are that those companies employ lazy administrators or, more likely, that they’re trying to maintain backwards compatibility.

The Internet has become more integrated into our lives since the 1990s, and in that time a lot of software has ceased being maintained by its developers. Windows XP, for example, is no longer supported by Microsoft. Old versions of Internet Explorer have also fallen by the wayside. But many websites still maintain backwards compatibility with old versions of Windows and Internet Explorer because, sadly, they’re still used in a lot of places. While maintaining backwards compatibility seems like a service to the customer it’s really a disservice.

The idea seems simple enough: customers don’t want to invest money in new systems, so backwards compatibility should be maintained to cater to them. However this puts them, and customers with modern systems, at risk. Consider Windows XP with Internet Explorer 6, the classic example. They’re both ancient. Internet Explorer 6 is not only old and unmaintained but has a history of being Swiss cheese when it comes to security. If your customers or employees are using Internet Explorer 6 then they’re at risk of having their systems compromised and their data stolen. That is admittedly an extreme example but I believe it makes my point.

Currently TLS is at version 1.2, but many older browsers, including Internet Explorer 7 through 10, only support TLS 1.0 by default, and that is most likely the next protocol to fall. Once that happens (because it’s a matter of when, not if) anybody relying on TLS 1.0 will be vulnerable. Any website that keeps TLS 1.0 available after that point will be putting the data of those users at risk.
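Dropping the legacy protocols is usually a one-line policy decision in modern TLS libraries. As a minimal sketch, here’s how a client built on Python’s standard ssl module can refuse anything older than TLS 1.2 (the module and attributes are real; the TLS 1.2 floor is the point being illustrated, not a universal default):

```python
import ssl

# A client context that refuses SSLv3, TLS 1.0, and TLS 1.1 outright.
# Connections made through it only succeed against servers that can
# negotiate at least TLS 1.2.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

assert context.minimum_version == ssl.TLSVersion.TLSv1_2
```

Server-side configuration is analogous; in most web servers it amounts to removing the legacy protocols from the allowed list (e.g. nginx’s ssl_protocols directive).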

It’s 2014. Backwards compatibility needs to be discarded when it involves decreasing security. While this does inconvenience customers to some extent it isn’t nearly as inconvenient as identity theft or account hijackings. As a general rule I operate under the principle that I won’t support something if the manufacturer doesn’t support it. And if the manufacturer supports it poorly then I will stop bothering to support it as soon as that support requires decreasing security.

And as an aside we can dump unsecured HTTP connections now. Seriously. There is no purpose for them.

POODLE Attack Capable of Bypassing Some TLS Installations

SSLv3 is dead and POODLE killed it. After news of the attack was made public, web administrators were urged to finally disable SSLv3 and use only TLS for secure communications. But the security gods are cruel. It turns out that some implementations of TLS are vulnerable to the POODLE attack as well:

On Monday, word emerged that there’s a variation on the POODLE attack that works against widely used implementations of TLS. At the time this post was being prepared, SSL Server Test, a free service provided by security firm Qualys, showed that some of the Internet’s top websites—again, a list including Bank of America, VMware, the US Department of Veteran’s Affairs, and Accenture—are susceptible. The vulnerability was serious enough to earn all sites found to be affected a failing grade by the Qualys service.

Qualys’s SSL Labs testing tool is a wonderful piece of software. It tests for various SSL vulnerabilities including this new POODLE variant. Using it I was able to confirm, quite happily, that this site is not vulnerable (check out that sexy A rating). But I’m a dick so I also checked a few other sites to see what everybody else was doing. My favorite result was PayPal’s gigantic F rating:

[Screenshot: PayPal’s SSL Labs report showing an F rating]

PayPal is a major online transaction provider. You would think that its server administrators would keep everything as locked down as possible. But they’re apparently sleeping on the job. It should be embarrassing to a company like PayPal that a single individual running a few hobby sites has tighter security.

If you administer any websites you should check your setup to make sure your secure connections are up to snuff (and that unsecured connections are disabled entirely, because it’s 2014 and nobody should be communicating across the Internet in the clear).
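One way to run such a check yourself is to attempt handshakes with a client pinned to a single protocol version. A sketch using Python’s standard ssl module (whether the oldest versions can even be negotiated depends on how your local OpenSSL was built):

```python
import ssl

def probe_context(version):
    """Client context pinned to exactly one protocol version, for
    testing whether a server still accepts that version."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False        # this is a protocol probe, not a trust check
    ctx.verify_mode = ssl.CERT_NONE
    ctx.minimum_version = version
    ctx.maximum_version = version     # pin both ends to the same version
    return ctx

# A context pinned to TLS 1.2. Pinning to ssl.TLSVersion.TLSv1 or
# ssl.TLSVersion.SSLv3 instead, then calling ctx.wrap_socket() against
# your own server, reveals whether the legacy protocol is still enabled.
probe = probe_context(ssl.TLSVersion.TLSv1_2)
assert probe.maximum_version == ssl.TLSVersion.TLSv1_2
```

A handshake that succeeds with a legacy-pinned context means the server still offers that protocol and should be reconfigured; note that actually wrapping a socket requires network access to the server under test.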