The Privacy Dangers of Body Camera Equipped Police

I’ve been saying how ineffective body cameras on police will be, but after seeing some of the things posted by my friend Kurtis Hannah I am now convinced that they will also bring a new wave of surveillance and privacy violations.

We already live in a world where much of our activity is recorded by cameras. Department stores, gas stations, hospitals, and pretty much everywhere else employ security cameras. While I don’t like being recorded at all of these places, I also acknowledge that, in most cases, they won’t send men with guns after me unless I’ve done something legitimately bad (because that’s usually the only time the footage is reviewed). Police footage, especially in this day and age where the National Security Agency (NSA) already has a massive surveillance apparatus, could be employed differently. It’s not unimaginable that police departments would employ people to review all footage from body cameras to find potential criminal offenses that the officer missed. Such a large amount of footage could also enable police to track individuals by running facial recognition software against body camera footage. That wouldn’t be unprecedented, since many departments already do something similar with automatic license plate scanners.

This puts us in a really bad spot. On the one hand, we cannot trust the police to go about their activities unsupervised. Recording their actions at all times while they’re on duty, and streaming that footage live for anybody to access at any point, is the only way any semblance of accountability can exist. But doing that will also violate the privacy of anybody who happens to be within range of an officer’s camera.

What’s the solution? In my opinion the only viable solution is to toss out the entire institution of modern policing and replace it with something better. That something better will have to be decentralized by nature and not in any way associated with the state, which seems impossible to implement given the controlling nature of today’s state. But until that happens there will be no accountability, and the only “solutions” offered to us will be ones that better enable the police to keep us under their boots.

PGP On the iPhone

I’m a big fan of OpenPGP. Not only do I use it to sign and encrypt e-mails but I also use it to sign and encrypt files that I upload to services such as Dropbox and Amazon S3. But, mostly due to a lack of time, I didn’t have much luck finding a decent iPhone app for OpenPGP. The main problem is that none of the OpenPGP apps are free and I don’t like spending money unless I know I’m getting a good product. I finally decided to drop a whopping $1.99 and try the app that had the best reviews, iPGMail.
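
For the curious, the desktop half of that workflow, signing and encrypting a file before it ever touches Dropbox or S3, is a single gpg invocation. Here’s a minimal sketch driving it from Python; the key ID and file names are placeholders, and it assumes gpg is installed with a keypair already generated:

```python
import subprocess

KEY_ID = "you@example.com"   # hypothetical: the key to sign with and encrypt to
PLAINTEXT = "backup.tar"     # hypothetical file to protect before uploading
CIPHERTEXT = PLAINTEXT + ".gpg"

# Sign and encrypt the file to your own key. Dropbox or S3 only ever sees
# the resulting ciphertext; getting the plaintext back later is just
# `gpg --decrypt`.
subprocess.run([
    "gpg", "--sign", "--encrypt",
    "--recipient", KEY_ID,
    "--output", CIPHERTEXT,
    PLAINTEXT,
], check=True)
```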

Due to the limitations of the iPhone, namely that it doesn’t let you write plugins for other apps, iPGMail and other iOS OpenPGP solutions aren’t as slick as something like GPGTools. But iPGMail is as easy to use as you’re going to get. You can copy the encrypted body of an e-mail and paste it into iPGMail to decrypt it or, if the encrypted e-mail came in as an attachment (which is what I always do), you can tap and hold on the attachment icon and the option of opening it with iPGMail will appear. Additionally, you can encrypt and upload or download and decrypt files from Dropbox, which is a feature I appreciate.

The app allows you to generate 4096-bit keypairs or, more importantly to me, import an already existing keypair. Because my e-mail server lives in my apartment, I just e-mailed my keypair to myself (in an encrypted format, of course). When I opened the e-mail on the iPhone and tapped and held the attachment icon, I was able to open it in iPGMail and import it.
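
For reference, the desktop side of that transfer is also just a couple of gpg invocations. A rough sketch, with the key ID as a placeholder; the exported secret key is already protected by its passphrase, but wrapping it in an extra layer of symmetric encryption before attaching it to an e-mail doesn’t hurt:

```python
import subprocess

KEY_ID = "you@example.com"  # hypothetical key ID

# Export the secret key as ASCII-armored text...
subprocess.run([
    "gpg", "--armor",
    "--output", "secret-key.asc",
    "--export-secret-keys", KEY_ID,
], check=True)

# ...then wrap it in an additional layer of passphrase-based encryption
# before e-mailing it. gpg will prompt for a passphrase.
subprocess.run([
    "gpg", "--symmetric",
    "--output", "secret-key.asc.gpg",
    "secret-key.asc",
], check=True)
```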

I’m not saying that this app is the best thing since sliced bread because I haven’t had a lot of time to play with it. But so far I like what I see and it has done everything I’ve wanted in an OpenPGP app on my iPhone.

The Impact of Edward Snowden

In case anybody still had questions about whether or not Edward Snowden’s actions resulted in a safer Internet, we now have a survey with some interesting results:

There’s a new international survey on Internet security and trust, of “23,376 Internet users in 24 countries,” including “Australia, Brazil, Canada, China, Egypt, France, Germany, Great Britain, Hong Kong, India, Indonesia, Italy, Japan, Kenya, Mexico, Nigeria, Pakistan, Poland, South Africa, South Korea, Sweden, Tunisia, Turkey and the United States.” Amongst the findings, 60% of Internet users have heard of Edward Snowden, and 39% of those “have taken steps to protect their online privacy and security as a result of his revelations.”

[…]

I ran the actual numbers country by country, combining data on Internet penetration with data from this survey. Multiplying everything out, I calculate that 706 million people have changed their behavior on the Internet because of what the NSA and GCHQ are doing. (For example, 17% of Indonesians use the Internet, 64% of them have heard of Snowden and 62% of them have taken steps to protect their privacy, which equals 17 million people out of its total 250-million population.)
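
The quoted Indonesia example is easy to sanity check. A quick back-of-the-envelope in Python, using only the population and percentage figures from the quote:

```python
population = 250_000_000  # Indonesia's population, per the quote
internet   = 0.17         # share of Indonesians online
heard      = 0.64         # share of those who have heard of Snowden
acted      = 0.62         # share of those who changed their behavior

print(round(population * internet * heard * acted))  # about 16.9 million, i.e. "17 million"
```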

After we learned about the National Security Agency’s (NSA) massive domestic spying program, a lot of people who previously didn’t care about security suddenly began showing an interest. I saw this firsthand when participating in several local CryptoParties. Past attempts to even get enough people together to bother throwing one failed miserably, but after Snowden let us all in on the game interest spiked. I’m still busy assisting people who became interested in computer security because of Snowden. And that’s just individuals who developed a personal interest. Many companies, including Google, Apple, and Microsoft, have greatly improved their security.

In addition to better security Snowden’s leaks have also been good for agorism.

So I think it’s pretty clear that Snowden’s actions ended up benefiting us all greatly.

Nothing Says Secure Communications Like a Backdoor

Since Snowden aired the National Security Agency’s (NSA) dirty laundry, security-conscious people have been scrambling to find more secure means of communication. Most of the companies called out in the leaked documents have been desperately trying to regain the confidence of their customers. Google and Apple have enabled full device encryption on their mobile operating systems by default, many websites have either added HTTPS communications or moved exclusively to HTTPS, and many apps have been released claiming to enable communications free from the prying eyes of Big Brother. Verizon decided to jump on the bandwagon but failed miserably:

Verizon Voice Cypher, the product introduced on Thursday with the encryption company Cellcrypt, offers business and government customers end-to-end encryption for voice calls on iOS, Android, or BlackBerry devices equipped with a special app. The encryption software provides secure communications for people speaking on devices with the app, regardless of their wireless carrier, and it can also connect to an organization’s secure phone system.

Cellcrypt and Verizon both say that law enforcement agencies will be able to access communications that take place over Voice Cypher, so long as they’re able to prove that there’s a legitimate law enforcement reason for doing so.

Security is an all-or-nothing thing. If you implement a method for law enforcement to access communications, you also allow everybody else to access those communications. Backdoors are purposely built weaknesses in the security capabilities of a software package. While developers will often claim that only authorized entities can gain access through a backdoor, in reality anybody with knowledge of how the backdoor works can use it.
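
To make the point concrete, here’s a toy sketch of what a key-escrow style backdoor amounts to. It uses the third-party cryptography package and is obviously not Verizon’s actual design; the point is simply that a copy of every session key exists under an “escrow” key, and anybody who obtains that escrow key, lawfully or otherwise, can read everything protected by it:

```python
from cryptography.fernet import Fernet

# Keys: one for the intended recipient, plus the "lawful access" escrow key
# that a backdoored system keeps somewhere outside the conversation.
recipient_key = Fernet.generate_key()
escrow_key = Fernet.generate_key()

def backdoored_send(message: bytes):
    """Encrypt a message with a fresh session key, then wrap that session
    key for the recipient AND for the escrow holder."""
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(message)
    for_recipient = Fernet(recipient_key).encrypt(session_key)
    for_escrow = Fernet(escrow_key).encrypt(session_key)
    return ciphertext, for_recipient, for_escrow

ciphertext, for_recipient, for_escrow = backdoored_send(b"meet me at noon")

# The recipient reads the message as intended...
session = Fernet(recipient_key).decrypt(for_recipient)
assert Fernet(session).decrypt(ciphertext) == b"meet me at noon"

# ...but so can anybody holding the escrow key: a court order, a rogue
# employee, or an attacker who stole it.
session = Fernet(escrow_key).decrypt(for_escrow)
print(Fernet(session).decrypt(ciphertext))  # b'meet me at noon'
```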

Matters are made worse by the fact that law enforcement access is exactly the problem everybody is trying to fix. The NSA was surveilling the American people in secret. A lot of people have also been questioning the amount of surveillance being performed by local law enforcement agencies. Since there is a complete absence of oversight and transparency, nobody knows how pervasive the problem is, which means we must assume the worst case and act as if local departments are spying on everything they can. Tools like the one just released by Verizon don’t improve the situation at all.

Encryption as Agorism

Encryption as agorism is something I’ve been thinking about recently.

Agorism, at least in my not so humble opinion, involves both withholding resources from the state and making the state expend the resources it currently possesses. Bleed them dry and don’t allow a transfusion, if you will.

Widespread surveillance is relatively cheap today because a lot of data is unencrypted. That’s unfortunate, because encrypting that data would greatly raise the resources necessary to implement a widespread surveillance system.

Let’s assume the conspiracy theorists are correct and the government is in possession of magical supercomputers derived from lizard people technology. Even with such a magical device the cost of breaking encryption is greater than the costs of viewing plaintext data. In order to even know whether or not encrypted data may be useful you must decrypt it. Until it’s decrypted you have no idea what you’ve collected. Is it a video? Is it a phone call? Is it an e-mail? Who knows!

Now let’s look at reality. Even if the state possesses powerful computers that can break encryption in a useful amount of time, those systems aren’t cheap (if they were cheap we would all have them). Any system dedicated to breaking a piece of encrypted data cannot be used for other tasks. That means the more encrypted data that needs to be broken, the more supercomputers have to be operated. And supercomputers take a ton of power to operate. On top of that you also need cryptanalysts with the knowledge necessary to break encryption, and they don’t work cheap (nor are they in abundance). Because encryption is constantly improving, you need to keep those cryptanalysts on hand at all times. You also need coders capable of taking the cryptanalysts’ knowledge and turning it into software that can actually do the work. And I haven’t even gotten into the costs involved in maintaining, housing, and cooling the supercomputers.
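
To put some rough numbers on even the simplest case, here’s a back-of-the-envelope sketch for brute-forcing a single 128-bit key. The hardware figures are made up and deliberately generous:

```python
# Deliberately generous, made-up assumptions: a million machines, each
# testing a trillion keys per second, all working on one 128-bit key.
keys_to_try = 2 ** 128 / 2     # on average the key turns up halfway through
machines = 1_000_000
keys_per_second_per_machine = 1_000_000_000_000

seconds = keys_to_try / (machines * keys_per_second_per_machine)
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.2e} years")    # on the order of 10**12 years per key
```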

The bottom line is that using encryption can certainly be seen as a form of agorism if you’re operating under a surveillance state like we are in the United States. Spying on individuals who use encryption requires far more resources than spying on individuals who communicate in plaintext. Therefore I would argue that agorists should work to ensure as much data as possible is encrypted.

At Least It’ll Be a Legal Surveillance State Now

A lot of people arguing against the National Security Agency’s (NSA) mass surveillance apparatus are doing so by pointing out its illegal nature. The Fourth Amendment and a bunch of other words on pieces of paper have been cited. It looks like our overlords in Washington DC have finally tired of hearing these arguments. They’re now using their monopoly on issuing decrees to make state spying totally legal in every regard:

Last night, the Senate passed an amended version of the intelligence reauthorization bill with a new Sec. 309—one the House never has considered. Sec. 309 authorizes “the acquisition, retention, and dissemination” of nonpublic communications, including those to and from U.S. persons. The section contemplates that those private communications of Americans, obtained without a court order, may be transferred to domestic law enforcement for criminal investigations.

To be clear, Sec. 309 provides the first statutory authority for the acquisition, retention, and dissemination of U.S. persons’ private communications obtained without legal process such as a court order or a subpoena. The administration currently may conduct such surveillance under a claim of executive authority, such as E.O. 12333. However, Congress never has approved of using executive authority in that way to capture and use Americans’ private telephone records, electronic communications, or cloud data.

There you have it, all those arguments about NSA spying being illegal can finally be put to rest!

This is why I don’t hold out any hope for political solutions. So long as you rely on your rulers to define what is and isn’t legal you are forever at their mercy. And they are very interested in keeping you under their boots. But technical solutions exist that can render widespread spying, if not entirely impotent, prohibitively expensive. Many have pointed out to me that if you are targeted by the government you’re fucked no matter what. That is true. If the government wants you dead it’s well within its power to kill you. The task is not to save yourself if you are being targeted, though. What cryptography tools do is keep you from becoming a target in the first place and raise the costs involved in pursuing you if you do become one.

It costs very little for agencies such as the NSA to slurp up and comb through unencrypted data. Encrypted data is another story. Even if the NSA has the ability to break the encryption, it has no way of knowing which encrypted data is useful and which is useless without breaking it first. And breaking encryption isn’t a zero-cost game. Most people arguing that the NSA can break encryption use supercomputers as their plot device. Supercomputers aren’t cheap to operate. They take a lot of electricity. There are also the costs involved in hiring cryptanalysts capable of providing the knowledge necessary to break encryption. People with such a knowledge base aren’t cheap and you need them on hand at all times because encryption is constantly improving. The bottom line is that the more encrypted data there is, the more resources the state has to invest into breaking it. Anonymity tools add another layer of difficulty because even if you decrypt anonymized data you can’t tie it to anybody.

Widespread use of cryptography makes widespread surveillance expensive because the only way to find anything is to crack everything. Political solutions are irrelevant because even if the rules of today make widespread surveillance illegal the rulers of tomorrow can reverse that decision.

Maintaining Backwards Compatibility Isn’t Doing Users Any Favors

I’m kind of at a loss as to why so many major websites have failed to disable legacy protocols and ciphers. The only two reasons I can think of are that those companies employ lazy administrators or, more likely, they’re trying to maintain backwards compatibility.

The Internet has become far more integrated into our lives since the 1990s. Since then a lot of software has ceased being maintained by its developers. Windows XP, for example, is no longer supported by Microsoft. Old versions of Internet Explorer have also fallen by the wayside. But many websites still maintain backwards compatibility with old versions of Windows and Internet Explorer because, sadly, they’re still used in a lot of places. While maintaining backwards compatibility seems like a service to the customer, it’s really a disservice.

The idea seems simple enough: customers don’t want to invest money in new systems, so backwards compatibility should be maintained to cater to them. However, this puts them, and customers with modern systems, at risk. Consider Windows XP with Internet Explorer 6, the classic example. They’re both ancient. Internet Explorer 6 is not only old and unmaintained but has a history of being Swiss cheese when it comes to security. If your customers or employees are using Internet Explorer 6 then they’re at risk of having their systems compromised and their data stolen. Obviously that is a more extreme example, but I believe it makes my point.

Currently TLS is at version 1.2, but many older browsers, including Internet Explorer 7 through 10, only enable TLS 1.0 by default, and that is most likely the next protocol to fall. Once that happens (because it’s a matter of when, not if), anybody using Internet Explorer 7 through 10 will be vulnerable. Any website that keeps TLS 1.0 available after that point will be putting those users’ data at risk.
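
If you administer a site and want to see which protocol versions it still accepts, a quick probe is easy to script. Here’s a minimal sketch using Python’s standard ssl module; the host name is a placeholder, and note that modern OpenSSL builds may refuse to even offer TLS 1.0 or 1.1 on the client side, in which case a dedicated scanner is the better tool:

```python
import socket
import ssl

HOST = "www.example.com"  # hypothetical: the site you want to test

def accepts(version):
    """Return True if the server completes a handshake at exactly this TLS version."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # we only care about the protocol, not the certificate
    try:
        ctx.minimum_version = version
        ctx.maximum_version = version
        with socket.create_connection((HOST, 443), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST):
                return True
    except (ValueError, ssl.SSLError, OSError):
        return False

for name, version in [("TLS 1.0", ssl.TLSVersion.TLSv1),
                      ("TLS 1.1", ssl.TLSVersion.TLSv1_1),
                      ("TLS 1.2", ssl.TLSVersion.TLSv1_2)]:
    print(name, "accepted" if accepts(version) else "rejected")
```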

It’s 2014. Backwards compatibility needs to be discarded if it involves decreasing security. While this does inconvenience customers to some extent, it isn’t nearly as inconvenient as identity theft or account hijackings. As a general rule I operate under the principle that I won’t support something if the manufacturer doesn’t support it. And if the manufacturer does support it, but poorly, I will stop bothering to support it as soon as that support requires decreasing security.

And as an aside we can dump unsecured HTTP connections now. Seriously. There is no purpose for them.

POODLE Attack Capable of Bypassing Some TLS Installations

SSLv3 is dead and POODLE killed it. After news of the attack was made public web administrators were urged to finally disable SSLv3 and only use TLS for secure communications. But the security gods are cruel. It turns out that some installations of TLS are vulnerable to the POODLE attack as well:

On Monday, word emerged that there’s a variation on the POODLE attack that works against widely used implementations of TLS. At the time this post was being prepared, SSL Server Test, a free service provided by security firm Qualys, showed that some of the Internet’s top websites—again, a list including Bank of America, VMware, the US Department of Veteran’s Affairs, and Accenture—are susceptible. The vulnerability was serious enough to earn all sites found to be affected a failing grade by the Qualys service.

Qualys’s SSL Labs testing tool is a wonderful piece of software. It tests for various SSL vulnerabilities including this new POODLE exploit. Using it I was able to confirm, quite happily, that this site is not vulnerable (check out that sexy A rating). But I’m a dick, so I also checked a few other sites to see what everybody else was doing. My favorite result was PayPal’s gigantic F rating:

[Image: SSL Labs scan showing PayPal’s F rating]

PayPal is a major online transaction provider. You would think that their server administrators would be keeping everything as locked down as possible. But they’re apparently sleeping on the job. It should be embarrassing to a company like PayPal that a single individual running a few hobby sites has tighter security.

But if you administer any websites you should check your setup to make sure your secure connections are up to snuff (and that unsecured connections are disabled entirely, because it’s 2014 and nobody should be communicating across the Internet in the clear).
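
Qualys also exposes the same scanner as a public REST API, so the check can be scripted. A rough sketch, assuming the v3 analyze endpoint and its usual parameters and fields (verify against the current API documentation before relying on it); the host name is a placeholder:

```python
import json
import time
import urllib.request

HOST = "www.example.com"  # hypothetical: the site you administer
API = "https://api.ssllabs.com/api/v3/analyze"

def fetch(params: str) -> dict:
    """Issue one request against the (assumed) SSL Labs analyze endpoint."""
    with urllib.request.urlopen(f"{API}?{params}") as response:
        return json.load(response)

# Kick off a fresh assessment, then poll until it finishes; a full scan can
# take several minutes per host.
report = fetch(f"host={HOST}&startNew=on&publish=off")
while report.get("status") not in ("READY", "ERROR"):
    time.sleep(30)
    report = fetch(f"host={HOST}&publish=off")

for endpoint in report.get("endpoints", []):
    print(endpoint.get("ipAddress"), endpoint.get("grade"))
```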

Everybody is In On the Surveillance Game

This has been a bad week for my laptop. Last week my battery gave up the ghost. On Sunday the hard drive died. Finally on Monday the spare hard drive I swapped into the laptop committed seppuku. Since the hard drive I dropped in on Sunday night was my last spare drive I had to make a trip to the local computer parts emporium to acquire another one. While searching through the hard drives I came across something rather funny:

[Image: Western Digital “surveillance” hard drive on the store shelf]

That must be Western Digital’s National Security Agency (NSA) edition hard drive.

Also, as a side note, when it comes time to choose a name for your laptop don’t choose Loki. Just throwing that out there.

Irony at Its Finest

Anonymity is very important, which is why I hold Tor’s developers in high regard. Tor has helped political dissidents in especially tyrannical regimes speak out, made the drug trade safer by raising a barrier of anonymity between buyers and sellers, and given people with jealous significant others a way to keep their communications secret. So when I see somebody harass any of the Tor developers my initial reaction is “Fuck that guy!”

Well an unsavory dude decided to harass Andrea Shepard, one of Tor’s developers, and learned a lesson about how valuable online anonymity is:

What happens when you troll Tor developers hard? You get unmasked.

Towards the end of last week, a troll who had sent various aggressive tweets to a host of security experts and privacy advocates associated with the Tor project and browser, which enables online anonymity, had his identity exposed. To some, that may seem hypocritical. To others, it seems like justice.

Andrea Shepard, the Tor developer who uncovered the real identity of her troll, says she was being harassed on and off for a year by a range of tweeters, all believed to be the sockpuppets of one man. The main source of abuse came from a Twitter account @JbJabroni10, but others included @JbGelasius, @SnowdenNoffect, @LimitYoHangout, @HaileSelassieYo, @thxsnowman and @PsyOpSnowden.

[…]

Things came to a head when some lighter mockery was aimed at Shepard last week, using information the troll had gleaned from her LinkedIn profile and personal website.

Unfortunately for the troll, this gave Shepard an IP address belonging to an iPhone that used a work network at atlantichealth.org to access her site. She also had some job information through LinkedIn’s “profiles that viewed yours” feature.

After searching LinkedIn for anyone with the role at Atlantic Health, she came across two profiles: one which didn’t have a name connected to it, another for a man named Jeremy Becker. She then used the Spokeo service to search for Jeremy Beckers in New Jersey, and a search for pharmacist licensees, and found only one, which gave her the middle initial ‘T’ and a hometown of Princeton, New Jersey.

She also had his father’s name, Edward Becker, and was able to find a Twitter account @ebecker which followed @JbJabroni10 and an inactive one for @JoyBecker52, apparently matching his mother Joyce Becker. Shepard had her man.

And then on 28 November, seven of the Twitter accounts linked to Becker seemed to go dark. He’d been scared off the face of the internet, to the cheers of the pro-Tor and anti-troll crowds.

Now that’s justice porn. And it should prove to be a valuable lesson to others who feel it necessary to harass security professionals. If somebody’s job is developing one of the most successful online anonymity tools chances are pretty good that they know how to uncover personally identifiable information. After all, you need to know how an attack works in order to defend against it.