Archive for the ‘Encrypt Everything’ Category
Earlier this week the United States Congress decided to repeal privacy protection laws that it had previously put into place on Internet Service Providers (ISPs). While a lot of people have been wasting their time begging their representatives, er, masters, with phone calls, e-mails, and petitions, private companies have begun announcing methods to actually protect their users’ privacy. In the latest example of this, Pornhub announced that it will turn on HTTPS across its entire site:
On April 4, both Pornhub and its sister site, YouPorn, will turn on HTTPS by default across the entirety of both sites. By doing so, they’ll make not just adult online entertainment more secure, but a sizable chunk of the internet itself.
The Pornhub announcement comes at an auspicious time. Congress this week affirmed the power of cable providers to sell user data, while as of a few weeks ago more than half the web had officially embraced HTTPS. Encryption doesn’t solve your ISP woes altogether—they’ll still know that you were on Pornhub—but it does make it much harder to know what exactly you’re looking at on there.
As the article points out, your ISP will still be able to tell that you accessed Pornhub, since Domain Name System (DNS) lookups are generally not secured, but it won’t be able to see what content you’re accessing. As for DNS lookups, solutions to secure them are already being worked on, and projects like DNSCrypt, which provides encrypted DNS lookups, are available today.
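For the curious, the same idea DNSCrypt embodies can be sketched with DNS-over-HTTPS using nothing but Python’s standard library. This is an illustration of the concept, not DNSCrypt itself: the Cloudflare resolver endpoint and its JSON API are assumptions, and any DoH resolver with a compatible interface would work.

```python
import json
import urllib.request

RESOLVER = "https://cloudflare-dns.com/dns-query"  # assumed DoH endpoint

def build_doh_request(name, rtype="A"):
    """Build a DNS-over-HTTPS request; the lookup rides inside TLS."""
    return urllib.request.Request(
        f"{RESOLVER}?name={name}&type={rtype}",
        headers={"Accept": "application/dns-json"},
    )

def doh_query(name, rtype="A"):
    """Resolve a name without leaking the query to the local network."""
    with urllib.request.urlopen(build_doh_request(name, rtype)) as resp:
        return json.load(resp)

# doh_query("example.com")["Answer"] would list the resolved addresses.
```

Because the lookup rides inside an ordinary TLS connection, a snoop on the local network sees only that you contacted the resolver, not which hostname you asked about.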
If you want to protect your privacy you can’t rely on the State’s regulations. First, the State is the worst offender when it comes to surveillance and the consequences of its surveillance are far worse. Sure, your ISP might sell some of your data but the State will send men with guns to your home to kidnap you and probably shoot your dog. Second, as this situation perfectly illustrates, government regulations are temporary. The government implemented the privacy regulations and then took them away. It may restore them again in the future but there’s no guarantee it won’t repeal them again. Any government solution is temporary at best.
Cryptography offers a permanent solution that can protect Internet users from both their snoopy ISP and government. HTTPS and DNSCrypt will continue to work regardless of the state of privacy regulations.
A while back I wrote a handful of introductory guides on using Pretty Good Privacy (PGP) to encrypt the content of your e-mails. They were well-intentioned guides. After all, everybody uses e-mail so we might as well try to secure it as much as possible, right? What I didn’t stop to consider was the fact that PGP is a dead-end technology for securing e-mails, not because the initial learning curve is steep but because the implementation itself is flawed.
I recently came across a blog post by Filippo Valsorda that sums up the biggest issue with PGP:
But the real issues I realized are more subtle. I never felt confident in the security of my long term keys. The more time passed, the more I would feel uneasy about any specific key. Yubikeys would get exposed to hotel rooms. Offline keys would sit in a far away drawer or safe. Vulnerabilities would be announced. USB devices would get plugged in.
A long term key is as secure as the minimum common denominator of your security practices over its lifetime. It’s the weak link.
Worse, long term keys patterns like collecting signatures and printing fingerprints on business cards discourage practices that would otherwise be obvious hygiene: rotating keys often, having different keys for different devices, compartmentalization. It actually encourages expanding the attack surface by making backups of the key.
PGP, in fact the entire web of trust model, assumes that your private key will be more or less permanent. This assumption leads to a lot of implementation issues. What happens if you lose your private key? If you have an effective backup system you may laugh at this concern but lost private keys are the most common issue I’ve seen PGP users run into. When you lose your key you have to generate a new one and distribute it to everybody you communicate with. In addition to that, you also have to re-sign people’s existing keys. But worst of all, without your private key you can’t even revoke the corresponding published public key.
Another issue is that you cannot control the security practices of other PGP users. What happens when somebody who signed your key has their private key compromised? Their signature, which is used by others to decide whether or not to trust you, becomes meaningless because their private key is no longer confidential. Do you trust the security practices of your friends enough to make your own security practices reliant on them? I sure don’t.
PGP was a jury rigged solution to provide some security for e-mail. Because of that it has many limitations. For starters, while PGP can be used to encrypt the contents of a message it cannot encrypt the e-mail headers or the subject line. That means anybody snooping on the e-mail knows who the parties communicating are, what the subject is, and any other information stored in the headers. As we’ve learned from Edward Snowden’s leaks, metadata is very valuable. E-mail was never designed to be a secure means of communicating and can never be made secure. The only viable solution for secure communications is to find an alternative to e-mail.
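To make that limitation concrete, here’s a minimal sketch of what an observer still sees when a PGP-encrypted message travels over e-mail. The addresses and subject are hypothetical and the ciphertext is a placeholder:

```python
from email.message import EmailMessage

# Even when the body is PGP ciphertext, the envelope is ordinary cleartext.
msg = EmailMessage()
msg["From"] = "alice@example.com"    # hypothetical sender
msg["To"] = "bob@example.com"        # hypothetical recipient
msg["Subject"] = "Quarterly report"  # readable by anybody on the wire
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n...\n-----END PGP MESSAGE-----"
)

# Everything an eavesdropper needs for traffic analysis is in the headers:
print(msg["From"], msg["To"], msg["Subject"])
```

The sender, recipient, subject, timestamps, and routing information all remain visible; PGP only ever protects the body.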
With that said, PGP itself isn’t a bad technology. It’s still useful for signing binary packages, encrypting files for transferring between parties, and other similar tasks. But for e-mail it’s at best a bandage on a bigger problem and at worst a false sense of security.
I’m always on the lookout for good guides on privacy and security for beginners. Ars Technica posted an excellent beginner’s guide yesterday. It covers the basics that even less computer-savvy users can follow to improve their security: installing operating system and browser updates, enabling two-factor authentication, and using a password manager so you can have strong, unique passwords for every account.
If you’re not sure where to begin when it comes to security and privacy take a look at Ars’ guide.
The developers behind Signal, an application that allows you to send secure text messages and make secure phone calls, released a Chrome app some time ago. The Chrome app allowed you to link your Android device with the app so you could use Signal on a desktop or laptop computer. iOS users were left out in the cold, which annoyed me because I spend more time on my laptop than on my phone (also because I hate typing on my phone). Fortunately, Signal for iOS now supports linking with the Chrome app.
It’s simple to set up and works well. If you, like me, don’t use Chrome as your primary browser and don’t want to open it just to use Signal, you can right-click on the Signal app in Chrome and create a shortcut. On macOS the shortcut will be created in your ~/Applications/Chrome Apps/ folder (I have no idea where it puts it on Windows or Linux). Once created, you can drag the Signal shortcut to the dock.
You really need to use full disk encryption on all of your electronic devices. Modern versions of OS X and Linux make it easy. Windows is a bit hit or miss as BitLocker tries its damnedest to share your key with Microsoft’s servers. iOS has included full disk encryption by default — so long as you set a password — since version 8 and Android also includes support for full disk encryption. Use these tools because the Bill of Rights won’t protect your data from government snoops:
The government can prosecute and imprison people for crimes based on evidence obtained from their computers—even evidence retained for years that was outside the scope of an original probable-cause search warrant, a US federal appeals court has said in a 100-page opinion paired with a blistering dissent.
The 2nd US Circuit Court of Appeals ruled that there was no constitutional violation because the authorities acted in good faith when they initially obtained a search warrant, held on to the files for years, and built a case unrelated to the original search.
The case posed a vexing question—how long may the authorities keep somebody’s computer files that were obtained during a search but were not germane to that search? The convicted accountant said that only the computer files pertaining to his client—who was being investigated as part of an Army overbilling scandal—should have been retained by the government during a 2003 search. All of his personal files, which eventually led to his own tax-evasion conviction, should have been purged, he argued.
From my layman’s understanding of the Fourth Amendment, it’s supposed to protect against government shenanigans such as using data that was obtained under a valid warrant, but was unrelated to the case the warrant was issued for, to build another case against you. Mr. Bush supposedly said, “It’s just a goddamned piece of paper!” in regards to the Constitution. While the quote is most likely apocryphal, the statement is not wrong.
The Constitution cannot protect you. It is literally a piece of paper with words written on it. If you want some semblance of protection against the State you have to implement it yourself. Encrypting your devices’ storage would guard against this kind of nonsense, assuming you weren’t foolish enough to decrypt the data for the State at any point. This is where VeraCrypt’s hidden volume feature (VeraCrypt is a fork of TrueCrypt that is being actively developed) is nice: you can have a sanitized encrypted volume that you’re willing to decrypt and a hidden volume holding your sensitive data. Since the hidden volume isn’t detectable, the State’s agents cannot know whether or not it exists and therefore cannot compel you to decrypt it.
Utilize the tools available to you to protect yourself. Anybody who has been paying attention to recent American history knows that the supposed legal protections we all enjoy are little more than fiction at this point.
Playing off of my post from earlier today, I feel that it’s time to update Heinlein’s famous phrase. Not only is an armed society a polite society but an encrypted society is a polite society.
This article in Vice discusses the importance of encryption to the Lesbian, Gay, Bisexual, and Transgender (LGBT) communities but it’s equally applicable to any oppressed segment of a society:
Despite advances over the last few decades, LGBTQ people, particularly transgender folks and people of color, face alarming rates of targeted violence, housing and job discrimination, school and workplace bullying, and mistreatment by law enforcement. In the majority of US states, for example, you can still be legally fired just for being gay.
So while anyone would be terrified about the thought of their phone in the hands of an abusive authority figure or a jealous ex-lover, the potential consequences of a data breach for many LGBTQ people could be far more severe.
LGBTQ people around the world depend on encryption every day to stay alive and to protect themselves from violence and discrimination, relying on the basic security features of their phones to prevent online bullies, stalkers, and others from prying into their personal lives and using their sexuality or gender identity against them.
In areas where being openly queer is dangerous, queer and trans people would be forced into near complete isolation without the ability to connect safely through apps, online forums, and other venues that are only kept safe and private by encryption technology.
These situations are not just theoretical. Terrifying real life examples abound, like the teacher who was targeted for being gay, and later fired, after his Dropbox account was hacked and a sex video was posted on his school’s website. Or the time a Russian gay dating app was breached, likely by the government, and tens of thousands of users received a message threatening them with arrest under the country’s anti-gay “propaganda” laws.
Systematic oppression requires information. In order to oppress a segment of the population an oppressor must be able to identify members of that segment. A good, albeit terrifying, example of this fact is Nazi Germany. The Nazis made heavy use of IBM tabulating machines to identify and track individuals they declared undesirable.
Today pervasive surveillance is used by state and non-state oppressors to identify those they wish to oppress. Pervasive surveillance is made possible by the lack of effective encryption. Encryption allows individuals to maintain the integrity and confidentiality of information and can be used to anonymize information as well.
For example, without encryption it’s trivial for the State to identify transgender individuals. A simple unencrypted text message, e-mail, or Facebook message containing information that identifies an individual as transgender can either be read by an automated surveillance system or acquired through a court order. Once identified, an agent or agents can be tasked with keeping tabs on that individual and waiting for them to perform an act that justifies law enforcement involvement. Say, for example, violating North Carolina’s idiotic bathroom law. After the violation occurs the law enforcement agents can be sent in to kidnap the individual so they can be made an example of, which would serve to send a message of terror to other transgender individuals.
When data is properly encrypted the effectiveness of surveillance is greatly diminished. That prevents oppressors from identifying targets, which prevents the oppressors from initiating interactions entirely. Manners are good when one may have to back up his acts with his life. Manners are better when one doesn’t have to enter into conflict in the first place.
Me right now.
A lot of people are claiming Apple’s decision to fight the Federal Bureau of Investigation (FBI) is nothing more than a marketing play. But I swear that I can hear Tim Cook yelling, “Fuck the police!” because his company keeps making announcements that it’s going to make its products more secure:
WASHINGTON — Apple engineers have begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.
The company first raised the prospect of a security update last week in a phone call with reporters, who asked why the company would allow firmware — the software at the heart of the iPhone — to be modified without requiring a user password.
One senior executive, speaking on the condition of anonymity, replied that it was safe to bet that security would continue to improve. Separately, a person close to the company, who also spoke on the condition of anonymity, confirmed this week that Apple engineers had begun work on a solution even before the San Bernardino attack. A company spokeswoman declined to comment on what she called rumors and speculation.
Independent experts say they have held informal conversations with Apple engineers over the last week about the vulnerability. Exactly how Apple will address the issue is unclear. Security experts who have been studying Apple’s phone security say it is technically possible to fix.
In addition to senior executives talking about upcoming security enhancements, Apple has also added an interesting figure to its payroll:
I'm delighted to announce that I accepted an offer to be working with the CoreOS security team at Apple this summer.
— Frederic Jacobs (@FredericJacobs) February 25, 2016
Frederic Jacobs, for those who don’t know, was one of the developers of the iOS version of Signal, the secure messaging application created by Open Whisper Systems that I highly recommend.
It seems to me that Apple is doing more than marketing here. The company seems dedicated to offering a secure product to its customers. My greatest hope is that this encourages other companies to follow suit.
Can you trust a network you don’t personally administer? No. The professors at the University of California are learning that lesson the hard way:
“Secret monitoring is ongoing.”
Those ominous words captured the attention of many faculty members at the University of California at Berkeley’s College of Natural Resources when they received an email message from a colleague on Thursday telling them that a new system to monitor computer networks had been secretly installed on all University of California campuses months ago, without letting any but a few people know about it.
“The intrusive device is capable of capturing and analyzing all network traffic to and from the Berkeley campus, and has enough local storage to save over 30 days of *all* this data (‘full packet capture’). This can be presumed to include your email, all the websites you visit, all the data you receive from off campus or data you send off campus,” said the email from Ethan Ligon, associate professor of agricultural and resource economics. He is one of six members of the Academic Senate-Administration Joint Committee on Campus Information Technology.
When you control a network it’s a trivial matter to set up monitoring tools. This is made possible by the fact that many network connections don’t utilize encryption. E-mail is one of the biggest offenders. Many e-mail servers don’t encrypt the traffic they send, so network monitoring tools can read the contents. Likewise, many websites still utilize unencrypted connections, so monitoring tools can easily read what is being sent and received between a browser and a web server. Instant messaging protocols often transmit data in the clear as well, so monitoring tools can read entire conversations.
It’s not feasible to only use networks you control. A network that doesn’t connect to other networks is very limited in use. But there are tools to mitigate the risks associated with using a monitored network. For example, I run a Virtual Private Network (VPN) server that encrypts traffic between itself and my devices. When I connect to it all of my traffic goes through the encrypted connection, so local network monitoring tools can’t snoop on my connections. Another tool that works very well for websites is the Tor Browser. The Tor Browser sends all traffic through an encrypted connection to an exit node. While the exit node can snoop on any unencrypted connections, local monitoring tools cannot.
Such tools wouldn’t be as necessary to maintain privacy though if all connections utilized effective encryption. E-mail servers, websites, instant messengers, etc. can encrypt traffic and often do. But the lack of ubiquitous encryption means monitoring tools can still collect some data on you.
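The difference between a connection those monitoring tools can read and one they can’t is simply whether the socket is wrapped in TLS. A minimal sketch using Python’s standard library (the host in the comment is a placeholder, and actually connecting requires network access):

```python
import socket
import ssl

def open_tls(host, port=443):
    """Open a TCP connection and wrap it in TLS. Everything sent afterwards
    is ciphertext to anybody monitoring the local network."""
    ctx = ssl.create_default_context()  # verifies the server's certificate
    raw = socket.create_connection((host, port))
    return ctx.wrap_socket(raw, server_hostname=host)

# with open_tls("example.com") as s:
#     s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
#     # The response is decrypted only at the endpoints, not on the wire.
```

A network monitor still sees the IP address and port you connected to, which is exactly why metadata-hiding tools like Tor remain useful on top of TLS.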
Because I advocate apolitical action to achieve change in the world I periodically get political types snidely asking, “Well what have you done for liberty?” It’s a fair question. My recent efforts have been primarily focused on teaching people how to defend themselves online. Fortunately I’m not alone. I’ve been working with some phenomenal people to run CryptoPartyMN, an organization created specifically to teach people how to use secure means of communication.
Our work hasn’t gone unnoticed either. A few weeks ago James Shiffer from the Star Tribune contacted us. He was working on an article covering Crypto War II and wanted to interview members of CryptoPartyMN to understand the counterarguments to the State’s claims that effective cryptography puts everybody at risk. In addition to interviewing several of us he also attended the last CryptoParty. The result was this article. As you can tell from the article we’ve got everything you could possibly want:
The three CryptoParty presenters were Burg, 32, a Twin Cities software developer and Second Amendment supporter whose blog is called “A Geek With Guns.” The two others are cannabis activists Cassie Traun, 26, an IT professional who “never really trusted the government,” and Kurtis Hanna, 30, an unsuccessful candidate for Minneapolis mayor and state Legislature who said he became interested in the issue after the revelations of NSA spying.
Guns, weed, and crypto. Between the three of us we’ve got pretty much every important freedom issue covered!
So, yeah, that’s one of the things I’ve been up to.
Unfortunately there are a lot of websites that still aren’t utilizing HTTPS to ensure confidential and unaltered communications between them and their users. One of the excuses often given by website administrators for not using HTTPS is that certificates cost money. Another excuse is that managing certificates is a huge pain in the ass.
StartSSL has been providing free certificates for years but administrators still have to manually manage them. A while ago a group of people decided to kill both birds with a single stone and began work on Let’s Encrypt. Let’s Encrypt is a certificate authority and software package that work together to provide automatically managed certificates to websites. It’s been in closed beta for a while and starting December 3rd the beta will be open to the public.
This means anybody wanting a certificate will be able to request one. It also means there will no longer be any excuses for websites not to implement HTTPS. And with the ever more pervasive surveillance state it’s absolutely necessary to make HTTPS the default.
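Once certificates are issued and renewed automatically, about the only thing an administrator still wants to check is when a site’s current certificate expires. A small sketch with Python’s standard library (the host in the comment is a placeholder; running it requires network access):

```python
import socket
import ssl
from datetime import datetime

def cert_expiry(host, port=443):
    """Return the expiration date of a host's TLS certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # getpeercert() reports dates like "Mar  1 12:00:00 2026 GMT"
    return datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")

# print(cert_expiry("example.com"))
```

With Let’s Encrypt’s short-lived certificates a check like this makes a handy cron job to confirm renewal is actually happening.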