The Risks Of Backing Up To The Cloud

Online backup services are convenient and offer resilience. Instead of managing your own backup drives, a cloud backup service can upload your data to the Internet automatically whenever you’re connected. If your house burns down, you don’t lose your data either. But, as with most things in the universe, there are trade-offs. By placing your data on somebody else’s server you lose control over it. This can be mitigated by encrypting your files locally before uploading them, but sometimes that’s not an option, as with Apple’s iCloud Backup for iOS:

“If the government laid a subpoena to get iMessages, we can’t provide it,” CEO Tim Cook told Charlie Rose back in 2014. “It’s encrypted and we don’t have a key.”

But there’s always been a large and often-overlooked asterisk in that statement, and its name is iCloud.

It turns out the privacy benefits Apple likes to talk about (and the FBI likes to complain about) basically disappear when iCloud Backup is enabled. Your messages, photos and whatnot are still protected while on your device and encrypted end-to-end while in transit. But you’re also telling your device to CC Apple on everything. Those copies are encrypted on iCloud using a key controlled by Apple, not you, allowing the company (and thus anyone who gets access to your account) to see their contents.

I don’t use iCloud Backup for precisely this reason. My backups are done locally on my computer. This brings me to my point: you need to fully understand the tools you use if you hope to have any semblance of security. One weakness in your armor can compromise everything.

iMessage may be end-to-end encrypted but that doesn’t do you any good if you’re backing up your data in cleartext to somebody else’s server.
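The encrypt-before-upload approach mentioned above can be sketched in a few lines. This is a toy illustration using a one-time pad, not a recommendation; in practice you’d use a vetted tool such as GPG before handing files to a cloud service. The point is simply that the key never leaves your machine, so the provider only ever stores ciphertext:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR data with a key of equal length (a one-time pad).

    Toy illustration only; use a vetted tool like GPG for real backups.
    """
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

# The key is generated and kept locally; only ciphertext leaves the machine.
backup = b"contents of my private files"
key = secrets.token_bytes(len(backup))

uploaded = xor_cipher(backup, key)   # what the cloud provider stores
restored = xor_cipher(uploaded, key) # XOR is its own inverse

print(restored == backup)  # True: only the key holder can recover the data
```

With iCloud Backup the situation is reversed: the equivalent of `key` lives on Apple’s servers, not yours, which is exactly the problem.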

Your Device Is A Snitch

In addition to pervasive government surveillance there is also pervasive corporate surveillance. Corporate surveillance isn’t as concerning, since corporations rarely murder the people they’re surveilling, but it is more sinister because most of the people being surveilled unwittingly agreed to it. Mobile phones are a great example of this. A lot of people, including myself, find mobile phones incredibly useful. They allow you to communicate with friends and family in almost any location, provide remote Internet connectivity, can navigate you to your destination, etc. But a side effect of the technology is that your cellular provider always knows your location. In addition to that, many apps use location services provided by your phone’s operating system and hardware to pinpoint your location and report it to their developers.

Another reason corporate surveillance is sinister is that the State usually has access to the collected data, either through secret agreements or warrants. Your devices may report what you’re doing to their developers, and the State may then gain access to that data to prosecute you. An example of this is a recent story of a woman whose rape claim was proven false by data collected from her Fitbit:

In March, a Florida woman traveled to Lancaster, Pennsylvania where she stayed at her boss’s home, reports ABC 27. On a Tuesday, police were called to the home where they found overturned furniture, a knife and a bottle of vodka, according to Lancaster Online. Jeannine Risley told police she’d been sleeping and that she was woken up around midnight and sexually assaulted by a “man in his 30s, wearing boots.” However, Risley was wearing her Fitbit band at the time. She initially said that the Fitbit had been lost in the struggle, but police found it in a hallway and when they downloaded its activity, the device became a witness against her.

According to ABC 27, Risley handed the username and password for her Fitbit account over to police. What they found contradicted her account of what happened that night. Via Lancaster Online:

[A] Fitbit device Risley was wearing told a different story, the affidavit shows.

The device, which monitors a person’s activity and sleep, showed Risley was awake and walking around at the time she claimed she was sleeping.

In this case one could argue that the surveillance led to a good outcome, since it busted the wearer for making a false rape accusation. But surveillance has no morality. This could just as easily be used to prosecute somebody accused of a drug crime. For example, heart rate data from a Fitbit could be used as evidence that somebody had taken a particular drug at a certain time. It could also be used, as it was in this case, to prove the person wasn’t asleep at the time they were accused of taking drugs.
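As a toy illustration of how little it takes for activity data to become a witness, here’s a sketch using a made-up data format (minute-by-minute step counts, which is not necessarily Fitbit’s actual format): any recorded steps inside a claimed sleep window contradict the claim.

```python
# Toy sketch with hypothetical tracker data: minute-by-minute step counts
# checked against a claimed sleep window. Minute 0 is midnight; negative
# minutes are before midnight.

def minutes_awake(step_log: dict[int, int], sleep_start: int, sleep_end: int) -> list[int]:
    """Return the minutes inside the claimed sleep window with recorded steps."""
    return [m for m, steps in step_log.items()
            if sleep_start <= m < sleep_end and steps > 0]

# The wearer claims to have been asleep from 23:00 (-60) until 01:00 (120),
# but the device recorded walking around 00:10-00:12.
log = {-60: 0, -30: 0, 10: 22, 11: 35, 12: 18, 90: 0}
contradictions = minutes_awake(log, -60, 120)
print(contradictions)  # → [10, 11, 12]
```

Three minutes of step data is all it takes to undermine the story.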

I’m not going to tell you not to use these devices. They do provide a lot of desirable functionality for many people. However, they also provide some potentially negative side effects that users should be aware of. If you use these devices just make sure you understand the ramifications.

Democracy Has No Place In The Crypto Wars

AT&T’s CEO, Randall Stephenson, had some choice words for Apple’s CEO, Tim Cook. Namely, Stephenson doesn’t appreciate Cook’s stance on effective encryption:

AT&T CEO Randall Stephenson doesn’t think Apple CEO Tim Cook should be making long-term decisions around encryption that could ripple across the technology industry. “I don’t think it is Silicon Valley’s decision to make about whether encryption is the right thing to do,” he told The Wall Street Journal in an interview on Wednesday. “I understand Tim Cook’s decision, but I don’t think it’s his decision to make,” said Stephenson. “I personally think that this is an issue that should be decided by the American people and Congress, not by companies.”

I’m sure this has everything to do with Stephenson’s strong belief in democracy and nothing at all to do with his company’s surveillance partnership with the National Security Agency (NSA). But let’s address the issue of democracy.

Stephenson says that effective cryptography should be decided by the American people. Unless I’m missing something, Tim Cook is an American citizen. His stance on effective cryptography is his decision. Therefore his position is decided by an American person. Furthermore, why should anybody outside of Apple have a voice in the company’s stance? Stephenson is an employee of AT&T, so his opinion shouldn’t be relevant to Apple. Congress, likewise, isn’t employed by Apple, so its opinions shouldn’t be relevant to Apple either. Democracy, outside of groups voluntarily deciding to vote on matters involving only themselves, is bullshit. It’s a tool for people to inflict their will on others. In fact it may very well be the grossest form of might-makes-right our species has developed.

I understand Stephenson’s decision, part of his business relies on surveillance, but it’s not his decision to make. This is an issue that should be decided by those creating the tools. If Stephenson wants to insert backdoors into his company’s products, that’s fine; I’ll simply avoid using his products. But he has no right to demand other companies follow suit.

Is Your Thermostat A Snitch?

As a general rule I’m a huge fan of technology. But even I have major reservations about the so-called Internet of Things (really just adding a chip to devices that were previously analog). It’s not that the ideas themselves are bad, but that not enough attention is being paid to the implementations, especially from a security and privacy standpoint.

The Nest thermostat is one of the more popular regular household devices with a chip added to it. What’s not to like about a thermostat that automatically adjusts the temperature in your home based on when you are and aren’t there? Besides that software bug that drained the battery and caused people’s furnaces to shut down. And the fact that the bloody thing snitches on where your house is:

Researchers at Princeton University have found that, until recently, Alphabet’s popular Nest thermostat was leaking the zip code and location of its users over the internet. This data was transmitted unencrypted, or in the clear, meaning that anyone sniffing traffic could have intercepted it, according to the researchers.

The researchers also studied several other smart devices, including the Sharx security camera, a PixStar smart photoframe, and Samsung’s SmartThings Hub. The goal of their research wasn’t to find specific bugs in these devices, but to determine what information was being leaked when the devices communicated with their servers in the cloud.

I have no idea why a thermostat would even need to know where your house is. It needs to know the temperature inside and what you want the temperature to be so it can order your climate control system to make the two numbers match. But it apparently does have access to that information, and the developers cared so little about the privacy of their customers that they not only failed to keep the data private but didn’t even bother encrypting it when it was sent. And this isn’t an isolated incident. The complete disregard for these kinds of details is plaguing the Internet of Things market.
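To make concrete why transmitting in the clear is so careless, here’s a toy sketch. The payload format below is made up for illustration, not Nest’s actual wire format; the point is that a passive observer on the network path can simply search the raw bytes for sensitive fields, while even a toy cipher denies them that:

```python
import secrets

# Hypothetical telemetry payload of the kind the researchers observed
# leaking in the clear (field names are illustrative only).
payload = b"device=thermostat&zip=55401&city=Minneapolis"

# A passive observer just searches the sniffed bytes.
print(b"zip=55401" in payload)  # True: the location is readable in transit

# The same payload under even a toy cipher (one-time pad) reveals nothing.
key = secrets.token_bytes(len(payload))
ciphertext = bytes(p ^ k for p, k in zip(payload, key))
print(b"zip=55401" in ciphertext)  # almost certainly False
```

Real devices should be using TLS, which solves the key distribution problem this toy ignores, but the contrast between the two searches is the entire privacy story here.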

The Black Market Has You Covered

One of my favorite fairytales is the one about government regulations being able to restrict the proliferation of technology.

IMSI catchers are widely used by government law enforcers for surveillance. The devices, for those of you unfamiliar with them, act as cell towers and by doing so get local cell phones to connect to them instead of to legitimate cell towers. It’s a man-in-the-middle attack that allows law enforcers to snoop on any unencrypted data transmitted or received by a victim’s cell phone.

In the United States the use of such devices by anyone other than law enforcers is sternly frowned upon. With the Federal Communications Commission’s (FCC) restrictions on the civilian use of IMSI catchers you might be led to think the devices are hard to acquire. Not so. There is one thing that always renders government restrictions on technology impotent: the black market:

Across a tinny Skype connection, a Hong Kong tech company is trying to sell us state surveillance equipment.

“I switched it on already,” says Edward Tian, holding up a backpack containing a box and wires. “This is the antenna. This is the battery […] Everything is this simple.”

It’s a $15,000 IMSI catcher operated via an Android app. Tian shows us the user interface in a grainy video. He hits a button on the app and information on a bunch of cellphones in the area trickles down the screen. He has their IMSI (International Mobile Subscriber Identity, a unique identifier for their SIM card), IMEI (International Mobile Equipment Identity—the same for their device), and even full phone numbers.

Any perceived control over a technology is nothing more than an illusion.

A Smaller Taser

It’s hard to argue against handguns being the most effective self-defense tool for the average person, but there are many people, whether through personal conviction (which is perfectly acceptable) or legal restraints (which are entirely unacceptable), who cannot carry one. I appreciate the market providing in-between solutions that improve an individual’s ability to defend themselves but don’t go as far as a firearm. Taser, which primarily targets law enforcement agencies, has announced a new Taser aimed at the civilian market. Overall I think it’s a pretty decent idea:

Additionally, the Pulse comes with rechargeable batteries, two live Taser cartridges, laser-assisted targeting and a 15-foot range. Most importantly, Taser says that if you end up using it for self-defense and leave it at the scene, the device will be replaced for free.

While the $399.00 price tag seems a bit steep to me, since it’s approaching real handgun territory, the free replacement program makes it a bit more palatable. In fact the free replacement program may be the best feature of this weapon. It gives a person who was just subjected to a self-defense situation one less thing to worry about. As far as size goes it’s in compact handgun territory, which I believe is an excellent size for something aimed at regular people.

I hope we begin seeing more in-between self-defense tools aimed at regular individuals. They give people who cannot or will not carry a firearm an option other than dying. And that increases the overall cost of committing violence.

News From The Crypto War Frontline In New York

I continue to be amused by politicians’ efforts to prohibit math. A bill has been introduced in New York that would require manufacturers to implement backdoors in their mobile devices or face… some kind of consequence, I guess:

A New York assemblyman has reintroduced a new bill that aims to essentially disable strong encryption on all smartphones sold in the Empire State.

Among other restrictions, the proposed law states that “any smartphone that is manufactured on or after January 1, 2016 and sold or leased in New York, shall be capable of being decrypted and unlocked by its manufacturer or its operating system provider.”

If it passes both houses of the state legislature and is signed by the governor, the bill would likely be the first state law that would impose new restrictions on mobile-based cryptography. Undoubtedly, if it makes it that far, the law would likely face legal challenges from Apple and Google, among others.

One of the great things about democracy is if a vote doesn’t go the way you want you can reintroduce the vote and waste everybody’s time again.

One question you have to ask is how this bill could be enforced. As written, it would punish sellers who sold phones that couldn’t be decrypted by law enforcers. But New York isn’t that big of a landmass, and Ars Technica points out the rather obvious flaw in Assemblyman Titone’s clever plan:

UPDATE 3:49pm ET: Also, it’s worth pointing out that even if this bill does pass, it wouldn’t be terribly difficult for New Yorkers to cross a state line to buy a smartphone.

It doesn’t take a rocket scientist to see what would happen if this bill were signed into law. Sellers in New York may go under, but sellers in neighboring states would see a jump in sales. In addition to sellers in neighboring states, the sales of online stores would likely increase as well since, you know, you can just order a cell phone online and have it delivered to your home.

Part of me is amused by the idea of strong cryptography being outlawed. Imagine millions of Android users flashing custom firmware just so they could remove government-mandated backdoors. Such a prohibition would almost certainly create a sizable black market for flashing custom firmware.

Is Your Device A Snitch?

I’m convinced that one of the biggest threats to privacy is the reliance on advertisements many industries suffer from. This reliance has led to a proliferation of surveillance technology. And now that the so-called Internet of Things (IoT) is the new hot commodity, we’re seeing surveillance technology embedded in more everyday things. With so many devices capable of spying on you, the next big thing in advertising has become cross-device surveillance. Bruce Schneier has an excellent article that shows just how far these advertisers are trying to go:

SilverPush is an Indian startup that’s trying to figure out all the different computing devices you own. It embeds inaudible sounds into the webpages you read and the television commercials you watch. Software secretly embedded in your computers, tablets, and smartphones picks up the signals, and then uses cookies to transmit that information back to SilverPush. The result is that the company can track you across your different devices. It can correlate the television commercials you watch with the web searches you make. It can link the things you do on your tablet with the things you do on your work computer.

Your computerized things are talking about you behind your back, and for the most part you can’t stop them — or even learn what they’re saying.

Now white noise generators that broadcast on the frequencies used by this surveillance technology are suddenly good ideas for stocking stuffers. Without them your new smart fridge can display advertisements to you based on what your smart television told it you were watching.
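For the curious, here’s a rough sketch of how this kind of inaudible beaconing can work. The parameters are illustrative assumptions, not SilverPush’s actual scheme: a near-ultrasonic tone is mixed into ordinary audio, and receiving software checks for energy at that frequency, here using the standard Goertzel algorithm:

```python
import math

RATE = 44100    # common audio sample rate, Hz
BEACON = 18000  # near-ultrasonic carrier, inaudible to most adults

def tone(freq: float, n: int) -> list[float]:
    """n samples of a unit-amplitude sine wave at freq Hz."""
    return [math.sin(2 * math.pi * freq * i / RATE) for i in range(n)]

def goertzel_power(samples: list[float], freq: float) -> float:
    """Signal power at a single frequency (Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / RATE)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s1 * s1 + s2 * s2 - coeff * s1 * s2

n = 441  # 10 ms of audio
beacon_present = goertzel_power(tone(BEACON, n), BEACON)
beacon_absent = goertzel_power(tone(1000, n), BEACON)  # ordinary audible audio
print(beacon_present > 100 * beacon_absent)  # True: the beacon stands out clearly
```

Ten milliseconds of audio is enough to detect the carrier, which is why a speaker in one room can tag every microphone-equipped device within earshot.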

Not only does this open the floodgates of privacy violations further, but it also greatly increases the opportunities for malicious attackers. Ad networks have become major targets for malware distributors. This has created headaches for computer and smartphone users, but now it could create headaches for your television, fridge, coffee maker, and even your damn doorbell. Making matters even worse is how unreliable IoT manufacturers are at both implementing and maintaining security. What happens when your smart fridge is considered out of date by the manufacturer and its software security problems are no longer fixed?

The reliance on advertising to fund so much technology is creating both a privacy and a security nightmare. And it’s only getting worse.

Intellectual Property Means Not Owning Your Stuff

Intellectual property laws are always justified as being necessary for human innovation. Setting aside the fact that humans have been innovating for longer than intellectual property laws have existed, the belief many people hold is that nobody would invest the resources necessary to innovate if they weren’t promised a monopoly on manufacturing afterwards. More and more, though, we’re seeing what the real purpose behind intellectual property laws is. It’s not to encourage innovation; it’s to curtail ownership.

Copyright is the biggest offender. Due to software copyright laws it’s getting more and more difficult to say you own anything, because manufacturers are claiming anything with a computer in it is licensed, not sold. What does that mean? It means that when your product breaks down you are legally prohibited from fixing it:

How many people does it take to fix a tractor? A year ago, I would have said it took just one person. One person with a broken tractor, a free afternoon, and a box of tools.

I would have been wrong.

When the repair involves a tractor’s computer, it actually takes an army of copyright lawyers, dozens of representatives from U.S. government agencies, an official hearing, hundreds of pages of legal briefs, and nearly a year of waiting. Waiting for the Copyright Office to make a decision about whether people like me can repair, modify, or hack their own stuff.

[…]

Thanks to the “smart” revolution, our appliances, watches, fridges, and televisions have gotten a computer-aided intelligence boost. But where there are computers, there is also copyrighted software, and where there is copyrighted software, there are often software locks. Under Section 1201 of the DMCA, you can’t pick that lock without permission. Even if you have no intention of pirating the software. Even if you just want to modify the programming or repair something you own.

Enter the tractor. I’m not a lawyer. I’m a repairman by trade and a software engineer by education. I fix things—especially things with computers in them. And I run an online community of experts that teaches other people how to fix broken equipment. When a farmer friend of mine wanted to know if there was a way to tweak the copyrighted software of his broken tractor, I knew it was going to be rough. The only way to get around the DMCA’s restriction on software tinkering is to ask the Copyright Office for an exemption at the Section 1201 Rulemaking, an arduous proceeding that takes place just once every three years.

Ownership implies you have sole control over something. It can’t exist under intellectual property laws. So long as you stand the chance of being severely punished for repairing, modifying, or selling something you cannot claim to own it. Intellectual property claims are promises granted by the State that it will dish out those severe punishments.

This problem is also going to become exponentially worse as the number of products with embedded software increases. Soon we won’t be able to claim ownership of our refrigerators, coffee makers, or doorbells. Everything in our homes will be the rented property of the manufacturer. And if we violate the terms of the rental agreement the State will send its armed goons at oh dark thirty to kick down our doors unannounced and shoot our pets.

The Pervasiveness Of Government Databases

Let’s discuss government databases. The United States government maintains numerous databases on its citizens. Many of these databases are populated, in part if not entirely, by algorithms. And unlike Amazon’s recommendation algorithms or Google’s search algorithms, government algorithms have real-world consequences. Because government databases have become so pervasive, these consequences can range from being barred from flying on a plane to being unable to sign up for the latest video game:

Last weekend Muhammad Zakir Khan, an avid gamer and assistant professor at Broward College in Florida, booted up his PC and attempted to sign up for Epic Games’ MOBA-inspired Paragon beta. Unbeknownst to Khan, however, was that his name—along with many others—is on the US government’s “Specially Designated Nationals list,” and as such was blocked from signing up.

“Your account creation has been blocked as a result of a match against the Specially Designated Nationals list maintained by the United States of America’s Office of Foreign Assets control,” read the form. “If you have questions, please contact customer service at accounts@epicgames.com.”

There’s an interesting series of connections here. The first connection is Mr. Khan’s name appearing on the Specially Designated Nationals list. The second connection is the database, which is used to enforce the United States government’s various sanctions, being applied to the Unreal 4 engine. The third connection is the game utilizing the Unreal 4 engine. In all likelihood Mr. Khan’s name was added to the database by an algorithm that adds anybody who matches an arbitrarily selected set of characteristics, such as last name and religion.

So, ultimately, Mr. Khan was prevented from signing up for a game because the government believes that if it prevents modern video game technology from entering Iran, North Korea, or other countries under sanctions, the citizenry will start a revolution. Being human (or at least somewhat close approximations thereof), the agents charged with enforcing these sanctions chose to automate the process as much as possible, which resulted in a database likely populated automatically by algorithms.
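The kind of over-broad automated matching I’m describing can be sketched in a few lines. The list entries and matching rule below are made up for illustration; real screening systems are more elaborate, but the failure mode is the same: matching on shared name tokens rather than on identity.

```python
# Hypothetical sketch of over-broad sanctions screening. The listed names
# are invented for illustration, not taken from the actual SDN list.
SDN_LIST = ["AHMED KHAN", "JOHN DOE SMITH"]

def flagged(customer_name: str) -> bool:
    """Flag any customer sharing any name token with a listed name."""
    tokens = set(customer_name.upper().split())
    return any(tokens & set(listed.split()) for listed in SDN_LIST)

print(flagged("Muhammad Zakir Khan"))  # True: blocked on a shared surname alone
print(flagged("Jane Jones"))           # False
```

A rule this crude inevitably sweeps up innocent people who merely share a common surname with somebody on the list, which appears to be exactly what happened to Mr. Khan.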