One Step Forward, Two Steps Back

If asked, I would summarize the Internet of Things as taking one step forward and two steps back. While integrating computers into everyday objects offers some potential, the way manufacturers are going about it is all wrong.

Consider the standard light switch. A light switch usually has two states. One state, which closes the circuit, turns the lights on, while the other state, which opens the circuit, turns the lights off. It’s simple enough but has some notable limitations. First, it cannot be controlled remotely. A remotely controlled light switch would be useful, especially if you’re away from home and want to make it appear as though somebody is there to discourage burglars. It would also be nice to verify that you turned all your lights off when you left to reduce the electric bill. Of course, remotely operated switches also introduce the potential for remotely accessible vulnerabilities.

What happens when you take the worst aspects of connected light switches, namely vulnerabilities, and don’t even offer the positives? This:

Garrett, who’s also a member of the Free Software Foundation board of directors, was in London last week attending a conference, and found that his hotel room has Android tablets instead of light switches.

“One was embedded in the wall, but the two next to the bed had convenient looking ethernet cables plugged into the wall,” he noted. So, he got ahold of a couple of ethernet adapters, set up a transparent bridge, and put his laptop between the tablet and the wall.

He discovered that the traffic to and from the tablet is going through the Modbus serial communications protocol over TCP.

“Modbus is a pretty trivial protocol, and notably has no authentication whatsoever,” he noted. “Tcpdump showed that traffic was being sent to 172.16.207.14, and pymodbus let me start controlling my lights, turning the TV on and off and even making my curtains open and close.”
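Modbus really is that trivial. As a sketch of just how little stands between an attacker and the lights, here is the complete wire format of a Modbus/TCP “write single coil” request built with nothing but the Python standard library. (The transaction, unit, and coil numbers below are made up for illustration, not the hotel’s actual mapping.) Note that there is no authentication field anywhere in the frame:

```python
import struct

def modbus_write_coil(transaction_id, unit_id, coil_addr, on):
    """Build a raw Modbus/TCP "write single coil" request (function 0x05).

    There is no authentication anywhere in the frame; anyone who can reach
    the TCP port (conventionally 502) can send one of these.
    """
    # PDU: function code 0x05, coil address, 0xFF00 for on / 0x0000 for off
    pdu = struct.pack(">BHH", 0x05, coil_addr, 0xFF00 if on else 0x0000)
    # MBAP header: transaction id, protocol id (always 0), remaining length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

# A complete "lights on" request is all of twelve bytes:
frame = modbus_write_coil(transaction_id=1, unit_id=1, coil_addr=0, on=True)
```

Libraries like pymodbus just wrap frames like this in a friendlier API, which is why controlling the room took so little effort once the traffic was visible.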

He then noticed that the last three digits of the IP address he was communicating with were those of his room, and successfully tested his theory:

“It’s basically as bad as it could be – once I’d figured out the gateway, I could access the control systems on every floor and query other rooms to figure out whether the lights were on or not, which strongly implies that I could control them as well.”

As far as I can tell, the only reason the hotel replaced mechanical light switches with Android tablets was to look impressive. What it ended up with is a setup that may look impressive to the layman but is every troll’s dream come true.

I can’t wait to read a story about a 14-year-old turning off the lights in every room of a hotel.

Obama To South By Southwest: Fuck Your Privacy

I normally don’t follow South by Southwest too closely, but when Obama takes the stage to talk about privacy I can’t help but take note. Unfortunately, his speech wasn’t surprising. It could be summed up as fuck your privacy:

President Barack Obama called on the tech community to build a safe encryption key to assist in law enforcement investigations, saying that if it failed, it could one day face a more draconian solution passed by a Congress that is less sympathetic to its worldview. The president said he could not comment on the FBI’s current fight with Apple over its demand that the company build software to unlock data on an iPhone used by one of the alleged San Bernardino shooters. But he spoke broadly about the need to balance privacy and security, and warned that absolutist views on both sides are dangerous.

Balance, in the case of privacy and security, means people like you and me get shitty crypto that the government, and anybody else with the master key, can break while the government gets to enjoy crypto we can’t break.

Obama warned against an absolutist view, but crypto is one of those very few things in the universe that is either black or white. There is no gray. Crypto is either effective, that is to say it has no known methods of attack faster than brute force, or it is ineffective. I’ve written extensively on this blog about why this is.
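The black-or-white nature of crypto is easy to put numbers on. A back-of-the-envelope calculation, assuming a hypothetical attacker who can test a trillion keys per second, shows the gulf between a deliberately weakened key and an effective one:

```python
def years_to_brute_force(key_bits, guesses_per_second=1e12):
    """Expected years to brute-force a key (half the keyspace on average)."""
    seconds = (2 ** key_bits) / 2 / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

# A 40-bit export-grade key falls in a fraction of a second.
# A 128-bit key takes on the order of 10**18 years, far longer than the
# current age of the universe. There is no middle ground to "balance."
```

Weakened crypto isn’t slightly less secure; it’s on the other side of a cliff.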

The biggest problem with a master key is that anybody who holds it can decrypt any data encrypted under the scheme it unlocks. If every iPhone were set up to decrypt data with the government’s master key, it would only be a matter of time, probably an alarmingly short period of time, before the key was leaked to the Internet and everybody in the world could decrypt any iPhone at will.
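The structure of the problem can be sketched in a few lines. This is a deliberately toy construction (a hash-counter keystream, not real crypto, and every name in it is made up); the point is only that when each device’s key is wrapped under one escrowed master key, a single leak of that master key unwraps them all:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (illustration only, not secure)."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, out))

MASTER_KEY = b"single-government-escrow-key"  # one key for every device

def escrow_device_key(device_key: bytes) -> bytes:
    # Every device's key is wrapped under the same master key.
    return keystream_xor(MASTER_KEY, device_key)

# Whoever learns MASTER_KEY, legitimately or via a leak, recovers every
# escrowed device key (XOR with the same keystream is its own inverse):
wrapped = escrow_device_key(b"alice-phone-key-0001")
assert keystream_xor(MASTER_KEY, wrapped) == b"alice-phone-key-0001"
```

With per-device keys and no escrow, compromising one phone reveals one key; with a master key, compromising one secret reveals them all.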

So we need an absolutist view because it’s the only view that offers any amount of security. But Obama heads one of the largest surveillance states in the world so it’s no surprise that he holds a total disregard for the security of us little people.

TANSTAAFL

“Everything should be free” is the attitude a lot of people hold towards software. If you charge $9.99 for an application you spent months writing and will spend years maintaining, you’ll probably receive at least some backlash for having the audacity to charge for it. But the universal principle of TANSTAAFL, there ain’t no such thing as a free lunch, applies even to software.

The developers of Caddy, a web server that I’ve admittedly never used, wrote a post explaining why they’re asking for money. As it turns out, in spite of what many people who don’t develop software believe, creating and providing open source software involves some notable expenses:

Today you will notice an addition to the Download page: a “Payment” section. Is Caddy no longer free software?

The truth is, it never was. There’s no such thing as free software. The question is, “Who pays the price?”

In the case of Caddy, it has been the developers. The obvious problem with this is that it’s not sustainable in our economy.

[…]

In less than a year, Caddy has well over 20,000 downloads — many of which aren’t counted as the project is cloned and built locally and deployed to both development and production environments. We’ve accrued over 4,500 stars on GitHub, processed hundreds of pull requests, and have dozens of participants in our chat rooms. I can’t speak for other Caddy developers because donations are private, but thanks to very generous donors last year, our web hosting is paid (for now) and I’ve received a little over $150 for my time.

[…]

Keep in mind that commercial offerings for similar web servers cost anywhere from $80 one-time to $1900/yr. (And none of them do what Caddy does.) My text editor costs $70, even just your domain name probably costs ~$12/yr. (If you support us well enough, we’ll send you swag!)

Too many people, typically those who don’t develop software, have the attitude that all software should be free (as in price; the ambiguity of the term free is why I refer to software with unburdened source code as open source software instead of free software). The app economy is a perfect example of this. It’s why many developers have moved towards nickel-and-diming customers with in-app purchases or selling subscription services. When they tried to charge reasonable fees for their software up front, people bitched. And now people are bitching because software developers are relying on in-app purchases and subscription services.

Too many people have gotten it into their heads that software should be free (again, as in price). Don’t fall into that trap. Software development incurs a lot of expenses. Time, computers, electricity, and web hosting are just a handful of things needed for software development and none of them are free.

As I said, I haven’t used Caddy. But it does seem to be popular so I’m going to assume it’s a quality product. That being the case, I do hope enough users begin paying for it to keep the developers afloat. It’s always sad to see a good software product fall into obscurity because the developers weren’t being compensated and had to abandon the project for something that actually paid the bills.

Illustrating Cryptographic Backdoors With Mechanical Backdoors

A lot of people don’t understand the concept of cryptographic backdoors. This isn’t surprising because cryptography and security are very complex fields of study. But it does lead to a great deal of misunderstanding, especially amongst those who tend to trust what government agents say.

I’ve been asked by quite a few people why Apple doesn’t comply with the demands of the Federal Bureau of Investigation (FBI). They’ve fallen for the FBI’s claims that the compromised firmware would only be used on that single iPhone and that Apple would be allowed to maintain total control over the firmware at all times. However, as Jonathan Zdziarski explained, the burden of forensic methodology would require the firmware to change hands several times:

Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify.

[…]

If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.

If Apple creates what the FBI is demanding, the firmware would almost certainly end up in the hands of NIST, the defense attorney, and another third party hired by the defense attorney to verify the firmware. As Benjamin Franklin said, “Three can keep a secret, if two of them are dead.” With the firmware changing hands so many times, it will almost certainly end up leaked to the public.

After pointing this out a common followup question is, “So what? How much damage could this firmware cause?” To illustrate this I will use an example from the physical world.

The Transportation Security Administration (TSA) worked with several lock manufacturers to create TSA recognized locks. These are special locks that TSA agents can bypass using master keys. To many this doesn’t sound bad. After all, the TSA tightly guards these master keys, right? Although I’m not familiar with the TSA’s internal policies regarding the management of their master keys I do know the key patterns were leaked to the Internet and 3D printer models were created shortly thereafter. And those models produce keys that work.

The keys were leaked, likely unintentionally, by a TSA agent posting a photograph of them online. With that single leak every TSA recognized lock was rendered entirely useless. Now anybody can obtain the keys to open any TSA recognized lock.

It only takes one person to leak a master key, either intentionally or unintentionally, to render every lock that key unlocks entirely useless. Leaking a compromised version of iOS could happen in many ways. The defendant’s attorney, who may not be well versed in proper security practices, could accidentally transfer the firmware to a third party in an unsecured manner. If that transfer is being monitored the person monitoring it would have a copy of the firmware. An employee of NIST could accidentally insert a USB drive with the firmware on it into an infected computer and unknowingly provide it to a malicious actor. Somebody working for the defendant’s third party verifier could intentionally leak a copy of the firmware. There are so many ways the firmware could make its way to the Internet that the question isn’t really a matter of if, but when.

Once the firmware is leaked to the Internet, it would be available to anybody. While Apple could design the firmware to check the identity of the phone so it won’t work on any phone besides the one the FBI wants unlocked, it may be possible to spoof those identifiers to make any iPhone 5C look like that one. It’s also possible that a method to disable a fully updated iPhone 5C’s signature verification will be found. If that happens, a modified version of the compromised firmware that skips the identity check could be installed even though its signature would be invalid.
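A toy model, with entirely made-up identifiers, shows why an identity check baked into leaked firmware is a speed bump rather than a wall: the check can only compare whatever the device reports about itself, so a device that lies passes the same test.

```python
# Hypothetical identifier; a real check would use something like the
# device's hardware-burned ECID.
APPROVED_DEVICE_ID = "ECID-000000DEADBEEF"

def firmware_may_install(reported_device_id: str) -> bool:
    # The firmware never sees the hardware directly, only what the
    # device (or an attacker controlling it) reports.
    return reported_device_id == APPROVED_DEVICE_ID

assert firmware_may_install("ECID-000000DEADBEEF")        # the target phone
assert not firmware_may_install("ECID-1122334455667788")  # an honest stranger
# A different phone spoofing the target's identifier passes identically:
assert firmware_may_install("ECID-000000DEADBEEF")
```

The check is only as trustworthy as the reported identifier, which is exactly what an attacker with a leaked copy of the firmware would target.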

The bottom line is that the mere existence of a compromised firmware, a master key if you will, puts every iPhone 5C at risk just as the existence of TSA master keys put everything secured with a TSA recognized lock at risk.

Bringing Fascism Back To Europe

You would think Europe would have learned its lesson about fascism during World War II. Of all the nations of Europe, you would expect France to have especially learned its lesson since it suffered under the boot of Nazi Germany for quite some time. Yet, in a rather ironic twist, France is leading the way to the fascism revival on that continent:

French parliamentary deputies, defying government wishes, have voted in favour of penalising smartphone makers which fail to cooperate in terrorism inquiries, entering a controversy that has pitted the FBI against Apple in the United States.

The move came in the form of an amendment to a penal reform bill that was receiving its first reading in parliament.

Part of me appreciates France’s honesty in its pursuit of absolute power over its people. While I completely disagree with such a philosophy I do prefer an opponent who is honest about their intentions. On the other hand, an honest government is often the most terrifying kind. When the State no longer sees a need to even pay lip service to the rights of individuals it quickly begins perpetrating heinous act after heinous act.

It’ll be interesting to see if this bill manages to pass into law. I’m sure the French government foresees it as an effective means of compelling smartphone manufacturers to kowtow to law enforcers. But it will more likely convince smartphone manufacturers to take their business elsewhere. I can’t imagine many CEOs are willing to risk being kidnapped because their company’s devices used effective cryptography, especially when there are so many other countries around the world willing to take in money-making companies.

Amazon Reverses Decision On Disabling Device Encryption

As an update to last week’s story about Amazon disabling device encryption in Fire OS 5, the company has since reversed its decision:

Amazon will restore optional full disk encryption to Fire OS 5 in a software update “coming this spring,” according to a statement released by the company on Friday evening.

This is a good announcement but I wouldn’t buy a Fire OS device until the firmware update reenabling device encryption has been rolled out. You never know when Amazon will decide to declare backsies.

As an aside, did you notice how quickly Amazon changed its mind? If this would have been a government decision we would be sitting through years of court cases, congressional hearings, congressional votes, and other such bureaucratic nonsense. But in the market it took less than a week for customer outrage to get things changing. The market gets shit done.

Another Day, Another Attack Against Cryptography Made Possible By Government Meddling

This week another vulnerability was discovered in the OpenSSL library. The vulnerability, given the idiotic marketing name Decrypting RSA with Obsolete and Weakened eNcryption (DROWN), allows an attacker to discover a server’s TLS session keys if it has SSLv2 enabled. Like FREAK and Logjam before it, DROWN was made possible by government meddling in cryptography:

For the third time in less than a year, security researchers have found a method to attack encrypted Web communications, a direct result of weaknesses that were mandated two decades ago by the U.S. government.

These new attacks show the dangers of deliberately weakening security protocols by introducing backdoors or other access mechanisms like those that law enforcement agencies and the intelligence community are calling for today.

[…]

Dubbed DROWN, this attack can be used to decrypt TLS connections between a user and a server if that server supports the old SSL version 2 protocol or shares its private key with another server that does. The attack is possible because of a fundamental weakness in the SSLv2 protocol that also relates to export-grade cryptography.

The U.S. government deliberately weakened three kinds of cryptographic primitives in the 1990s — RSA encryption, Diffie-Hellman key exchange, and symmetric ciphers — and all three have put the security of the Internet at risk decades later, the researchers who developed DROWN said on a website that explains the attack.

We’d all be safer if the government didn’t meddle in mathematical affairs.

This exploit also shows the dangers of supporting legacy protocols. While there may be users whose software is so old it doesn’t support TLS or even SSLv3, supporting them creates a hazard for every other user. There’s a point where you have to tell the user of ancient software to either upgrade to something modern or stop using the service. From a business standpoint, potentially losing one customer for lack of legacy support is far better than losing many customers because a major security compromise destroyed their trust in your company.
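Dropping legacy protocols is straightforward in most modern stacks. As one example, here is a minimal sketch using Python’s standard ssl module: a server-side context that refuses anything older than TLS 1.2 (modern builds already refuse SSLv2 and SSLv3 outright):

```python
import ssl

# Server-side TLS context; clients that only speak pre-TLS-1.2 protocols
# simply fail the handshake instead of dragging everyone else's security down.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

The equivalent one-line setting exists in nginx, Apache, and virtually every other TLS-terminating server, so "we can't drop SSLv2" is a policy choice, not a technical constraint.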

Amazon Disabled Device Encryption In Fire OS 5

While Apple and, to a lesser extent, Google are working to improve the security on their devices Amazon has decided on a different strategy:

While Apple continues to resist a court order requiring it to help the FBI access a terrorist’s phone, another major tech company just took a strange and unexpected step away from encryption.

Amazon has removed device encryption from the operating system that powers its Kindle e-reader, Fire Phone, Fire Tablet, and Fire TV devices.

The change, which took effect in Fire OS 5, affects millions of users.

Traditionally, firmware updates deliver (or at least attempt to deliver) security enhancements. I’m not sure why Amazon chose to break with that tradition, but it should concern users of Fire OS devices. By delivering a firmware update that removes a major security feature, Amazon has violated the trust of its users.

Unless Amazon fixes this I would recommend avoiding Fire OS based devices. Fortunately, other phone and tablet manufacturers exist and are willing to provide devices with good security features.

Argh, Pirates Be A Hackin’ The High Seas

The biggest threat to computer security may be the average person’s lack of creativity. Imagine asking a random person on the street what the possible ramifications of poor computer security at a shipping company could be. I would wager you’d get a lot of blank stares and variations of, “Uh, nothing.” But if you ask a creative person, say a pirate, the same question you will likely hear some pretty interesting ideas:

Tech-savvy pirates once breached the servers of a global shipping company to locate the exact vessel and cargo containers they wanted to plunder, according to a new report from Verizon’s cybersecurity team.

“They’d board the vessel, locate by bar code specific sought-after crates containing valuables, steal the contents of that crate — and that crate only — and then depart the vessel without further incident,” says the report, Verizon’s Data Breach Digest.

Just because you can’t think of a reason security is important doesn’t mean somebody else can’t. This is especially important to keep in mind if you’re one of those “I’ve got nothing to hide” people. You might not be able to think of a reason, but somebody who means you ill almost certainly can.

When you’re assessing your own security, whether on a personal or organizational level, it’s wise to bring in some outsiders, perhaps people with experience breaching networks, and pay them a little something to provide ideas you haven’t thought of yet. You will likely be surprised at how many things you simply failed to consider.

New York Judge Rules Feds Can’t Coerce Apple Into Unlocking An iPhone

In a rare positive judicial ruling, a judge in New York has ruled against the feds who were demanding the power to coerce Apple into unlocking an iPhone:

A US magistrate judge in New York has ruled that the government can’t force Apple to help law enforcement unlock an iPhone using the All Writs Act.

[…]

In the brief, the judge concluded that this is an issue that should be handled by congress. If the government wants to use All Writs or CALEA to force companies to circumvent encryption, there needs to be a clear law granting it that power.

It should be noted that this case is separate from the San Bernardino one, but the ruling could give Apple’s lawyers some judicial precedent to strengthen their argument in that case.

Unfortunately, but not surprisingly, the judge ruled that Congress needs to make a law to resolve this debate. What would have been better is a ruling that the State doesn’t have the power to coerce people into performing labor against their will. Of course, such a precedent would effectively invalidate the State itself, so I understand why it wasn’t set.

This issue will likely continue to come up until the Supreme Court rules on it. Having the authority to coerce companies into creating backdoors is just too enticing for the feds to roll over on. That being the case, companies should start focusing their efforts on creating software and devices that they are unable to crack. If devices are effectively secured by default it won’t matter what laws are passed or what rulings are made.