Brazilian Government Unable To Break WhatsApp’s Encryption, Retaliates By Kidnapping A Facebook Employee

This may be a preview of things to come here. The Brazilian government is a bit peeved that it is unable to bypass WhatsApp’s encryption. Furthermore, it has been unable to convince Facebook, the owner of WhatsApp, to include a backdoor in the software. In what appears to be an act of retaliation, the government has decided to harass Facebook by kidnapping one of its employees:

The arrest was made at the request of officials from the state of Sergipe, in Brazil’s north-east. In a statement, the federal police said Facebook/WhatsApp had repeatedly failed to comply with court orders relating to an organized crime and drug-trafficking investigation.

[…]

WhatsApp said in a statement that it was disappointed at the arrest and is unable to provide information it does not have, due to the architecture of its service. “We cooperated to the full extent of our ability in this case and while we respect the important job of law enforcement, we strongly disagree with its decision,” the unit said.

I wish companies would stop including all the nonsense about understanding the important job of law enforcement. Enforcing laws isn’t important. Providing justice to victims is important but that’s not what law enforcers primarily do.

What makes this kidnapping even weirder is that WhatsApp is apparently a separate operational entity from Facebook so the Brazilian government didn’t even kidnap a person who is in any way responsible for the app:

Facebook issued a distinct statement, noting that WhatsApp is operationally separate from the mothership, making the arrest of a Facebook exec “extreme and disproportionate.”

This is what it looks like when a government throws a temper tantrum. Hopefully the Brazilian government will release the poor schmuck it kidnapped. Although it wouldn’t surprise me (OK, it would surprise me a little bit) if it decided to threaten to kill him if Facebook didn’t give in to its demands. Either way, if I were Facebook I’d strongly consider moving all operations out of Brazil. Operating in that country has obviously become a liability.

Now Your Water Pitcher Can Be A Network Vulnerability


This Internet of Things will get out of control.

Everybody is rushing to either “cloud” enable their products or make it part of the Internet of things. There are countless examples of this nonsense. Now we even have water pitchers with fucking Wi-Fi capabilities:

Starting today, Brita will sell a sensor-filled, WiFi-connected Brita pitcher (yes, you read that correctly) that will work with Dash Replenishment Service.

The new pitcher, called the Brita Infinity pitcher, will be able to track how much water is flowing through the pitcher. When approximately 40 gallons of water have passed through the pitcher’s purification filter, the pitcher will then send a signal to the Dash Replenishment Service to reorder more filters.

Instead of a water pitcher whose filter you replace whenever your water starts to taste funky, you can have that plus concerns about battery life, whether the pitcher is accurately measuring water usage and not shaving a bit off of the top to increase Brita’s profits, and network security too!

We’re at the point where we need to strongly consider separate wireless networks and VLANs for our Internet enabled devices. The utter lack of security concern most Internet of Things manufacturers have shown so far makes these devices too dangerous to let onto our usual networks, but the technology is becoming so pervasive that simply ignoring it will become increasingly difficult.

Backup Locally

There is no cloud, there are only other people’s computers. This is a phrase you should have tattooed to the inside of your eyelids so you can contemplate it every night. It seems like every company is pushing people to store their data in “the cloud.” Apple has iCloud, Google has its Cloud Platform, Microsoft has Azure, and so on. While backing up to “the cloud” is convenient it also means your data is sitting on somebody else’s computer. In all likelihood that data was uploaded in plaintext as well so it’s readable to the owner of the server.

I have good news though! You don’t have to upload your data to somebody else’s computer! If you use an iPhone it’s actually very easy to make local backups:

If you’re looking for comprehensive privacy, including protection from law enforcement entities, there’s still a loophole here: iCloud. Apple encourages the use of this service on every iPhone, iPad, and iPod Touch that it sells, and when you do use the service, it backs up your device every time you plug it into its power adapter within range of a known Wi-Fi network. iCloud backups are comprehensive in a way that Android backups still aren’t, and if you’ve been following the San Bernardino case closely, you know that Apple’s own legal process guidelines (PDF) say that the company can hand iMessages, SMS/MMS messages, photos, app data, and voicemail over to law enforcement in the form of an iOS device backup (though some reports claim that Apple wants to strengthen the encryption on iCloud backups, removing the company’s ability to hand the data over to law enforcement).

For most users, this will never be a problem, and the convenience of iCloud backups and easy preservation of your data far outweigh any risks. For people who prefer full control over their data, the easiest option is to stop using iCloud and use iTunes instead. This, too, is not news, and in some ways is a regression to the days before iOS 5 when you needed to use a computer to activate, update, and back up your phone at all. But there are multiple benefits to doing local backups, so while the topic is on everyone’s mind we’ll show you how to do it (in case you don’t know) and what you get from it (in case you don’t know everything).

I back up my iPhone locally and you should too. My local backups are encrypted by iTunes and are stored on fully encrypted hard drives, which is a strategy I also encourage you to follow. Besides enhancing privacy by not making my data available to Apple and any court orders it receives, this setup also prevents my data from being obtained if Apple’s iCloud servers are breached (which has happened).

iPhones aren’t the only devices that can be backed up locally. Most modern operating systems have built-in backup tools that clone data to external hard drives. In my opinion these are far superior to “cloud” backup services. If you back up to fully encrypted hard drives you ensure that your data isn’t easily accessible to unauthorized parties. And you can store some of your encrypted backup drives offsite, say at your parents’ house or place of work, to ensure everything isn’t lost if your house burns to the ground.
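The idea is simple enough to sketch in a few lines of Python. This is a toy illustration of the local-backup workflow, not a replacement for your operating system’s backup tool; the paths are hypothetical, and the encryption is assumed to come from storing the destination on a fully encrypted drive as described above:

```python
import hashlib
import tarfile
from datetime import datetime
from pathlib import Path

def make_local_backup(source_dir, backup_dir):
    """Archive source_dir into a timestamped tarball inside backup_dir.

    backup_dir is assumed to live on a fully encrypted external drive,
    which is what actually protects the data at rest.
    """
    backup_path = Path(backup_dir)
    backup_path.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = backup_path / f"backup-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)
    return archive

def checksum(path):
    """SHA-256 digest of the archive so an offsite copy can be verified."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Recording the checksum alongside the drive you hand to your parents means you can later confirm the offsite copy hasn’t rotted or been tampered with.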

Don’t rely on other people’s computers.

It’s Not Just One iPhone The FBI Wants Unlocked

There are people siding with the Federal Bureau of Investigation (FBI) in its current court battle with Apple. These misguided souls are claiming, amongst other nonsensical things, that the FBI only wants a single iPhone unlocked. They believe that it’s somehow OK for Apple to open Pandora’s box by releasing a signed firmware with a backdoor in it so long as it’s only used to unlock a single iPhone. Unfortunately, as those of us siding with Apple have been pointing out, this case isn’t about a single iPhone. The FBI wants a court precedent so it can coerce Apple into unlocking other iPhones:

In addition to the iPhone used by one of the San Bernardino shooters, the US government is pursuing court orders to force Apple to help bypass the security passcodes of “about a dozen” other iPhones, the Wall Street Journal reports. The other cases don’t involve terror charges, the Journal’s sources say, but prosecutors involved have also sought to use the same 220-year-old law — the All Writs Act of 1789 — to access the phones in question.

By setting a precedent in the San Bernardino case the FBI would have grounds to coerce Apple, and other device manufacturers, into unlocking other devices. We know the FBI already has a dozen or so phones in the pipeline and it will certainly have more in the coming years.

Besides the precedent there is also the problem of the firmware itself. If Apple creates a signed firmware that disables iOS security features and automates brute forcing passcodes, it could be installed on other iPhones (at least other iPhone 5Cs, but possibly other models). With this firmware in hand the FBI wouldn’t even need to coerce Apple into helping each time; the agency could simply install the firmware on any compatible device itself. This is why Apple believes creating such a firmware is too dangerous.
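It’s worth spelling out why stripping those security features is so devastating. A four-digit passcode has only 10,000 possibilities; the escalating delays and the erase-after-ten-failures setting are the only things that make such a tiny keyspace survivable. A minimal sketch of the attack, as a toy model (a real iPhone also entangles the passcode with a key fused into the hardware, which this deliberately omits):

```python
import hashlib
import os

def derive_key(passcode, salt):
    # Deliberately slow key derivation (PBKDF2); real devices slow each
    # guess down further by entangling a hardware-fused key.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 50_000)

def brute_force(stored_key, salt):
    # With no escalating lockouts and no erase-after-10-failures, all
    # 10,000 four-digit passcodes can simply be tried in order.
    for n in range(10_000):
        guess = f"{n:04d}"
        if derive_key(guess, salt) == stored_key:
            return guess
    return None
```

Even with a slow key-derivation function, the whole four-digit space falls in minutes on a laptop, which is why the firmware the FBI wants amounts to a skeleton key for any compatible device it can be flashed onto.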

You can never believe the government when it claims to be taking an exceptional measure just once. Those exceptional measures always become standard practice.

Bill Gates Sides With The FBI

Microsoft has always enjoyed a cozy relationship with the State. This isn’t surprising to anybody who has paid attention to Bill Gates and his ongoing love affair with the State. It’s also not surprising that he is siding with the Federal Bureau of Investigation (FBI) against Apple:

Technology companies should be forced to cooperate with law enforcement in terrorism investigations, Gates said, according to a Financial Times story posted late Monday.

“This is a specific case where the government is asking for access to information. They are not asking for some general thing, they are asking for a particular case,” he said.

This statement by Gates is laughable. The FBI is demanding Apple create a custom signed version of iOS that omits several security features and includes built-in software to brute force the decryption key set by the user. That is not a narrow request for a particular case; it’s a general tool that can be used on many iPhones.

What is funny about this though is that Bill Gates tried to backpedal but in so doing only said exactly the same thing over again:

In an interview with Bloomberg, Bill Gates says he was “disappointed” by reports that he supported the FBI in its legal battle with Apple, saying “that doesn’t state my view on this.”

Still, Gates took a more moderate stance than some of his counterparts in the tech industry, not fully backing either the FBI or Apple but calling for a broader “discussion” on the issues. “I do believe that with the right safeguards, there are cases where the government, on our behalf — like stopping terrorism, which could get worse in the future — that that is valuable.” But he called for “striking [a] balance” between safeguards against government power and security.

Any “balance” would require Apple to create firmware that includes a backdoor for government use. In other words, it would require exactly what the FBI is demanding of Apple.

Cryptography is math and math belongs to that very small category of things that are either black or white. Either the cryptography you’re using is effective and only allows authorized parties to access the unencrypted content or it is ineffective. There is no middle ground. You cannot break cryptography just a little bit.
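The arithmetic behind that claim is stark. Assuming an attacker who can test a trillion keys per second (an assumed rate for illustration; well-funded agencies field hardware in this ballpark), compare a full-strength key against one weakened “just a little bit”:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def brute_force_seconds(key_bits, guesses_per_second=10**12):
    # Worst-case time to exhaust a keyspace of 2**key_bits guesses.
    return (2 ** key_bits) / guesses_per_second

# A full 128-bit key is out of reach for any conceivable attacker:
# the exhaust time runs to more than a billion billion years.
years_for_128_bit = brute_force_seconds(128) / SECONDS_PER_YEAR

# A key weakened to 40 bits, by contrast, falls in about one second.
seconds_for_40_bit = brute_force_seconds(40)
```

There is no dial that weakens encryption only for the government; shrinking the effective keyspace collapses security for everyone at once.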

Although the existence of a version of iOS with a backdoor is frightening in and of itself, the idea that a single judge can enslave software developers by issuing a writ is terrifying. That’s an aspect of this case that is getting glossed over a lot. Apple has already publicly stated it has no desire to write a weakened version of iOS. If the court sides with the FBI it will try to force Apple to write software against its will. Why should any individual have the power to legally do that?

The Public-Private Surveillance Partnership Strikes Again

As a history buff, I’ve always found Ancestry.com interesting. I’d love to trace back my family lineage. But the public-private surveillance partnership has held me back.

I figured it was only a matter of time until government agents began demanding genetic records from services like Ancestry.com and 23andMe. Once again my paranoia turned out to be prophetic (not because I’m so smart but because it was so bloody obvious):

Now, five years later, when 23andMe and Ancestry both have over a million customers, those warnings are looking prescient. “Your relative’s DNA could turn you into a suspect,” warns Wired, writing about a case from earlier this year, in which New Orleans filmmaker Michael Usry became a suspect in an unsolved murder case after cops did a familial genetic search using semen collected in 1996. The cops searched an Ancestry.com database and got a familial match to a saliva sample Usry’s father had given years earlier. Usry was ultimately determined to be innocent and the Electronic Frontier Foundation called it a “wild goose chase” that demonstrated “the very real threats to privacy and civil liberties posed by law enforcement access to private genetic databases.”

[…]

Both Ancestry.com and 23andMe stipulate in their privacy policies that they will turn information over to law enforcement if served with a court order. 23andMe says it’s received a couple of requests from both state law enforcement and the FBI, but that it has “successfully resisted them.”

As a general rule I’m wary of any service that collects information the State wouldn’t normally have. I know any personal information collected on me by a service provider is a single court order away from being in the hands of the State.

This is a problem many libertarians fail to fully realize. They make a stark distinction between corporate and government surveillance and fail to realize the former becomes the latter at the whim of a judge. If it wasn’t for the State’s power to obtain private records I wouldn’t be as concerned with corporate surveillance since companies aren’t in a habit of sending armed goons to my door to shoot my dog and kidnap me.

Google Releases RCS Client. It’s Backdoored.

With the recent kerfuffle between Apple and the Federal Bureau of Investigation (FBI), the debate between secure and insecure devices is in the spotlight. Apple has been marketing itself as a company that defends users’ privacy and this recent court battle gives merit to its claims. Other companies, including Google, have expressed support for Apple’s decision to fight the FBI’s demand. That makes this next twist in the story interesting.

Yesterday Christopher Soghoian posted a Tweet linking to a comment on a Hacker News thread discussing Google’s new Rich Communication Services (RCS) client, Jibe. What’s especially interesting about RCS is that it appears to include a backdoor, as noted in the Hacker News thread:

When using MSRPoTLS, and with the following two objectives allow compliance with legal interception procedures, the TLS authentication shall be based on self-signed certificates and the MSRP encrypted connection shall be terminated in an element of the Service Provider network providing service to that UE. Mutual authentication shall be applied as defined in [RFC4572].

It’s important to note that this doesn’t really change anything from the current Short Message Service (SMS) and cellular voice protocols, which offer no real security. By using this standard Google isn’t introducing a new security hole. However, Google also isn’t fixing a known one.
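What “TLS authentication based on self-signed certificates” with provider-side termination amounts to in practice is a client that cannot tell the real server from an interception box. The RCS spec itself has nothing to do with Python, but the stdlib `ssl` module makes the trade-off easy to see in a short sketch:

```python
import ssl

# Normal TLS: the client verifies the server's certificate chain
# against trusted certificate authorities and checks the hostname.
strict = ssl.create_default_context()

# Accepting self-signed certificates, as the spec quoted above mandates
# for "legal interception procedures", means abandoning both checks.
# The client can no longer distinguish the real endpoint from a box
# terminating the "encrypted" connection inside the carrier's network.
permissive = ssl.create_default_context()
permissive.check_hostname = False  # must be disabled before verify_mode
permissive.verify_mode = ssl.CERT_NONE
```

Once certificate verification is gone, the encryption on the wire protects against casual eavesdroppers but not against the service provider, which is exactly the design goal the spec states.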

When Apple created iMessage and FaceTime it made use of strong end-to-end encryption (although that doesn’t protect your messages if you back them up to iCloud). Apple’s replacement for SMS and standard cellular calls addressed a known security hole.

Were I Google, especially with the security debate going on, I would have avoided embracing RCS since it’s insecure by default. RCS may be an industry standard, since it’s managed by the same association that manages Global System for Mobile Communications (GSM), but it’s a bad standard that shouldn’t see widespread adoption.

Legalizing Slavery

The United States has a long history of slavery. From the very beginning of this country through the end of the Civil War black individuals could be owned as slaves in many states. After that the rules were changed. Private ownership of slaves was deemed illegal (a very good thing) but the State gave itself permission to enslave anybody it arbitrarily labeled a criminal (a very bad thing). Eventually the process was streamlined and Federal Prison Industries (UNICOR) was created to manage the federally owned slaves. Individual states used this precedent to establish their own government-owned corporations to manage their slaves.

Now a congressman is looking to change the rules yet again by expanding the State’s ability to own slaves. If passed, this bill will allow the State to enslave anybody by issuing a simple court order:

Sen. Richard Burr (R-North Carolina), the chairman of the Senate Intelligence Committee, reportedly will introduce legislation soon to criminalize a company’s refusal to aid decryption efforts as part of a governmental investigation. The news was first reported Thursday afternoon by the Wall Street Journal.

Aiding decryption efforts requires labor. In the San Bernardino case the Federal Bureau of Investigation (FBI) is ordering Apple to create a custom version of iOS that removes several key security features. Apple has refused, and it has every right to do so because nobody should be compelled to perform labor against their will. If the FBI wants the phone unlocked so badly it can either put in the effort itself or hire somebody willing to try.

We’re living in interesting times. The State is seeing less and less reason to conceal its intentions.

Private Surveillance

Although public surveillance frightens me more because the consequences are generally more dire, I don’t shy away from criticizing private surveillance either. This is where I often part company with other libertarians, who instinctively say private surveillance, because it’s voluntary, is entirely acceptable. That attitude is overly simplistic. First, private surveillance often turns into public surveillance. Second, the market manipulations performed by the State have raised the consequences of private surveillance even when it doesn’t turn into public surveillance.

Consider health insurance. For most people health insurance is tied to their employment. This practice is a holdover from World War II, when the State manipulated the market in such a way that employers had to find forms of compensation besides pay to attract employees:

There is no good reason for any of this, aside from historical accident. During World War II, federal wage controls prevented employers from wooing workers with higher pay, so companies started offering health insurance as a way around the law. Of course, this form of nonmonetary compensation is still pay. When the war ended, the practice stuck.

I doubt the long term consequences of this marriage were realized by the employers who first used health insurance as a means to attract employees. Fast forward many decades later and we have a relationship so tight that employers are surveilling their employees’ health data:

Employee wellness firms and insurers are working with companies to mine data about the prescription drugs workers use, how they shop, and even whether they vote, to predict their individual health needs and recommend treatments.

Trying to stem rising health-care costs, some companies, including retailer Wal-Mart Stores Inc., are paying firms like Castlight Healthcare Inc. to collect and crunch employee data to identify, for example, which workers are at risk for diabetes, and target them with personalized messages nudging them toward a doctor or services such as weight-loss programs.

One of the downsides of employers providing health insurance is that they front a lot of the costs. Employers, like everybody else, have an interest in keeping their costs down. Now, instead of minding their own business, employers are trying to snoop on their employees’ health care information.

Health care information is something most people see as confidential. It can reveal a lot of potentially embarrassing things about a person such as having a sexually transmitted disease or mental illness. Unless your health is preventing you from working it shouldn’t be the business of your employer and most likely wouldn’t be if your health insurance wasn’t tied to your employment status.

This is why I respect Samuel Edward Konkin III more than most libertarian philosophers. His philosophy, agorism, argues for the death of wage labor. Instead it encourages everybody to be an entrepreneur who contracts directly with others. This is a stark contrast to the many libertarian philosophers who seem to encourage wage labor.

The more independent you are the more free you are. By moving away from wage labor an individual becomes more independent and therefore more free. If you’re your own employer then you are free from worries of being surveilled and possibly fired for simply being too expensive to insure.

Doublethink

In George Orwell’s Nineteen Eighty-Four doublethink is described as, “The power of holding two contradictory beliefs in one’s mind simultaneously, and accepting both of them… To tell deliberate lies while genuinely believing in them, to forget any fact that has become inconvenient, and then, when it becomes necessary again, to draw it back from oblivion for just as long as it is needed, to deny the existence of objective reality and all the while to take account of the reality which one denies – all this is indispensably necessary.” That is the most accurate term to describe the White House’s claim that what the Federal Bureau of Investigation (FBI) is demanding of Apple isn’t a back door:

The White House says a court ruling asking Apple to help the FBI access data on a phone belonging to the San Bernardino gunman does not mean asking for a “back door” to the device.

By definition a backdoor, as it pertains to security, is a purposely placed mechanism that allows an unauthorized party to bypass security measures. What the FBI is asking Apple to develop is a special version of iOS that attempts to brute force the device’s passcode and contains neither the escalating lockout delays triggered by incorrect passcode entries nor the functionality that erases the phone after 10 incorrect attempts. The FBI is asking for a backdoor.

Just because the FBI is demanding this special firmware for a specific iPhone doesn’t mean the firmware isn’t a backdoor. But through the magic of doublethink the White House is able to claim what the FBI is demanding isn’t a backdoor.