Archive for the ‘Technology’ Category
Whenever I discuss secure communications I try to hammer home the difference between confidentiality and anonymity. Most popular secure communication services such as Signal and WhatsApp provide the former but not the latter. This means unauthorized users cannot read the communications but they can find out which parties are communicating.
Another thing I try to hammer home is that not all forms of anonymity are equal. Several services claim to offer anonymous communications. These services don’t claim to offer confidentiality (the posts are public), but they do claim to conceal your identity. However, they tend to use a loose definition of anonymity:
On Sunday, a North Carolina man named Garrett Grimsley made a public post on Whisper that sounded an awful lot like a threat. “Salam, some of you are alright,” the message read, “don’t go to [Raleigh suburb] Cary tomorrow.”
When one user asked for more information, Grimsley (who is white) responded with more Islamic terms. “For too long the the kuffar have spit in our faces and trampled our rights,” he wrote. “This cannot continue. I cannot speak of anything. Say your dua, sleep, and watch the news tomorrow.”
Within 24 hours, Grimsley was in jail. Tipped off by the user who responded, police ordered Whisper to hand over all IP addresses linked to the account. When the company complied, the IP address led them to Time Warner, Grimsley’s ISP, which then provided Grimsley’s address.
There’s a great deal of difference between anonymity as it pertains to other users and anonymity as it pertains to service providers. Whisper’s definition of anonymity is that users of the service can’t identify other users. Whisper itself can identify users. This is different from a Tor hidden service, where the user can’t identify the service provider and the service provider can’t identify the user.
When you’re looking at communication services make sure you understand what is actually being offered before relying on it.
When you purchase a computer do you own it? What about your cell phone? Or your automobile? At one time the answer to these questions was an absolute yes. Today, not so much:
Cars, refrigerators, televisions, Barbie dolls. When people buy these everyday objects, they rarely give much thought to whether or not they own them. We pay for them, so we think of them as our property. And historically, with the exception of the occasional lease or rental, we owned our personal possessions. They were ours to use as we saw fit. They were free to be shared, resold, modified, or repaired. That expectation is a deeply held one. When manufacturers tried to leverage the DMCA to control how we used our printers and garage door openers, a big reason courts pushed back was that the effort was so unexpected, so out of step with our understanding of our relationship to the things we buy.
But in the decade or so that followed those first bumbling attempts, we’ve witnessed a subtler and more effective strategy for convincing people to cede control over everyday purchases. It relies less—or at least less obviously—on DRM and the threat of DMCA liability, and more on the appeal of new product features, and in particular those found in the smart devices that make up the so-called Internet of Things (IoT).
I’ve annoyed many electrons criticizing the concept of intellectual property. The idea that somebody has a government-granted monopoly on something simply because they were the first to receive a patent is absurd in my opinion. But we live with much more absurd ideas today. Due to the way software copyright and patent laws work, if a company loads software onto a device it can effectively prevent anybody from owning that device. At most a buyer can acquire a limited use license for it.
Combining software copyright and patent laws with the Internet of Things (IoT) just amplifies this. Now there are a bunch of devices on the market that rely on continuous Internet access to the manufacturers’ servers. If the manufacturer decides to drop support for the product, it stops working. This wouldn’t be as big of an issue if laws such as the Digital Millennium Copyright Act (DMCA) didn’t make it illegal for you to hack the device and load your own software onto it to keep it working.
Right now we’re dealing with relatively cheap IoT devices. If your $99 Internet connected thermostat stops working it sucks, but it’s not so expensive that it can’t be replaced. But what happens when IoT comes to, say, automobiles? What happens when critical functions on an automobile cease to work because the manufacturer decides to drop support for one of the Internet connected components? Suddenly you’re not talking about throwing away a $99 device but a machine that cost you tens of thousands of dollars. Although this scenario might sound absurd to some, I guarantee that it will happen at some point if software copyright and patent laws continue to be enforced as they have been.
People don’t appreciate how awesome the future we live in today really is. Compare the life you live with the life lived by some of history’s wealthiest people:
If you were a 1916 American billionaire you could, of course, afford prime real-estate. You could afford a home on 5th Avenue or one overlooking the Pacific Ocean or one on your own tropical island somewhere (or all three). But when you traveled from your Manhattan digs to your west-coast palace, it would take a few days, and if you made that trip during the summer months, you’d likely not have air-conditioning in your private railroad car.
And while you might have air-conditioning in your New York home, many of the friends’ homes that you visit — as well as restaurants and business offices that you frequent — were not air-conditioned. In the winter, many were also poorly heated by today’s standards.
To travel to Europe took you several days. To get to foreign lands beyond Europe took you even longer.
Might you want to deliver a package or letter overnight from New York City to someone in Los Angeles? Sorry. Impossible.
You could neither listen to radio (the first commercial radio broadcast occurred in 1920) nor watch television. You could, however, afford the state-of-the-art phonograph of the era. (It wasn’t stereo, though. And — I feel certain — even today’s vinylphiles would prefer listening to music played off of a modern compact disc to listening to music played off of a 1916 phonograph record.) Obviously, you could not download music.
While I spend a lot of time complaining about horrors statism has wrought upon us, we live better today than anybody did at any point in history thanks to the wonders of the market. And since technology is cumulative the rate of advancement is even more rapid, which means our lives are improving faster than the lives of people in the past. For example, in my fairly short lifetime home Internet access went from nonexistent to dial-up to fiber directly into the home. The computing power available in my phone wasn’t available to the consumer market for any price when I was young. Even simple toys, such as Nerf guns, have improved a lot since my childhood. Kids today have electrically powered fully automatic Nerf guns, something young me could only dream of. Although various diseases such as cancer are still a scourge, our chances of surviving them have increased significantly.
While there are a lot of terrible things going on in this world, don’t forget that the present is an overall great time to be alive.
The privacy-surveillance arms race will likely be waged eternally. The State wants to spy on people so it can better expropriate their wealth. Private companies want to spy on people so they can collect data to better serve them and better target ads at them. The State wants the private companies to spy on their users because it can get that information via a subpoena. Meanwhile, users are stuck being constantly watched.
Browser fingerprinting is one of the more effective tools in the private companies’ arsenal. Without having to store data on users’ systems, private companies are able to use the data surrendered by browsers to track users with a surprising degree of accuracy. But fingerprinting has been limited to individual browsers. If a user switches browsers their old fingerprint is no longer valid… until now:
The new technique relies on code that instructs browsers to perform a variety of tasks. Those tasks, in turn, draw on operating-system and hardware resources—including graphics cards, multiple CPU cores, audio cards, and installed fonts—that are slightly different for each computer. For instance, the cross-browser fingerprinting carries out 20 carefully selected tasks that use the WebGL standard for rendering 3D graphics in browsers. In all, 36 new features work independently of a specific browser.
New browser features are commonly used for tracking users. In time those features are usually improved in such a way that tracking becomes more difficult. I have no doubt that WebGL will follow this path as well. Until it is improved though, it wouldn’t be a bad idea to disable it if you’re trying to avoid being tracked.
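The core idea behind this kind of fingerprinting can be boiled down to a short sketch. The following is an illustrative toy, not the researchers’ actual code; the feature names are hypothetical stand-ins for the kinds of hardware-level signals the article describes (WebGL rendering output, CPU core count, installed fonts, audio stack quirks). Because those signals come from the hardware and operating system rather than from the browser, hashing them produces the same identifier no matter which browser reported them:

```python
import hashlib
import json

def cross_browser_fingerprint(features: dict) -> str:
    """Derive a stable identifier from hardware/OS-level measurements.

    The same machine should produce the same feature values regardless
    of which browser is used, so the hash survives a browser switch.
    """
    # Canonicalize (sorted keys) so the same measurements always hash
    # to the same value regardless of the order they were collected in.
    canonical = json.dumps(features, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical measurements from one machine. "webgl_render_hash" stands
# in for the output of the 20 WebGL drawing tasks mentioned above.
machine = {
    "webgl_render_hash": "a3f1c9",
    "cpu_cores": 8,
    "installed_fonts": ["Arial", "Consolas", "Times New Roman"],
    "audio_sample_hash": "77b0d2",
}

print(cross_browser_fingerprint(machine))
```

The takeaway is that nothing here is stored on the user’s system, which is why clearing cookies or switching browsers doesn’t help: as long as the hardware measurements are reproducible, the identifier is too.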
Customs in the United States has become nosier every year. It makes one wonder how a person can enter the country without surrendering their digital life by granting access to their devices. Wired put together a decent guide for dealing with customs. Of the tips there is one that I highly recommend:
Make a Travel Kit
For the most vulnerable travelers, the best way to keep customs away from your data is simply not to carry it. Instead, like Lackey, set up travel devices that store the minimum of sensitive data. Don’t link those “dirty” devices to your personal accounts, and when you do have to create a linked account—as with iTunes for iOS devices—create fresh ones with unique usernames and passwords. “If they ask for access and you can’t refuse, you want to be able to give them access without losing any sensitive information,” says Lackey.
Social media accounts, admittedly, can’t be so easily ditched. Some security experts recommend creating secondary personas that can be offered up to customs officials while keeping a more sensitive account secret. But if CBP agents do link your identity with an account you tried to hide, the result could be longer detention and, for non-citizens, even denial of entry.
I believe that I first came across this advice on Bruce Schneier’s blog. Instead of traveling with a device that contains all of your information you should consider traveling with a completely clean device and accessing the information you need via a Virtual Private Network (VPN) when you reach your destination. When you’re ready to return home wipe all of the data.
The most effective way to defend against the snoops at the border is to not have any data for them to snoop.
The other tips are good to follow as well but aren’t as effective as simply not having any data in the first place. But I understand that isn’t always feasible. In cases where you’re traveling somewhere that has unreliable Internet connectivity, for example, you will need to bring the data you need with you. If you’re in such a situation I recommend only bringing the data you absolutely need.
Is your vehicle a snitch? If you have a modern vehicle, especially one with Internet connectivity, the answer is almost certainly yes:
One of the more recent examples can be found in a 2014 warrant that allowed New York police to trace a vehicle by demanding the satellite radio and telematics provider SiriusXM provide location information. The warrant, originally filed in 2014 but only recently unsealed (and published below in full), asked SiriusXM “to activate and monitor as a tracking device the SIRIUS XM Satellite Radio installed on the Target Vehicle for a period of 10 days.” The target was a Toyota 4-Runner wrapped up in an alleged illegal gambling enterprise.
So it was that in December 2009 police asked GM to cough up OnStar data from a Chevrolet Tahoe rented by a suspected crack cocaine dealer Riley Dantzler. The cops who were after Dantzler had no idea what the car looked like or where it was, but with OnStar tracking they could follow him from Houston, Texas, to Ouchita Parish, Louisiana. OnStar’s tracking was accurate too, a court document revealing it was able to “identify that vehicle among the many that were on Interstate 20 that evening.” They stopped Dantzler and found cocaine, ecstasy and a gun inside.
In at least two cases, individuals unwittingly had their conversations listened in on by law enforcement. In 2001, OnStar competitor ATX Technologies (which later became part of Agero) was ordered to provide “roving interceptions” of a Mercedes Benz S430V. It initially complied with the order in November of that year to spy on audible communications for 30 days, but when the FBI asked for an extension in December, ATX declined, claiming it was overly burdensome. (The filing on the FBI’s attempt to find ATX in contempt of court is also published below).
As a quick aside, it should also be noted that the cell phone you carry around contains the hardware necessary to perform these same forms of surveillance. So don’t start bragging about the old vehicle you drive if you’re carrying around a cell phone.
There are two major problems here. The first problem is technological and the second is statism. There’s nothing wrong with adding more technological capabilities to a vehicle. However, much like the Internet of Things, automobile manufacturers have a terrible track record when it comes to computer security. For example, having a built-in communication system like OnStar isn’t bad in and of itself, but when it can be remotely activated a lot of security questions come into play.
The second problem is statism. Monitoring technologies that can be remotely activated are dangerous in general but become even more dangerous in the hands of the State. As this story demonstrated, the combination of remotely activated microphones and statism leads to men with guns kidnapping people (or possibly worse).
Everything in this story is just the tip of the iceberg though. As more technology is integrated into automobiles the State will also integrate itself more. I have no doubt that at some point a law will be passed that will require all automobiles to have a remotely activated kill switch. It’ll likely be proposed shortly after a high speed chase that ends in an officer getting killed and will be sold to the public as necessary for protecting the lives of our heroes in blue. As self-driving cars become more popular there will likely be a law passed that requires self-driving cars to have a remotely accessible autopilot mode so police can command a car to pull over for a stop or drive to the courthouse if somebody is missing their court date.
Everything that could be amazing will end up being shit because the State will decide to meddle. The State is why we can’t have nice things.
Just as the Austrian school of economics has a business cycle I have a data cycle. The Public Private Data Cycle (catchier web 3.0 buzzword compliant name coming later) states that all privately held data becomes government data with a subpoena and all government data becomes privately held data with a leak.
The Public Private Data Cycle is important to note whenever somebody discusses keeping data on individuals. For example, many libertarians don’t worry much about the data Facebook collects because Facebook is a private company. The very same people will flip out whenever the government wants to collect more data though. Likewise, many statists don’t worry much about the data the government collects because the government is a public entity. The very same people will flip out whenever Facebook wants to collect more data though. Both of these groups have a major misunderstanding about how data access works.
I’ve presented several cases on this blog illustrating how privately held data became government data with a subpoena. But what about government data becoming privately held data? The State of California recently provided us with such an example:
Our reader Tom emailed me after he had been notified by the state of California that his personal information had been compromised as a result of a California Public Records Act request. Based on the limited information that we have at this time, it appears that the released information included names, the instructors’ dates of birth, and the instructors’ California driver’s license numbers and/or California ID card numbers.
When Tom reached out to the CA DOJ he was informed that the entire list of firearms trainers in California had been released in the public records act request. The state of California is sending letters to those affected with the promise of 12 months of identity protection, but if you are a CA firearms instructor and haven’t seen a letter, it might be a good idea to call the DOJ to see if you were affected.
This wasn’t a case of a malicious hacker gaining access to California’s database. The state accidentally handed out this data in response to a public records request. Now that government held data about firearm instructors is privately held by an unknown party. Sure, the State of California said it ordered the recipient to destroy the data, but as we all know, once data has been accessed by an unauthorized party there’s no way to control it.
If data exists, the chance of it being accessed by an unauthorized party is greater than zero. That’s why everybody should be wary of any attempt by anybody to collect more data on individuals.
People seem to misunderstand the Health Insurance Portability and Accountability Act (HIPAA). I often hear people citing HIPAA as proof that their medical data is private. However, misunderstandings aren’t reality. Your medical data isn’t private. In fact, it’s for sale:
Your medical data is for sale – all of it. Adam Tanner, a fellow at Harvard’s institute for quantitative social science and author of a new book on the topic, Our Bodies, Our Data, said that patients generally don’t know that their most personal information – what diseases they test positive for, what surgeries they have had – is the stuff of multibillion-dollar business.
The trick is that the data is “anonymized” before it is sold. I used quotation marks in that case because anonymized can mean different things to different people. To me, anonymized means the data has been scrubbed in such a way that it cannot be tied to any individual. This is a very difficult standard to meet though. To others, such as those who are selling your medical data, anonymized simply means replacing the name, address, and phone number of a patient with an identifier. But simply removing a few identifiers doesn’t cut it in the age of big data:
But other forms of data, such as information from fitness devices and search engines, are completely unregulated and have identities and addresses attached. A third kind of data called “predictive analytics” cross-references the other two and makes predictions about behavior with what Tanner calls “a surprising degree of accuracy”.
None of this technically violates the health insurance portability and accountability act, or Hipaa, Tanner writes. But the techniques do render the protections of Hipaa largely toothless. “Data scientists can now circumvent Hipaa’s privacy protections by making very sophisticated guesses, marrying anonymized patient dossiers with named consumer profiles available elsewhere – with a surprising degree of accuracy,” says the study.
With the vast amount of data available about everybody, identifying who “anonymized” data applies to is not as difficult as most people think.
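A toy sketch makes the linkage technique concrete. This is not code from Tanner’s book and all the records below are made up; it just illustrates how replacing a name with an identifier does nothing when the remaining quasi-identifiers (ZIP code, birth year, sex) can be joined against a named consumer database:

```python
# "Anonymized" medical records: names stripped, but quasi-identifiers kept.
anonymized_medical = [
    {"patient_id": "X-1042", "zip": "27513", "birth_year": 1981, "sex": "M",
     "diagnosis": "type 2 diabetes"},
    {"patient_id": "X-2210", "zip": "90210", "birth_year": 1974, "sex": "F",
     "diagnosis": "hypertension"},
]

# A named consumer database, e.g. from marketing data brokers.
named_consumers = [
    {"name": "John Doe", "zip": "27513", "birth_year": 1981, "sex": "M"},
    {"name": "Jane Roe", "zip": "90210", "birth_year": 1974, "sex": "F"},
]

def reidentify(medical, consumers):
    """Link records whose quasi-identifiers (zip, birth year, sex) match."""
    key = lambda r: (r["zip"], r["birth_year"], r["sex"])
    names_by_key = {key(c): c["name"] for c in consumers}
    return {m["patient_id"]: names_by_key.get(key(m)) for m in medical}

print(reidentify(anonymized_medical, named_consumers))
```

With enough quasi-identifiers the match is usually unique, which is exactly the “sophisticated guessing” the study describes: the medical record was never tied to a name, yet the name falls out of the join.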
HIPAA was written by an organization that hates privacy, so it’s not surprising to see that the law failed to protect anybody’s privacy. This is also why legislation won’t fix this problem. The only way to fix this problem is to either incentivize medical professionals to keep patient data confidential or to give exclusive control of a patient’s data to that patient.
The media’s portrayal of hackers is never accurate but almost always amusing. From hooded figures stooping over keyboards and looking at green ones and zeros on a black screen to balaclava clad individuals holding a laptop in one hand while they furiously type with the other hand, the creative minds behind the scenes at major media outlets always have a way to make hackers appear far more sinister than they really are.
CNN recently aired a segment about Russian hackers. How did the creative minds at CNN portray hackers to the viewing public? By showing a mini-game from a game you may have heard of:
In a recent story about President Obama proposing sanctions against Russia for its role in cyberattacks targeting the United States, CNN grabbed a screenshot of the hacking mini-game from the extremely popular RPG Fallout 4. First spotted by Reddit, the screenshot shows the menacing neon green letters that gamers will instantly recognize as being from the game.
Personally, I would have lifted a screenshot from the hacking mini-game in Deus Ex; it looks far more futuristic.
A lot of electrons have been annoyed by all of the people flipping out about fake news. But almost no attention has been paid to uninformed news. Most major media outlets are woefully uninformed about many (most?) of the subjects they report on. If you know anything about guns or technology you’re familiar with the amount of inaccurate reporting that occurs because of the media’s lack of understanding. When the outlet reporting on a subject doesn’t know anything about the subject the information they provide is worthless. Why aren’t people flipping out about that?
Voice activated assistants such as the Amazon Echo and Google Home are becoming popular household devices. With a simple voice command these devices allow you to do anything from turning on your smart lightbulbs to playing music. However, any voice activated device must necessarily be listening at all times, and law enforcers know that:
Amazon’s Echo devices and its virtual assistant are meant to help find answers by listening for your voice commands. However, police in Arkansas want to know if one of the gadgets overheard something that can help with a murder case. According to The Information, authorities in Bentonville issued a warrant for Amazon to hand over any audio or records from an Echo belonging to James Andrew Bates. Bates is set to go to trial for first-degree murder for the death of Victor Collins next year.
Amazon declined to give police any of the information that the Echo logged on its servers, but it did hand over Bates’ account details and purchases. Police say they were able to pull data off of the speaker, but it’s unclear what info they were able to access.
While Amazon declined to provide any server side information logged by the Echo there’s no reason a court order couldn’t compel Amazon to provide such information. In addition to that, law enforcers also managed to pull some unknown data locally from the Echo. Those two points raise questions about what kind of information devices like the Echo and Home collect as they’re passively sitting on your counter awaiting your command.
As with much of the Internet of Things, I haven’t purchased one of these voice activated assistants and have no plans to buy one anytime in the near future. They’re too big of a privacy risk for my tastes since I don’t even know what kind of information they’re collecting as they sit there listening.