The Next Stage In 3D Printed Firearms

Proving once again that technology overcomes legal restrictions, a new stage in 3D printed firearms has been reached. Instead of a single shot pistol that’s difficult to reload we now have a 3D printed semiautomatic 9mm handgun:

Last weekend a 47-year-old West Virginia carpenter who goes by the pseudonym Derwood released the first video of what he calls the Shuty-MP1, a “mostly” 3-D printed semi-automatic firearm. Like any semi-automatic weapon, Derwood’s creation can fire an actual magazine of ammunition—in this case 9mm rounds—ejecting spent casings one by one and loading a new round into its chamber with every trigger pull. But unlike the typical steel semi-automatic rifle, Derwood says close to “95 percent” of his creation is 3-D printed in cheap PLA plastic, from its bolt to the magazine to the upper and lower receivers that make up the gun’s body.

Here’s a video of it firing:

As the article notes, the gun isn’t perfect. The plastic around the barrel apparently starts to melt after firing 18 rounds if sufficient cooling time isn’t given. But the pace at which 3D printed firearms are evolving is staggering. In a few short years we’ve gone from the single shot Liberator pistol to a fully functional semiautomatic pistol. It won’t be long until practical 3D printed firearms are designed.

What does this mean? It means prohibitions against firearms are less relevant. Prohibiting something that any schmuck can make in their home isn’t possible. Alcohol prohibition and the current war on drugs have proven that.

Building A Mesh Network In New York City

One of the biggest weaknesses of today’s Internet is its reliance on centralized providers. Getting Internet access at home usually requires signing up with one of the few Internet service providers (ISPs) available, if you’re even lucky enough to have more than one. In my area, for example, the only real options are Comcast or CenturyLink. CenturyLink only offers digital subscriber line (DSL) service so the only actual option for me, assuming I want access speeds above 1Mbps, is Comcast. My situation isn’t unique. In fact it’s the norm.

The problems with highly centralized systems such as this are numerous, especially when you consider how cozy most ISPs are with the State. Censorship and surveillance are made much easier when a system is centralized. Instead of having to deal with a multitude of individuals to censor or surveil Internet users, the State only has to make a few sweetheart deals with the handful of ISPs. Another issue with heavily centralized systems is that users are at a severe disadvantage. The entire debate surrounding net neutrality is really only an issue because so little competition exists in the Internet provision market. If Comcast wants to block access to Netflix unless I pay an additional fee there really isn’t much I can do about it.

Many consider this nightmare proof that the market has failed. But such accusations are nonsense because the market isn’t at work here. The reason so little competition exists in the Internet provision market is because the State protects current ISPs from competition. It’s all too easy for a massive regulatory entity such as the State to put its boot down on the face of centralized service providers.

Does all this mean an uncensored, secure Internet is impossible to achieve? Not at all. The trick is to move away from easily identified centralized providers. If, for example, every Internet user were also a provider it would be practically impossible for the State to effectively control the network. That’s what mesh networks can offer and the idea is becoming more popular every day. Denizens of New York City have jumped onboard the mesh network bandwagon and are trying to make local ISPs irrelevant:

The internet may feel free, but it certainly isn’t. The only way for most people to get it is through a giant corporation like Comcast or Time Warner Cable, companies that choke your access and charge exorbitant prices.

In New York City, a group of activists and volunteers called NYC Mesh are trying to take back the internet. They’re building something called a mesh network — a makeshift system that provides internet access. Their goal is to make TWC totally irrelevant.

The hardest part about establishing a mesh network is achieving critical mass. A mesh network needs a decent number of nodes to begin being truly useful. That’s why it makes sense to start building mesh networks in very densely populated areas such as New York City. If the necessary critical mass is achieved in a few major metropolitan areas it will become feasible to bypass centralized ISPs by connecting various regional mesh networks together.

Looking at NYC Mesh’s map of active nodes it seems like they’ve already established pretty decent coverage considering the organization has only been around since January of 2014. If they can keep up this pace they could soon become a viable alternative to local centralized ISPs.

When Karma Bites You In The Ass

The National Security Agency (NSA), which is supposedly tasked with securing domestic networks in addition to exploiting foreign ones, has caused a lot of damage to overall computer security. It appears one of its efforts, inserting a backdoor into the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG) algorithm, may have bitten the State in the ass:

The government may have used compromised software for up to three years, exposing national security secrets to foreign spies, according to lawmakers and security experts.

Observers increasingly believe the software defect derived from an encryption “back door” created by the National Security Agency (NSA). Foreign hackers likely repurposed it for their own snooping needs.

[…]

The software vulnerability was spotted in December, when Juniper Networks, which makes a variety of IT products widely used in government, said it had found unauthorized code in its ScreenOS product.

[…]

The case is especially frustrating to security experts because it may have been avoidable. The hackers, they say, likely benefited from a flaw in the encryption algorithm that was inserted by the NSA.

For years, the NSA was seen as the standard-bearer on security technology, with many companies relying on the agency’s algorithms to lock down data.

But some suspected the NSA algorithms, including the one Juniper used, contained built-in vulnerabilities that could be used for surveillance purposes. Documents leaked by former NSA contractor Edward Snowden in 2013 appeared to confirm those suspicions.

Karma can be a real bitch.
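For the curious, the suspected Dual_EC_DRBG trapdoor is simple enough to sketch. The toy Python below substitutes modular exponentiation for elliptic curve points, and every number in it is made up for illustration (the real algorithm uses NIST-specified curve points P and Q), but the trapdoor works the same way: anyone who knows the secret relationship between P and Q can recover the generator’s internal state from its public output.

```python
# Toy analogue of the suspected Dual_EC_DRBG backdoor, using modular
# exponentiation instead of elliptic curve points. All numbers here are
# illustrative, not from the actual standard.

p = 2**61 - 1          # a prime modulus (hypothetical, for illustration)
Q = 5                  # public constant Q
d = 1234567            # the secret trapdoor only the designer knows
P = pow(Q, d, p)       # public constant P, secretly related to Q: P = Q^d

def dual_ec_step(state):
    """One generator step: the next state comes from P, the output from Q."""
    next_state = pow(P, state, p)
    output = pow(Q, state, p)
    return next_state, output

# A victim seeds the generator and produces two outputs.
state = 987654321
state2, out1 = dual_ec_step(state)
_, out2 = dual_ec_step(state2)

# An attacker who knows d recovers the NEXT internal state from one
# public output alone: out1^d = Q^(d*s) = P^s, which is exactly state2.
recovered_state = pow(out1, d, p)
assert recovered_state == state2

# With the internal state in hand, all future output is predictable.
_, predicted_out2 = dual_ec_step(recovered_state)
assert predicted_out2 == out2
```

In the real algorithm nobody outside the NSA knew whether a d relating the standardized P and Q existed, which is precisely why the constants were suspect.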

This story does bring up a point many people often ignore: the State relies on a great deal of commercial hardware. Its infrastructure isn’t built of custom hardware and software free of the defects agencies such as the NSA introduce into commercial products. Much of its infrastructure is built on the exact same hardware and software the rest of us use. That means, contrary to what many libertarians claim as a pathetic justification not to learn proper computer security practices, the State is just as vulnerable to many of the same issues as the rest of us and is therefore not as powerful as it seems.

Everything Is Becoming A Snitch

The Internet of Things promises many wonderful benefits but the lack of security focus guarantees there will be severe detriments. A column in the New York Times inadvertently explains how dire some of these detriments could be:

WASHINGTON — For more than two years the F.B.I. and intelligence agencies have warned that encrypted communications are creating a “going dark” crisis that will keep them from tracking terrorists and kidnappers.

Now, a study in which current and former intelligence officials participated concludes that the warning is wildly overblown, and that a raft of new technologies — like television sets with microphones and web-connected cars — are creating ample opportunities for the government to track suspects, many of them worrying.

“ ‘Going dark’ does not aptly describe the long-term landscape for government surveillance,” concludes the study, to be published Monday by the Berkman Center for Internet and Society at Harvard.

The study argues that the phrase ignores the flood of new technologies “being packed with sensors and wireless connectivity” that are expected to become the subject of court orders and subpoenas, and are already the target of the National Security Agency as it places “implants” into networks around the world to monitor communications abroad.

The products, ranging from “toasters to bedsheets, light bulbs, cameras, toothbrushes, door locks, cars, watches and other wearables,” will give the government increasing opportunities to track suspects and in many cases reconstruct communications and meetings.

Encryption is only part of the electronic security puzzle. Even if your devices properly implement encryption to secure the data they store, transmit, or receive, they may not properly enforce access credentials. Authorized users are expected to be able to access plaintext data, so the security offered by encryption can be bypassed by gaining access to an authorized user’s account.

Let’s consider the Amazon Echo. The Echo relies heavily on voice commands, which means it has a built-in microphone that’s always listening. Even if the data it transmits to and receives from Amazon is properly encrypted, an attacker who gains access to the device as an authorized user could use the microphone to record conversations. In this case cryptography hasn’t failed; the device is merely providing expected access.
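To make that concrete, here’s a minimal sketch. The device endpoint and credentials below are entirely hypothetical, not the Echo’s actual API; the point is that once an attacker holds valid credentials the device happily serves them, encrypted channel or not.

```python
# Minimal sketch of why transport encryption alone doesn't protect a
# voice-activated device. Everything here is hypothetical.

import hashlib, hmac

# Toy credential store mapping usernames to password hashes.
USERS = {"owner": hashlib.sha256(b"correct-horse").hexdigest()}

def request_microphone_feed(username, password):
    """A toy device endpoint: any caller presenting valid credentials
    gets the microphone feed. TLS (not shown) would encrypt every
    session, legitimate or not, exactly the same way."""
    stored = USERS.get(username)
    if stored is None:
        return None
    supplied = hashlib.sha256(password.encode()).hexdigest()
    if hmac.compare_digest(stored, supplied):
        return "<live microphone audio>"
    return None

# The legitimate owner gets the feed.
assert request_microphone_feed("owner", "correct-horse") == "<live microphone audio>"

# An attacker who phished or guessed the password gets the same feed;
# the cryptography protecting the connection never failed.
assert request_microphone_feed("owner", "correct-horse") == "<live microphone audio>"

# Only a wrong password is refused.
assert request_microphone_feed("owner", "wrong-guess") is None
```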

Internet of Things devices, due to the lack of security focus, often fail to enforce authorization. Some devices require no authorization at all, some have vulnerabilities that allow an unauthorized user to gain access to an authorized user’s account, some include built-in backdoor administrative accounts with hardcoded passwords, and so on. That gives the State potential access to a great many sensors in a targeted person’s household.

I’m not against the idea behind the Internet of Things per se. But I’m wary of such devices at the moment because the manufacturers are, in my opinion, being sloppy with security. In time I’m sure the hard lessons will be learned just as they were learned by operating system developers in the past. When that finally happens and I can be reasonably assured the security of my smart television isn’t nonexistent I may become more willing to buy such products.

Your Device Is A Snitch

In addition to pervasive government surveillance there is also pervasive corporate surveillance. Corporate surveillance isn’t as concerning since corporations rarely murder the people they’re surveilling but it’s also more sinister because most of the people being surveilled unwittingly agreed to it. Mobile phones are a great example of this. A lot of people, including myself, find mobile phones incredibly useful. They allow you to communicate with friends and family in almost any location, provide remote Internet connectivity, can navigate you to your destination, etc. But a side effect of the technology is that your cellular provider knows your location. In addition to that, many apps use location services provided by your phone’s operating system and hardware to pinpoint your location and report it to the developers.

Another reason corporate surveillance is sinister is because the State usually has access to the collected data either through secret agreements or warrants. Your devices may report to the developer on what you’re doing and the State may then gain access to the data to prosecute you. An example of this is a recent story of a woman who filed a rape claim that was proven to be false by data collected from her Fitbit:

In March, a Florida woman traveled to Lancaster, Pennsylvania where she stayed at her boss’s home, reports ABC 27. On a Tuesday, police were called to the home where they found overturned furniture, a knife and a bottle of vodka, according to Lancaster Online. Jeannine Risley told police she’d been sleeping and that she was woken up around midnight and sexually assaulted by a “man in his 30s, wearing boots.” However, Risley was wearing her Fitbit band at the time. She initially said that the Fitbit had been lost in the struggle, but police found it in a hallway and when they downloaded its activity, the device became a witness against her.

According to ABC 27, Risley handed the username and password for her Fitbit account over to police. What they found contradicted her account of what happened that night. Via Lancaster Online:

[A] Fitbit device Risley was wearing told a different story, the affidavit shows.

The device, which monitors a person’s activity and sleep, showed Risley was awake and walking around at the time she claimed she was sleeping.

In this case one could argue that the surveillance led to a good outcome since it busted the wearer for making a false rape accusation. But surveillance has no morality. This could just as easily be used to prosecute somebody accused of a drug crime. For example, heart rate data from a Fitbit could be used as evidence that somebody had taken a particular drug at a certain time. It could also be used, as it was in this case, to prove the person wasn’t asleep at the time they were accused of taking drugs.

I’m not going to tell you not to use these devices. They do provide a lot of desirable functionality for many people. However, they also provide some potentially negative side effects that users should be aware of. If you use these devices just make sure you understand the ramifications.

Is Your Thermostat A Snitch

As a general rule I’m a huge fan of technology. But even I have major reservations with the so-called Internet of things (really just adding a chip to devices that were previously analog). It’s not that the ideas themselves are bad but there isn’t enough attention being paid to the implementations, especially from a security and privacy standpoint.

The Nest thermostat is one of the more popular regular household devices with a chip added to it. What’s not to like about a thermostat that automatically adjusts the temperature in your home based on when you are and aren’t there? Besides that software bug that drained the battery and caused people’s furnaces to shut down. And the fact the bloody thing snitches on where your house is:

Researchers at Princeton University have found that, until recently, Alphabet’s popular Nest thermostat was leaking the zip code and location of its users over the internet. This data was transmitted unencrypted, or in the clear, meaning that anyone sniffing traffic could have intercepted it, according to the researchers.

The researchers also studied several other smart devices, including the Sharx security camera, a PixStar smart photoframe, and Samsung’s SmartThings Hub. The goal of their research wasn’t to find specific bugs in these devices, but to determine what information was being leaked when the devices communicated with their servers in the cloud.

I have no idea why a thermostat would even need to know where your house is. It needs to know the temperature inside and what you want the temperature to be so it can order your climate control system to make the two numbers match. But it apparently does have access to that information and the developers cared so little about the privacy of their customers that they not only failed to keep the data private but didn’t even bother encrypting it when it was sent. And this isn’t an isolated incident. The complete disregard for these kinds of details is plaguing the Internet of things market.
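To illustrate the point, a thermostat’s core job can be sketched in a few lines. This is a hypothetical bang-bang controller, not Nest’s actual logic, but notice that nothing in it requires a zip code:

```python
# A thermostat's core job needs only two numbers: the current indoor
# temperature and the setpoint. Hypothetical bang-bang control sketch.

def thermostat_step(current_temp, setpoint, hysteresis=0.5):
    """Return 'heat', 'cool', or 'off' based only on two temperatures.
    The hysteresis band keeps the system from rapidly cycling."""
    if current_temp < setpoint - hysteresis:
        return "heat"
    if current_temp > setpoint + hysteresis:
        return "cool"
    return "off"

assert thermostat_step(18.0, 21.0) == "heat"
assert thermostat_step(24.0, 21.0) == "cool"
assert thermostat_step(21.2, 21.0) == "off"
```

To be fair, a thermostat might want your location for weather forecasts, but that hardly justifies broadcasting it unencrypted.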

Is Your Device A Snitch

I’m convinced that one of the biggest threats to privacy is the reliance on advertisements many industries suffer from. This reliance has led to a proliferation of surveillance technology. And now that the so-called Internet of Things (IoT) is the new hot commodity we’re seeing surveillance technology embedded in more everyday things. With so many devices capable of spying on you, the next big thing in advertising has become cross-device surveillance. Bruce Schneier has an excellent article that shows just how far these advertisers are trying to go:

SilverPush is an Indian startup that’s trying to figure out all the different computing devices you own. It embeds inaudible sounds into the webpages you read and the television commercials you watch. Software secretly embedded in your computers, tablets, and smartphones pick up the signals, and then use cookies to transmit that information back to SilverPush. The result is that the company can track you across your different devices. It can correlate the television commercials you watch with the web searches you make. It can link the things you do on your tablet with the things you do on your work computer.

Your computerized things are talking about you behind your back, and for the most part you can’t stop them­ — or even learn what they’re saying.

Now white noise generators that broadcast on the frequencies used by this surveillance technology are suddenly good ideas for stocking stuffers. Without them your new smart fridge can display advertisements to you based on what your smart television told it you were watching.

Not only does this open the floodgates of privacy violations further but it also greatly increases the ability of malicious attackers. Ad networks have become major targets for malware distributors. This has created headaches for computer and smart phone users but now it could create headaches for your television, fridge, coffee maker, and even your damn doorbell. Making matters even worse is how unreliable IoT manufacturers are at both implementing and maintaining security. What happens when your smart fridge is considered out of date by the manufacturer and its software security problems are no longer fixed?

The reliance on advertising to fund so much technology is creating both a privacy and a security nightmare. And it’s only getting worse.

What’s Your Score

Police, even more so than most people, tend to be lazy. And like other lazy people police are trying to replace everything with algorithms. But there is a difference between private entities relying on algorithms and police doing so: algorithms in private hands seldom lead to people being killed. A higher death rate is the only outcome I can see coming from this:

FRESNO, Calif. — While officers raced to a recent 911 call about a man threatening his ex-girlfriend, a police operator in headquarters consulted software that scored the suspect’s potential for violence the way a bank might run a credit report.

The program scoured billions of data points, including arrest reports, property records, commercial databases, deep Web searches and the man’s social-media postings. It calculated his threat level as the highest of three color-coded scores: a bright red warning.

Algorithms that try to model human behavior are notoriously unreliable. Part of this is due to humanity’s lack of homogeneity and part of it is due to data limitations. An algorithm is only as good as the data it is fed. What data is fed into an algorithm is determined by the developers, which means the results often reflect their biases. In this case if the developers viewed gun owners as being prone to violence the algorithm would end up reflecting that.
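Here’s a toy illustration of how developer bias becomes algorithm output. Every feature, weight, and threshold below is made up (the real software is proprietary), but the mechanism is the point: whoever picks the weights picks the verdicts.

```python
# Toy threat-scoring model showing how developer-chosen features and
# weights encode bias. All features, weights, and thresholds are
# invented for illustration.

WEIGHTS = {
    "arrest_record": 3,
    "angry_social_media_post": 1,
    "firearm_purchase": 2,   # a developer's bias: gun owner == dangerous
}

def threat_score(profile):
    """Sum the weights of whichever features appear in a profile."""
    return sum(WEIGHTS[f] for f in profile if f in WEIGHTS)

def threat_level(profile):
    """Map a raw score onto the three color-coded levels."""
    score = threat_score(profile)
    return "red" if score >= 3 else "yellow" if score >= 1 else "green"

# A law-abiding hunter with one grumpy post gets flagged red purely
# because of the weight the developers assigned to gun ownership.
assert threat_level(["firearm_purchase", "angry_social_media_post"]) == "red"
assert threat_level([]) == "green"
```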

Usually we don’t pay much attention when an algorithm screws up and recommends a product we have no interest in based on our previous purchasing history. But an algorithm that tries to estimate a person’s threat level to police is going to carry much more dire consequences. There is already a chronic problem with police being too trigger happy. Imagine how much more trigger happy your average cop would be if they were told the suspect is rated high by the threat assessment algorithm. Chances are the officer will go for a shoot first and ask questions later approach.

Theoretically this type of algorithm wouldn’t have to result in such severe consequences but it is being utilized by individuals who are generally not held accountable for their actions. If an officer, for example, received notification that a suspect was rated as highly likely to be violent but knew gunning them down without cause would result in charges, they would likely act more cautiously while still not resorting to shooting without justification. But that’s not how things are, so this will likely end badly for anybody facing off with an officer employed by a department that utilizes this system.

If You Don’t Want To Be Treated Like A Criminal Don’t Buy A Blackberry

I know what you’re thinking, you weren’t planning to buy a Blackberry anyways. The company is so far behind the technological curve that it has become almost entirely irrelevant. But I know two people who purchased Blackberry phones within the last five years so I assume there may be a few other people who have been using the platform for ages and want to continue doing so. For them this post is a warning. Don’t buy a Blackberry unless you want to be treated like a criminal:

John Chen, the Blackberry chairman and CEO, is ripping Apple’s position that granting the authorities access to a suspected criminal’s mobile device would “tarnish” the iPhone maker’s image.

“We are indeed in a dark place when companies put their reputations above the greater good. At BlackBerry, we understand, arguably more than any other large tech company, the importance of our privacy commitment to product success and brand value: privacy and security form the crux of everything we do. However, our privacy commitment does not extend to criminals,” Chen wrote in a blog post titled “The encryption Debate: a Way Forward.”

What Apple has promised customers is that it is unable to gain access to user data under any circumstances. In other words Apple is promising users that it utilizes cryptography that isn’t compromised in such a way as to allow a third party access. Blackberry, on the other hand, is stating it will cooperate with law enforcement requests for user data. To do that it must utilize cryptography that is compromised in such a way as to allow third-party access. Such a scheme, even if used under the auspices of giving law enforcers access to criminals’ data, necessarily treats all users as potential criminals.
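The difference between the two designs can be sketched in a few lines of toy code. To be clear, the “cipher” here is a deliberately simplistic illustration, not real cryptography, and this is neither company’s actual implementation; it just contrasts a key that never leaves the device with one escrowed for the vendor.

```python
# Toy contrast between a device-only key and an escrowed key.
# The keystream "cipher" is for illustration only; never use it
# for real secrecy.

import hashlib, secrets

def keystream_xor(key, data):
    """Toy symmetric cipher: XOR data with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Design A: the key is derived on-device from the user's passcode.
# The vendor never sees it and cannot comply with a decryption order.
device_key = hashlib.sha256(b"user-passcode" + b"device-salt").digest()
ciphertext = keystream_xor(device_key, b"my diary")

# Design B: same encryption, but a copy of the key is wrapped under a
# vendor-held escrow key, so every user's data is reachable by a third
# party whether or not that user is ever suspected of anything.
escrow_key = secrets.token_bytes(32)
wrapped_key = keystream_xor(escrow_key, device_key)

# The vendor (or anyone who steals the escrow key) unwraps and decrypts.
vendor_recovered_key = keystream_xor(escrow_key, wrapped_key)
assert keystream_xor(vendor_recovered_key, ciphertext) == b"my diary"
```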

Furthermore, what is the “greater good”? That’s such a nonsensical term. It requires the person uttering it to be so egotistical that they believe they know what’s best for everybody. I doubt anybody has knowledge so perfect that they know what is best for all seven billion people on this planet. Realistically it’s just a euphemism for what is best for the State, which is always at odds with what is best for the individual.

You don’t have to take my word for it though. The people have a voice in this matter through the market. Anybody who truly believes Apple is being detrimental to society by not cooperating with law enforcers can buy a Blackberry device. Something tells me this statement by Chen isn’t going to cause an uptick in Blackberry sales. If anything it will likely cause a drop (if it’s even possible for Blackberry sales to drop any lower) since most people don’t seem overly enthusiastic about being spied on.

Lightbulbs With DRM Are Here

There’s a lot to love about this crazy future we live in but there are also some downright bizarre things. For example, how many of you thought your lightbulbs needed some kind of mechanism to lock you into a particular manufacturer’s bulbs? Through the wonderful world of ZigBee-enabled bulbs, Philips has made your dream a reality:

Philips just released firmware for the Philips Hue bridge that may permanently sever access to any “non-approved” ZigBee bulbs. We previously covered third party support in January 2015, when Philips indicated it was not blocked – and have since benefited.

The recent change seems to suggest any non-Philips bulbs from manufacturers such as Cree, GE, and Osram will not be supported in many situations, whereas “Friends of Hue” branded product are. At the time of publication, it’s unclear whether 3rd party bulbs will stop working immediately after the firmware update or if they may only become inaccessible after the bridge is reset. We’re also not sure if being “reset” means rebooted or factory reset. This appears to apply to both the round v1 bridge and square v2 HomeKit-compatible bridge after the latest firmware update is applied.

I’m not going to be a cranky curmudgeon and bitch about lightbulbs with new functionality. But I will bitch about how companies utilize new technology as a means of baiting and switching. Philips originally stated it would support third-party bulbs. I’m guessing the reason behind that was so it didn’t have to foot the entire bill to encourage adoption of ZigBee-enabled bulbs. Now it has changed the rules and locked out third-party manufacturers. In all likelihood this is because ZigBee-enabled bulbs are now sufficiently popular that Philips wants to enjoy all of the profits. It wouldn’t surprise me if somebody at Philips also assumed owners of third-party bulbs would rather purchase Philips’ hardware than lose the functionality offered by ZigBee-enabled bulbs.

There is an important lesson here. Never be entirely reliant on a third party for your business. If, for example, you are utilizing a third party’s software package for your hardware you should have an alternative standing by in case you’re locked out. Were I one of these third-party manufacturers I would release an open source client on GitHub that works with any ZigBee-enabled bulb.