A Geek With Guns

Chronicling the depravities of the State.

Archive for February, 2016

ATF Says Certain Medical Patients Prohibited From Owning Firearms

with one comment

Should people who require certain medications lose the right to self-defense? According to the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF) they should:

What has forged this quirky convergence of advocacy — tokers, meet shooters — is a September letter from the federal Bureau of Alcohol, Tobacco, Firearms and Explosives saying it is illegal for medical-marijuana patients to own firearms.

Everybody who buys a gun must fill out ATF Form 4473, which asks: “Are you an unlawful user of, or addicted to, marijuana or any depressant, stimulant, narcotic drug, or any other controlled substance?”

Answer yes, and you don’t get the gun. Falsely answer no, and you’ve just committed a crime.

The ATF’s letter, sent out Sept. 21, clarifies that the bureau includes medical-marijuana patients in that group of prohibited buyers because their marijuana use is inherently illegal federally.

The absurdity, of course, is that the 4473 form asks if you are an unlawful user. People who have a medical exemption card are lawfully using cannabis and therefore should not be prohibited by law.

More importantly, the fact that somebody can lose the right to defend themselves because they need cannabis is ridiculous. Cannabis is far safer than most other drugs, including alcohol, which you can legally use while owning a firearm and which is responsible for a great many poor life choices.

There’s no valid reason to prohibit somebody from owning firearms just because they use certain drugs. So long as people don’t handle their firearms while under the influence, there is no real danger. And many drugs have no side effects that make firearm usage dangerous to users or bystanders.

This is yet another example of a policy put forth by the ATF that demonstrates the agency is interested in restricting firearm ownership.

Written by Christopher Burg

February 29th, 2016 at 10:30 am

Monday Metal: Sonsii By Nine Treasures

without comments

For this week I’ve selected some metal from Mongolia. The band is Nine Treasures, a Mongolian folk metal band:

Written by Christopher Burg

February 29th, 2016 at 10:00 am

Posted in Media


When Idiots Write About Computer Security

with one comment

People trying to justify the Federal Bureau of Investigation’s (FBI) demands of Apple are possibly the most amusing part of the agency’s recent battle with the company. Siding with the FBI requires either being completely ignorant of security or being so worshipful of the State that you believe any compromise made in the name of empowering it is justified.

A friend of mine posted an article that tries to justify the FBI’s demands by claiming Apple is spreading fear, uncertainty, and doubt (FUD). Ironically, the article itself is FUD. In fact, it’s quite clear that the author has little to no understanding of security:

In its campaign, Apple is mustering all the fear, uncertainty and doubt it can. In an open letter to its customers, it states that “the government would have us write an entirely new operating system for their use. They are asking Apple to remove security features and add a new ability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. … It would be wrong to intentionally weaken our products with a government-ordered backdoor.” The FUD factor in that statement is “weaken our products.” It is grossly misleading, the plural suggesting that the FBI wants Apple to make this back door a standard part of iPhones. That’s flat-out false. What the government has asked is that Apple modify software to remove a feature that was not present in earlier versions of the software, and then install that new software on the single phone used by the terrorist. Apple can then destroy the software.

Apple’s statement is entirely accurate. The FBI is demanding a signed version of iOS that removes security features and includes a mechanism to brute force the passcode used to encrypt the contents of the device. Because the firmware would be signed, it could be loaded onto other iPhones. We also know the FBI has about a dozen more phones it wants Apple to unlock, so this case isn’t about a single phone. This case is about setting a precedent that will make it easier for the State to coerce companies into bypassing the security features of their own products.
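To see why that brute-force mechanism matters, a back-of-envelope estimate helps. Apple’s iOS security documentation puts the hardware key-derivation cost at roughly 80 milliseconds per passcode attempt; with the escalating delays and auto-wipe removed, as the FBI demands, that cost is all that remains. The numbers below are illustrative assumptions, not measurements:

```python
# Back-of-envelope estimate of how quickly a passcode could be brute
# forced once the retry delays and auto-wipe are stripped out, leaving
# only the ~80 ms per-attempt cost of the iPhone's key-derivation
# hardware (the 80 ms figure is from Apple's iOS security docs; the
# rest is an illustrative assumption).

ATTEMPT_SECONDS = 0.08  # ~80 ms per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Worst-case hours to try every numeric passcode of a given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")
```

A four-digit passcode falls in minutes under these assumptions, which is exactly why the delay and wipe features the FBI wants removed exist in the first place.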

The claim that Apple can destroy the software is also naive. In order to unlock the device the software must be loaded onto the phone. Since the phone is evidence, it must be returned to the FBI. That means the FBI will have a signed copy of the custom firmware sitting on an unlocked phone, making it feasible for the FBI to extract the firmware. Furthermore, the process of writing software for a court case will likely involve several third parties receiving access to the firmware:

Once the tool is ready, it must be tested and validated by a third party. In this case, it would be NIST/NIJ (which is where my own tools were validated). NIST has a mobile forensics testing and validation process by which Apple would need to provide a copy of the tool (which would have to work on all of their test devices) for NIST to verify.

[…]

During trial, the court will want to see what kind of scientific peer review the tool has had; if it is not validated by NIST or some other third party, or has no acceptance in the scientific community, the tool and any evidence gathered by it could be rejected.

[…]

If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.

It will likely be impossible for Apple to maintain exclusive control over the firmware.

Once the genie is out of the bottle it can’t be put back in. This is especially true with software, since it can be reproduced almost infinitely at a cost so small it’s practically free. If Apple produces this firmware it will not be able to make it not exist afterward. Let’s continue with the article in question:

More contradictory to Apple’s claims is that the FBI has specifically stated that it does not intend to cause a weakening of the consumer product, so this case cannot be used as a precedent. Should the government at any time attempt to do that so that back doors to be embedded in products, its own words would be the most compelling argument to counter that.

The FBI claims a lot of things. That doesn’t make those claims true. By merely existing, this firmware would make consumer products less secure. Currently the iPhone’s security is quite strong, as demonstrated by the fact that the FBI has been unable to break into the dozen or so phones in its possession. If Apple releases a firmware that can bypass security features on iPhones, it necessarily means the overall security of iPhones, which are consumer products, is weakened. There is no way to logically argue otherwise. When something that couldn’t be broken into can be broken into, it is less secure than it was. The fact that I felt the need to write the previous sentence causes me great pain because it speaks so ill of the author’s education.

The FUD continues, with Apple saying, “Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.” That might very well be the case. But it has zero relevance. Each of those cases could be resolved only with a court order of its own, regardless of what happens with the San Bernardino iPhone. Even if this case were not in front of the court at the moment, any state, local or federal law enforcement agency could bring a similar case forward.

Actually, it’s entirely relevant. The FBI wants the court precedent so prosecutors in other cases can compel companies to bypass security features on their products. Apple isn’t simply fighting the creation of a purposely broken firmware, it’s fighting a precedent that would allow other courts to coerce companies into performing labor against their will. Obviously the author’s understanding of the legal system, specifically how precedent works, is as lacking as his understanding of security.

Gaining access to locked data is a legitimate law enforcement issue, and whatever your personal beliefs, all law enforcement officers have a responsibility to attempt to collect all information that is legally possible to collect.

While law enforcers may have a responsibility to attempt to collect all information within their power to collect that doesn’t mean they should be able to compel others to assist them at the point of a gun.

In other forums, Apple has been claiming that if the U.S. requires Apple to cooperate in providing access to the phone, all other governments around the world will then expect the same sort of cooperation. It is a bogus claim — more FUD. Do Apple’s lawyers really not know that the law of one country does not apply to another? Apple’s winning its case in the U.S. would do nothing to stop another country from initiating a similar action. Its losing its case should have no influence on whether other countries decide to pursue such matters.

I see the author doesn’t pay attention to world events. Oftentimes when a government sees another government get away with something nasty, it decides it can also get away with it. Take Blackberry, for example. India demanded that Blackberry give it access to a backdoor and Blackberry complied. Seeing India get what it wanted, the government of Pakistan demanded the same. Monkey see, monkey do. It should be noted that Blackberry ultimately chose to leave Pakistan rather than comply with the backdoor demands.

Apple knows that if it rolls over it will encourage other governments to demand the same as the FBI. If, however, it digs in its heels, it will discourage other governments from making such demands. This is the same principle as not negotiating with terrorists: give in once and you encourage others to pull the same shit on you.

But of all of Apple’s arguments, the one that is most ludicrous, or perhaps the most damning of its much-touted security prowess, is revealed in this response to the government’s request for a key that could unlock one phone:

“Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”

First, Apple is already relentlessly attacked by hackers and criminals. I would like to hope that Apple has better security practices than the IRS. But when you unpack this statement, you are left with the impression that we should not trust any of Apple’s software or products. You have to assume that, should Apple write the software that the FBI wants, it would be among the most protected software in the company. If Apple is concerned about this software being compromised, what does that say about all of its other software?

This is another claim that can only be made by somebody who doesn’t understand security. This firmware wouldn’t be entirely in Apple’s hands. As noted above, the FBI would possess a phone with the firmware installed on it. And anybody who has paid attention to the various congressional hearings on the numerous federal network breaches knows the federal government’s network is incapable of protecting anything of value.

This firmware isn’t like a private key, which can serve its purpose even if you keep it within your exclusive control. It’s a piece of software that must be loaded onto a device that is evidence in a crime, which necessarily means it must leave your exclusive control. So Apple’s security isn’t the only cause for concern here.

Even assuming that a bad guy gets hold of just the software that law enforcement wants created, it would have to be signed by Apple’s security certificate to load on any phone.

Which the copy on the phone and any copies sent out for independent testing would be.

If the criminal gets a copy of the software and it has already been signed with the certificate, Apple could revoke the certificate.

If the author had read the Electronic Frontier Foundation’s (EFF) excellent technical overview of this case, he would know that the public key is built into the hardware of the iPhone. This is actually a smart security practice because it prevents malware from replacing the public key; if the public key could be replaced, malware could load its own code. The downside is that Apple can’t revoke the public key to prevent software signed with the corresponding private key from loading.

But if a bad guy gets hold of Apple’s digital certificate, then the whole Apple software base is at risk, and this feature that the FBI wants bypassed is irrelevant. After all, Apple has stated that it is not immune from attack, and it has implied it is a reasonable concern that its most protected software can be compromised.

I’m going to take this opportunity to write about a specific feature of public key cryptography that is relevant here. Public key cryptography relies on two keys: a private key and a public key. The private key, as the name implies, can be kept private. Anything signed with the private key can be verified by the public key. Because of this you only need to hand out the public key.

I have a Pretty Good Privacy (PGP) key that I use to encrypt and sign e-mails. Anybody with my public key can validate my signature but they cannot sign an e-mail as me. If, however, they had my private key they could sign e-mails as me. Because of this I keep my private key very secure. Apple likely keeps its software signing key in a vault on storage media that is only ever connected to a secure computer that has no network connectivity. Under such circumstances an attacker with access to Apple’s network would still be unable to access the company’s software signing key. For reasons I stated earlier, that’s not a model Apple can follow with the firmware the FBI is demanding. Apple’s security concerns in this case are entirely unrelated to the security practices of its private key.
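To make that asymmetry concrete, here is a toy sketch of the idea using textbook RSA with deliberately tiny numbers. Real-world signing uses enormous keys, padding, and hashing; this only illustrates why holding the public key lets anyone verify a signature but never forge one:

```python
# Toy textbook RSA: sign with the private exponent, verify with the
# public one. The numbers are absurdly small on purpose; do not use
# anything like this for real cryptography.

p, q = 61, 53                       # two small primes (far too small for real use)
n = p * q                           # modulus, shared by both keys
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept secret

def sign(message: int, private_d: int) -> int:
    # Only the holder of d can compute this.
    return pow(message, private_d, n)

def verify(message: int, signature: int) -> bool:
    # Anyone holding the public pair (n, e) can check it.
    return pow(signature, e, n) == message

sig = sign(42, d)
print(verify(42, sig))   # the signature matches the signed message
print(verify(43, sig))   # and fails against any other message
```

This is exactly the property Apple relies on: the iPhone only needs the public half baked into its hardware to verify firmware, while the signing half can stay locked away.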

In addition to his technical incompetence, the author decided to display his argumentative incompetence by closing his article with a pretty pathetic ad hominem:

But Apple, seeming to take a page from Donald Trump’s presidential campaign, is using the situation to promote its brand with free advertising.

If all else fails in your argument just compare your opponent to Trump.

Written by Christopher Burg

February 26th, 2016 at 11:00 am

Apple Gives The Feds Another Middle Finger

with one comment


Me right now.

A lot of people are claiming Apple’s decision to fight the Federal Bureau of Investigation (FBI) is nothing more than a marketing play. But I swear that I can hear Tim Cook yelling, “Fuck the police!” because his company keeps making announcements that it’s going to make its products more secure:

WASHINGTON — Apple engineers have begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.

[…]

The company first raised the prospect of a security update last week in a phone call with reporters, who asked why the company would allow firmware — the software at the heart of the iPhone — to be modified without requiring a user password.

One senior executive, speaking on the condition of anonymity, replied that it was safe to bet that security would continue to improve. Separately, a person close to the company, who also spoke on the condition of anonymity, confirmed this week that Apple engineers had begun work on a solution even before the San Bernardino attack. A company spokeswoman declined to comment on what she called rumors and speculation.

Independent experts say they have held informal conversations with Apple engineers over the last week about the vulnerability. Exactly how Apple will address the issue is unclear. Security experts who have been studying Apple’s phone security say it is technically possible to fix.

In addition to senior executives talking about upcoming security enhancements, Apple has also added an interesting figure to its payroll:

Frederic Jacobs, for those who don’t know, was one of the developers of the iOS version of Signal, the secure messaging application created by Open Whisper Systems that I highly recommend.

It seems to me that Apple is doing more than marketing here. The company seems dedicated to offering a secure product to its customers. My greatest hope is that this encourages other companies to follow suit.

Written by Christopher Burg

February 26th, 2016 at 10:30 am

Los Angeles Teaching Homeless People To Not Be Homeless By Stealing Their Homes

without comments

Governments hate the homeless. Some people find this surprising, but only because they don’t understand the nature of the State. The State exists on and for plunder. Every law it creates is created to further its plundering. That being the case, people who have nothing to take are effectively worthless to the State. Because rounding them up and killing them wouldn’t go over well with the general populace, local municipalities have opted for another solution to their homeless “problem”: they try to make the lives of homeless individuals so miserable that they go elsewhere and become another municipality’s problem.

Los Angeles may be sinking to a new low in this endeavor though. Recently city officials have begun teaching the homeless a lesson about being homeless by taking their homes:

Escalating their battle to stamp out an unprecedented spread of street encampments, city officials have begun seizing tiny houses from homeless people living on freeway overpasses in South Los Angeles.

Three of the gaily painted wooden houses, which come with solar-powered lights and American flags, were confiscated earlier this month and seven more are planned for impound Thursday, a Bureau of Sanitation spokeswoman said.

As is always the case in these situations, city officials are citing their own bureaucratic nonsense; these thefts are being perpetrated under the guise of sanitation. City officials also, as always, claimed to be offering a better solution without actually providing one:

Mayor Eric Garcetti’s spokeswoman, Connie Llanos, said he is committed to getting homeless people into permanent and not makeshift housing.

“Unfortunately, these structures can be hazardous to the individuals living in them and to the community at large,” Llanos said in a statement on the mayor’s behalf.

“When the city took the houses, they didn’t offer housing, they straight kicked them out,” Summers said.

What Mr. Garcetti means by permanent housing is getting the homeless out of the city so they’re no longer his problem. Maybe the homeless population of Los Angeles should consider seizing some of the government’s buildings. They’re technically unowned (since the State, being a criminal organization that has acquired everything in its possession through theft, cannot legitimately own property) and would provide permanent housing.

Written by Christopher Burg

February 26th, 2016 at 10:00 am

Due Process Was A Pain In The Ass Anyways

with one comment

I like to believe that a previous age existed where due process was valued. If such an age existed, it is obviously long since gone. More and more people seem willing to toss due process aside whenever it negatively impacts their ideological opposites. Throwing out due process is done in many ways. Some are as blatant as denying people rights based on where they were born. Others are more subtle, such as creating a new permit requirement in order to punish an unrelated action:

As it stands, cops who suspect someone of prostitution must actually prove it before arresting them. But that’s a lot of work. So Eau Claire, Wisconsin, officials have a new plan: make non-sexual commercial companionship illegal without the proper paperwork.

To this end, the Eau Claire City Council is considering an ordinance that would require anyone advertising as an escort to get an occupational license from the government.

[…]

But because being an escort does not necessarily mean one is engaged in prostitution, police can’t just go around arresting anyone who advertises as an escort. Not yet, anyway. Ostensibly, cops must still interact with the individual and get them to agree to some sort of sexual activity for a fee. As Eau Claire Assistant City Attorney Douglas Hoffer put it, police are forced to do “intensive investigations” and get their targets to use “explicit language” in order to make charges stick.

Now city officials want to change that. Under their proposed legislation, escorts and escort businesses would have to be licensed by the city and subject to extensive regulations. Any escort operating without a license would be subject to a fine of up to $5,000.

But that’s not all: the proposed law would also punish customers who contract with unlicensed escorts. Hoffner said the idea is to end “demand” for prostitution. Anyone attempting to hire an unlicensed escort could also be charged up to $5,000, as well.

As the article notes, police cannot go after every escort business because many aren’t offering illegal services. This means the police have to effectively create a case with a sting operation or find evidence that a law was broken (but not evidence of a crime being committed, since crimes require victims and voluntary prostitution involves no victims). When situations like this arise, it’s common for the local authorities to create some kind of permit requirement.

With permit requirements in place, a police officer can arrest an escort and their customer on the grounds of the escort not having the proper paperwork. It’s a much easier violation to prove than prostitution. In fact, the Eau Claire City Attorney admitted exactly that:

Said Eau Claire City Attorney Stephen Nick: “This is another means, as opposed to actually having evidence of an act of prostitution, pandering, or offering a sexual act for money, so we can follow up” on sex-work suspects.

Cases like this should receive more publicity. It’s not enough for people to only get up in arms over overt violations of due process. People must learn about the more subtle methods as well.

Written by Christopher Burg

February 25th, 2016 at 11:00 am

Backup Locally

with one comment

There is no cloud; there are only other people’s computers. This is a phrase you should have tattooed on the inside of your eyelids so you can contemplate it every night. It seems like every company is pushing people to store their data in “the cloud.” Apple has iCloud, Google has its Cloud Platform, Microsoft has Azure, and so on. While backing up to “the cloud” is convenient, it also means your data is sitting on somebody else’s computer. In all likelihood that data was uploaded in plaintext as well, so it’s readable by the owner of the server.

I have good news though! You don’t have to upload your data to somebody else’s computer! If you use an iPhone it’s actually very easy to make local backups:

If you’re looking for comprehensive privacy, including protection from law enforcement entities, there’s still a loophole here: iCloud. Apple encourages the use of this service on every iPhone, iPad, and iPod Touch that it sells, and when you do use the service, it backs up your device every time you plug it into its power adapter within range of a known Wi-Fi network. iCloud backups are comprehensive in a way that Android backups still aren’t, and if you’ve been following the San Bernardino case closely, you know that Apple’s own legal process guidelines (PDF) say that the company can hand iMessages, SMS/MMS messages, photos, app data, and voicemail over to law enforcement in the form of an iOS device backup (though some reports claim that Apple wants to strengthen the encryption on iCloud backups, removing the company’s ability to hand the data over to law enforcement).

For most users, this will never be a problem, and the convenience of iCloud backups and easy preservation of your data far outweigh any risks. For people who prefer full control over their data, the easiest option is to stop using iCloud and use iTunes instead. This, too, is not news, and in some ways is a regression to the days before iOS 5 when you needed to use a computer to activate, update, and back up your phone at all. But there are multiple benefits to doing local backups, so while the topic is on everyone’s mind we’ll show you how to do it (in case you don’t know) and what you get from it (in case you don’t know everything).

I back up my iPhone locally and you should too. My local backups are encrypted by iTunes and stored on fully encrypted hard drives, a strategy I also encourage you to follow. Besides enhancing privacy by keeping my data away from Apple and any court orders it receives, this setup also prevents my data from being obtained if Apple’s iCloud servers are breached (which has happened).

iPhones aren’t the only devices that can be backed up locally. Most modern operating systems have built-in backup tools that clone data to external hard drives. In my opinion these are far superior to “cloud” backup services. If you back up to fully encrypted hard drives, you ensure that your data isn’t easily accessible to unauthorized parties. And you can store some of your encrypted backup drives offsite, say at your parents’ house or place of work, to ensure everything isn’t lost if your house burns to the ground.

Don’t rely on other people’s computers.

Written by Christopher Burg

February 25th, 2016 at 10:30 am

When Your Employer Hears About The Concept Of Defense In Depth

without comments

What happens when your employer first hears about the concept of defense in depth but knows jack shit about firearms? This:

After each employee at Lance Toland Associates gets their license, Toland presents them with a gun known as the judge. He says it is one of the most effective self-defense weapons and all his aviation insurance agencies carry them openly in the office.

“Everybody has one of these in their drawer or on their person. I would not want to come into one of my facilities,” Toland said. “It’s a 5 shot .410, just like a shotgun and you call it hand cannon.”

Having armed employees is a great way to bolster the physical security of your workplace. But the Taurus Judge is not a good weapon to arm employees with. It is ridiculously large, only holds five shots, and takes much longer to reload than a semiautomatic handgun. “But, Chris,” I hear you saying, “it shoots both .410 shotgun shells and .45 Colt!” To that I will point out that better guns are available for both. In addition, the Taurus has a rifled barrel, which causes shot to fly out in a doughnut pattern.

This is one of those stories where I really want to give the employer credit for thinking about the security of his employees but find myself having to shake my head because he chose a firearm based on Hollywood specifications (it looks scary) instead of effective specifications (such as a 9mm semiautomatic handgun). Granted, a Judge is better than nothing, but if you’re going to encourage your employees to carry a firearm you should take the extra step of equipping them with something genuinely effective, not merely better than nothing.

Written by Christopher Burg

February 25th, 2016 at 10:00 am

Posted in Gun Rights


It’s Not Just One iPhone The FBI Wants Unlocked

with 2 comments

There are people siding with the Federal Bureau of Investigation (FBI) in its current court battle with Apple. These misguided souls are claiming, amongst other nonsensical things, that the FBI only wants a single iPhone unlocked. They believe that it’s somehow OK for Apple to open Pandora’s box by releasing a signed firmware with a backdoor in it so long as it’s only for unlocking a single iPhone. Unfortunately, as those of us siding with Apple have been pointing out, this case isn’t about a single iPhone. The FBI wants a court precedent so it can coerce Apple into unlocking other iPhones:

In addition to the iPhone used by one of the San Bernardino shooters, the US government is pursuing court orders to force Apple to help bypass the security passcodes of “about a dozen” other iPhones, the Wall Street Journal reports. The other cases don’t involve terror charges, the Journal’s sources say, but prosecutors involved have also sought to use the same 220-year-old law — the All Writs Act of 1789 — to access the phones in question.

By setting a precedent in the San Bernardino case the FBI would have grounds to coerce Apple, and other device manufacturers, into unlocking other devices. We know the FBI already has a dozen or so phones in the pipeline and it will certainly have more in the coming years.

Besides the precedent there is also the problem of the firmware itself. If Apple creates a signed firmware that disables iOS security features and automates brute forcing passcodes, it could be installed on other iPhones (at least other iPhone 5Cs, but possibly other models). With this firmware in hand the FBI wouldn’t even need to coerce Apple into helping each time; the agency could simply install the firmware on any compatible devices itself. This is why Apple believes creating such a firmware is too dangerous.

You can never believe the government when it claims to be taking an exceptional measure just once. Those exceptional measures always become standard practice.

Written by Christopher Burg

February 24th, 2016 at 11:00 am

The Abysmal State Of Credit Card Security

with one comment

Credit card fraud is a major problem. This isn’t surprising since, until recently, credit cards here in the United States included no security at all. Hoping to reduce fraud, the credit card companies developed the Europay, Mastercard, and Visa (EMV) standard. Cards that comply with the EMV standard include a chip, which offers some security. But here in the United States two setbacks have prevented EMV from delivering better credit card security. First, the United States is adopting chip and signature, not chip and PIN. Second, most merchants still aren’t equipped to process EMV credit cards:

This week a management consulting company called The Strawhecker Group (TSG) released the results of a study that found that only 37 percent of US retailers were ready to process chip-embedded credit and debit cards. The slow adoption of chip-embedded cards leaves merchants open to accepting liability for fraud perpetrated with traditional, less-secure magnetic stripe cards.

I attribute this low adoption rate to the credit card companies failing to set a hard cutoff date for magnetic strips. Even if you get an EMV card it will contain an insecure magnetic strip so it can be used at merchants that aren’t set up to process EMV cards. Since all EMV cards are equipped with magnetic strips, merchants aren’t motivated to get set up to process EMV cards.
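The security gap between the two technologies is easy to sketch. A magnetic strip hands over the same static data on every swipe, so a recorded swipe can simply be replayed; an EMV chip instead answers a fresh challenge from the terminal using a key that never leaves the card. The following is a loose conceptual illustration of that difference, not the actual EMV protocol, and the card data is made up:

```python
import hashlib
import hmac
import secrets

CARD_KEY = secrets.token_bytes(16)               # secret held inside the chip (illustrative)
TRACK_DATA = b"4111111111111111|12/19|BURG"      # static, made-up magstripe contents

def magstripe_swipe() -> bytes:
    # The strip hands over the same bytes every time -- trivially cloneable.
    return TRACK_DATA

def chip_transaction(terminal_challenge: bytes) -> bytes:
    # The chip MACs a fresh challenge with a key that never leaves the
    # card, so a recorded response is useless for the next transaction.
    return hmac.new(CARD_KEY, terminal_challenge, hashlib.sha256).digest()

# A replayed swipe is indistinguishable from the real thing...
assert magstripe_swipe() == magstripe_swipe()

# ...but a recorded chip response fails against the next challenge.
assert chip_transaction(b"challenge-1") != chip_transaction(b"challenge-2")
```

As long as the static strip remains on the card and accepted at the register, that stronger per-transaction authentication does nothing, which is exactly the cutoff-date problem described above.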

When it comes to security, hard cutoff dates are necessary. Without them, users of the old insecure standard see no reason to upgrade. With them, users grumble about having to upgrade but begrudgingly do it out of necessity. Credit card companies need to set a date and tell merchants that after it magnetic swipe transactions will be declined; otherwise we’ll never get over this financial fraud fuckery.

Written by Christopher Burg

February 24th, 2016 at 10:30 am