Should security vulnerabilities be disclosed? What if they could be used to kill somebody? That’s a question Robert Graham recently asked on his blog:
Historically, we’ve dealt with vendor unresponsiveness through the process of “full disclosure”. If a vendor was unresponsive after we gave them a chance to first fix the bug, we simply published the bug (“drop 0day”), either on a mailing list, or during a talk at a hacker convention like DefCon. Only after full disclosure does the company take the problem seriously and fix it.
So let’s say I’ve found a pacemaker with an obvious BlueTooth backdoor that allows me to kill a person, and a year after notifying the vendor, they still ignore the problem, continuing to ship vulnerable pacemakers to customers. What should I do? If I do nothing, more and more such pacemakers will ship, endangering more lives. If I disclose the bug, then hackers may use it to kill some people.
The problem is that dropping a pacemaker 0day is so horrific that most people would readily agree it should be outlawed. But, at the same time, without the threat of 0day, vendors will ignore the problem.
As the article explains, the lack of vendor responsiveness is a major problem in the computer security field. Vendors often have the attitude that if a vulnerability isn’t widely known then it’s not dangerous. Of course, they never stop to consider that the person reporting the vulnerability found it, so in all likelihood other people have found it, or will find it, as well. That lack of forethought leads them to ignore the problem, which ensures more people receive the vulnerable devices.
In this debate I’m a firm believer in what Graham refers to as coder’s rights. It’s unfortunate, but often the only way to get a company to address a major security vulnerability is to attack its bottom line. The fact is that any vulnerability in a medical device that could lead to a human death would absolutely destroy the manufacturer’s reputation. Impending lawsuits would also do some financial damage.
Additionally, concealing the vulnerability will often allow product sales to continue, which means a continuously growing number of people at risk of being killed by an exploit. By going public with the exploit, the amount of potential damage can be limited.
But regardless of which side you take, this debate is an interesting one.