In a recent blog post, Microsoft argued that the exploitation of a Windows XP vulnerability stolen from the NSA and released by the Shadow Brokers has caused widespread public damage, and that the lesson governments should learn from this incident is that stockpiling vulnerabilities that might be inadvertently revealed presents a hazard to safe computing around the world.
It’s certainly fair to suggest that government stockpiling of vulnerabilities poses a downside risk to safe computing, and that this risk needs to be taken into account in deciding whether or not to reveal vulnerabilities to vendors so that they can be fixed. But the Microsoft statement implies that the only reasonable outcome of such a decision is to reveal vulnerabilities, and that doesn’t follow at all. The downside risk does add some weight to that side of the argument, but in any given instance, it might not be sufficient to tilt the scale in that direction, depending on the weight on the other side.
Moreover, as the blog post states, Microsoft issued a fix for the vulnerability in question in March—a month before the Shadow Brokers released it. Good cyber hygiene dictates that patches be applied when they are made available, and WannaCrypt struck two months after that patch was issued. If I don’t wash my hands before eating and I get sick, the microbes in the environment are indeed the proximate cause. But if I have a long history of not washing my hands before eating and not getting sick, that just means I’ve been lucky, not that I have microbial immunity. I should wash my hands before every meal unless there’s some very good reason not to, and system administrators should patch their systems when patches are available unless there’s some very good reason not to.
And finally, Windows XP has been supplanted by Windows 7 and Windows 10. Old systems are more vulnerable than newer systems, and administrators who are trying to save on costs by not moving to newer systems will usually run greater risks of compromise.
Does NSA bear any responsibility for the outbreak of WannaCrypt through its stockpiling of some vulnerabilities that were subsequently revealed? Sure, in the sense that if it had refrained from obtaining vulnerabilities at all, there would have been nothing to release, and the WannaCrypt creators would not have had the Shadow Brokers dump as a resource. (Of course, we don’t actually know that those creators used the Shadow Brokers dump. That’s an assumption I happen to believe, but it is also possible that they would have discovered the vulnerability independently; after all, Microsoft apparently did as well, though see the footnote below.)
But one could argue just as well that GitHub—the distribution channel for the Shadow Brokers—was equally responsible for making the vulnerability and exploit code widely available. So why isn’t anyone complaining about GitHub’s actions in this regard? At the very least, both entities share some degree of responsibility: NSA for allowing the vulnerability to be leaked, and GitHub for publicizing it.
Microsoft has been advocating since February 2017 that governments commit to disclosing all vulnerabilities. Different people will find the arguments for this idea more or less persuasive depending on their own analysis (I’m personally not persuaded), but in my view, the WannaCrypt incident does not significantly strengthen those arguments, as the blog post suggests it does.
Footnote: If Microsoft did have advance notice of the vulnerability from NSA that enabled it to fix the problem before the Shadow Brokers dump, that fact would indicate that NSA did reveal the vulnerability to the vendor, albeit possibly with some delay after acquiring it. Nick Weaver makes the same point in his recent post.
UPDATE, May 21, 2017: According to a recent report in Ars Technica, Kaspersky Lab has found that WannaCrypt was most effective against Windows 7 machines, not Windows XP (as my original post above suggests). Even if this report is true, however, it doesn’t negate the points I made in my post: keeping systems updated is an essential element of good cyber hygiene, and more recent operating systems are generally more secure than older ones. Parties that don’t follow these good practices should have very good reasons for accepting the resulting downside security risks.