
Former NSA Official Argues The Real Problem With Undisclosed Exploits Is Careless End Users

by Tim Cushing

As leaked NSA software exploits have been redeployed to cause computer-based misery all over the world, the discussion about vulnerability disclosures has become louder. The argument for secrecy is based on the assumption that fighting an existential threat (terrorism, but likely also a variety of normal criminal behavior) outweighs concerns the general public might have about the security of their software/data/personal information. Plenty of recent real-world examples (hospital systems ransomed! etc.) do the arguing for those seeking expanded disclosure of vulnerabilities and exploits.

Former Deputy Director of the NSA Rick Ledgett appears on the pages of Lawfare to argue against disclosure, just as one would have gathered by reading his brief author bio. Ledgett's arguments, however, feel more like dodges. First off, Ledgett says the NSA shouldn't have to disclose every vulnerability/exploit it has in its arsenal, an argument very few on the other side of the issue are actually making. Then he says arguments against exploit hoarding "oversimplify" the issue.

The WannaCry and Petya malware, both of which are partially based on hacking tools allegedly developed by the National Security Agency, have revived calls for the U.S. government to release all vulnerabilities that it holds. Proponents argue that this would allow patches to be developed, which in turn would help ensure that networks are secure. On its face, this argument might seem to make sense -- but it is a gross oversimplification of the problem, one that not only would not have the desired effect but that also would be dangerous.

At this point, you'd expect Ledgett to perform some de-simplification. Instead, the post detours for a bit to do some victim-blaming. It's not the NSA's fault if undisclosed exploits wreak worldwide havoc. It's the end users who are the problem -- the ones who (for various reasons) use outdated system software or don't keep current with patches. This isn't a good argument to make for the very reasons outlined in Ledgett's opening paragraph: software vendors can't patch flaws they're unaware of. This is where disclosure would help protect more users, even if it meant the loss of some surveillance intercepts.

Then Ledgett argues the NSA's leaked exploits weren't really the problem. If they hadn't been available, the malware purveyors just would have used something else.

The actors behind WannaCry and Petya, believed by some to be from North Korea and Russia, respectively, had specific goals when they unleashed their attacks. WannaCry seemed to be straightforward but poorly executed ransomware, while Petya appeared to have a more sinister, destructive purpose, especially in the early Ukraine-based infection vector. Those actors probably would have used whatever tools were available to achieve their goals; had those specific vulnerabilities not been known, they would have used others. The primary damage caused by Petya resulted from credential theft, not an exploit.

This is undoubtedly true. Bad actors use whatever tools help them achieve their ends. It's just that these specific cases -- the cases used by Ledgett to argue against increased disclosure -- were based on NSA exploits vendors hadn't been informed of yet. The patches that addressed more current vulnerabilities weren't issued until after the NSA told Microsoft about them, and it only did that because its toolset was no longer under its control.

Ledgett also points out that the NSA does better than most state entities in terms of disclosure:

Most of the vulnerabilities discovered by the U.S. government are disclosed, and at the National Security Agency the percentage of vulnerabilities disclosed to relevant companies has historically been over 90 percent. This is atypical, as most world governments do not disclose the vulnerabilities they find.

Maybe so, but there's not much honor in merely being better than the worst governments. Ledgett only says the NSA is better than "most." This doesn't turn the NSA into a beacon of surveillance state forthrightness. All it does is place it above governments less concerned about the security and wellbeing of their citizens.

Ledgett then goes back to the well, claiming a) the two recent attacks had nothing to do with the NSA, and b) disclosing vulnerabilities would make the NSA less effective.

WannaCry and Petya exploited flaws in software that had either been corrected or superseded, on networks that had not been patched or updated, by actors operating illegally. The idea that these problems would be solved by the U.S. government disclosing any vulnerabilities in its possession is at best naive and at worst dangerous. Such disclosure would be tantamount to unilateral disarmament in an area where the U.S. cannot afford to be unarmed... Neither our allies nor our adversaries would give away the vulnerabilities in their possession, and our doing so would probably cause those allies to seriously question our ability to be trusted with sensitive sources and methods.

The problem here is that Ledgett ignores the obvious: leaked NSA tools helped create the problem. The NSA never disclosed these vulnerabilities to affected software vendors -- at least not until it became obvious it could no longer keep these tools secret.

I'm guessing the NSA is already living through the last part of Ledgett's paragraph. A set of effective, still-undisclosed vulnerabilities being digitally spirited away and dumped into the public's lap probably makes it less likely foreign surveillance partners will be sharing their malware toolkits with the NSA.

This leads right into another argument against vulnerability hoarding: it has been shown with complete clarity that the NSA can't guarantee its exploits will never be used by criminals and malicious governments. The leak of its toolkit shows any suggestion that only the "good guys" will have access to undisclosed vulnerabilities is both ignorant and arrogant. The NSA isn't untouchable. Neither are all the surveillance partners the NSA has shared its tools with.

In the end, it's the private sector's fault, according to Ledgett. The solution is for vendors to write better software and end users to patch more frequently. This is good advice, but not an absolution of the NSA's vulnerability secrecy.

The NSA needs to do a better job of balancing its needs against the security of the general public. Very few people are arguing the NSA should have zero undisclosed exploits. But the exploits dumped by the Shadow Brokers affected older versions of Microsoft system software dating back to Windows XP, and they still weren't patched until the exploits had already been made public. These were exploits some in the NSA thought were too powerful, and yet the NSA did nothing until the malware offspring of its secret exploit stash were taking down systems all over the world.


