Rules of the Road: The Need for Vulnerabilities Equities Legislation

By Sharon Bradford Franklin, Andi Wilson Thompson
Wednesday, November 22, 2017, 7:00 AM

When the government discovers a bug in computer hardware or software, should it immediately inform the manufacturer so the company can create a patch and protect its customers' cybersecurity? And when should the government be permitted to keep the information to itself and exploit the vulnerability to hack into devices in support of law enforcement and intelligence operations? By promptly notifying manufacturers and allowing them to repair cyber vulnerabilities, the government fulfills its responsibility to protect the nation from cyber attacks, which can harm not only our information systems, but also our financial systems, critical infrastructure, and public safety. Yet, as we have long known, the government also withholds knowledge of some cyber vulnerabilities so that it can exploit them in law enforcement and intelligence operations.

Despite public knowledge that government hacking occurs, very little information has been disclosed about how the government decides when to disclose vulnerabilities and when to exploit them. In 2010, though it was unknown to the public at the time, the government first adopted a formal Vulnerabilities Equities Process (VEP) through which it makes these decisions. In April 2014, Michael Daniel, the Obama administration’s cybersecurity coordinator, provided some details on the process in a blog post. In early 2016, in response to a Freedom of Information Act request, the Obama administration released a heavily redacted version of the VEP that had been adopted in 2010. These documents provided some information, but left many questions unanswered. Then, last week, Rob Joyce, the current White House cybersecurity coordinator, released a charter for the administration’s Vulnerabilities Equities Process, providing welcome transparency for this process. However, as discussed below, more work remains to be done to ensure that the VEP adequately protects the public’s cybersecurity and promotes transparency.

A robust VEP is necessary to ensure that the government weighs all relevant equities in deciding whether to exploit a vulnerability, including those favoring disclosure and patching to protect the cybersecurity of the public. Since government hacking is a reality and the White House maintains that it would be “unilateral disarmament” to disclose all vulnerabilities and give up this capability, strict and transparent rules of the road are critical. It is also worth focusing on regulation of government hacking because the cybersecurity risks created by government withholding and exploitation of existing cyber flaws pale in comparison to the threats presented by the government’s push to mandate encryption backdoors. As the government renews its campaign to demand access to encrypted communications, privacy advocates and cybersecurity experts should recognize that a meaningful VEP acknowledging the reality of government hacking is far preferable to any scheme that would permit government agents to unlock and access all encrypted communications.

Deputy Attorney General Rod Rosenstein argues that device manufacturers should create “responsible encryption” that allows access only with judicial authorization. But what he calls “responsible encryption” actually means “insecure encryption,” because requiring providers to retain the capability to access encrypted data amounts to demanding that providers create new vulnerabilities in all devices and software systems. Government hacking, by contrast, exploits existing vulnerabilities. As the new VEP charter states, it is “important to note that the [U.S. government] has not created these vulnerabilities” that may be exploited through the VEP. Furthermore, whereas a mandate for encryption backdoors would affect all devices and applications, the vulnerabilities withheld under the VEP pose privacy and cybersecurity risks only to the individuals and businesses that use the particular flawed systems. Thus, as OTI’s Director Kevin Bankston has written in Lawfare, rather than debating the creation of encryption backdoors, we should focus instead on vulnerability disclosure and government hacking. Regulated government hacking that conforms to a well-designed VEP, and that is also subject to rules similar to those governing wiretapping, will help the government overcome its access challenges while enabling us to limit the privacy and cybersecurity risks to the public.

The VEP charter released last week represents an important step forward in minimizing the potential harms from government exploitation of vulnerabilities and in providing transparency for this process. Most notably, the charter makes public for the first time the full roster of government entities participating in the Equities Review Board (ERB) as well as a detailed list of the equities the board must consider. Importantly, these equities include such factors as “[h]ow broad is the range of products or versions affected” and whether government use of the vulnerability would “provide specialized operational value against cyber threat actors or their operations.”

However, to be confident that the process is acting as a counterweight to the dangers that increased government hacking poses, much more must be done. First, the VEP should be modified to ensure that sufficient weight is given to the security and privacy of ordinary consumers. The list of members that participate in the ERB process of evaluating vulnerabilities is heavily slanted toward the intelligence and law enforcement communities. The charter appears to provide a system of “one organization, one vote.” Therefore, the disproportionate representation of actors that may have a vested interest in keeping vulnerabilities secret could bias the process against disclosure. Moreover, although the VEP charter states in the “Background” section that “[i]n the vast majority of cases, responsibly disclosing a newly discovered vulnerability is clearly in the national interest,” the charter does not include any presumption in favor of disclosure or any other standards for weighing the competing equities.

Second, though the charter itself is now public, it fails to include meaningful reporting requirements that would enable Congress and the public to assess the VEP’s operation. Although the White House has promised to provide “annual reports” that will show the actual practice under the VEP, the charter itself does not provide for any public reporting, and specifies only that annual reporting “may be provided to the Congress.” For example, the National Security Agency has told the public that it discloses 91% of the vulnerabilities it finds, without specifying the actual number of vulnerabilities found or the number disclosed. The government should also be required to report the age of the vulnerabilities it is disclosing and the number of disclosed vulnerabilities that were initially retained and exploited for some period of time. More granular reporting, and requirements that the information be provided to Congress and the public, are important to permit evaluation of the VEP’s implementation.

Third, we are troubled by the charter’s exception that would permit the government to enter into Non-Disclosure Agreements (NDAs), Memoranda of Understanding (MOUs), or other agreements that would prevent the government from disclosing certain vulnerabilities. This provides the government with a significant loophole that could be used to prevent disclosure even where the vulnerability presents a serious risk of harm to the public if left unpatched. The government has already relied on exemptions for various types of commercial technology disclosure in criminal cases like San Bernardino, Playpen, and TrueAllele, demonstrating that the exemption for NDAs could seriously undermine the VEP process.

Perhaps most critically, the charter is policy, not law, and many components could be changed at any time, possibly in secret. This includes fundamental VEP features like the factors considered when deciding whether or not to disclose a vulnerability, or the scope of vulnerabilities that are included in the process. This risk of dramatically undermining an already uncertain process can only be addressed by codification — by Congress passing legislation that defines the rules and procedures of the process.

Congress has already begun the important process of codifying the VEP. A bipartisan coalition has introduced the Protecting Our Ability to Counter Hacking Act of 2017, or “PATCH Act” (H.R. 2481 and S. 1157). The act would require that when a federal agency learns of a vulnerability that is not “publicly known,” the agency must refer the information to a Vulnerability Equities Review Board to determine whether the vulnerability should be disclosed to the vendor for repair or retained for exploitation. The bill specifies a list of equities to be considered, including the “harm that could occur if an actor, such as an adversary of the United States or a criminal organization, were to obtain information about the vulnerability” and “[w]hether the vulnerability is needed for a specific ongoing intelligence or national security operation.” Importantly, the bill also mandates that any vulnerabilities retained for exploitation must be periodically reevaluated, and it includes requirements for annual reporting to Congress and the public.

As with the administration’s VEP charter, the PATCH Act’s text could benefit from further refinement. For example, like the charter, the act fails to provide any standard by which the competing equities should be weighed; neither document specifies a presumption of disclosure or any other weighting that favors sharing information with the vendor so that the flaw can be repaired.

Nonetheless, codification of the VEP through a bill like the PATCH Act is important, and congressional hearings can provide an opportunity to explore how the process is being implemented as well as what further improvements should be made. The new charter should not be seen as the end of the conversation, and we must ensure that its release does not chill the desire for reform and codification. The opposite should be true: the new public charter provides more information and thereby gives privacy advocates, cybersecurity experts, and reformers in Congress a better base from which to work. This new and welcome transparency regarding the current process will permit us to better evaluate what is actually happening, to identify the good, the bad, and the ugly, and to enact legislation, like the PATCH Act, to address concerns about vulnerabilities in the age of increased government hacking.