Yesterday evening, the Biden administration released its much-anticipated “Executive Order on Improving the Nation’s Cybersecurity.” It is tempting to yawn; every administration in recent memory has done something of this kind, after all, and not always to significant effect. But this executive order deserves your attention. It contains concrete measures tailored to respond to lessons learned from recent crises, especially the SolarWinds and Microsoft Exchange compromises.
Is there more work to do? Obviously, yes. But to a significant extent that’s a job for Congress. The question at the moment is whether the Biden administration with this executive order has made good use of the limited tools that it controls directly. As we explain below, the answer is largely yes.
1. I’d like some context first. What can a president do with a mere executive order?
Let’s start by underscoring a critical point: The cybersecurity landscape is sprawling, and presidents can get only so far with a mere executive order. EOs are not statutes, after all. As a general proposition, presidents can’t just make rules that the private sector must obey; that’s Congress’s job.
So what good are EOs? For starters, presidents can use them to convey policy directives that bind the rest of the executive branch to take certain actions (so long as the directive is not itself illegal). An EO thus can be used to require federal entities to take actions that would be useful from a cybersecurity perspective. That might sound unimportant, given that the Department of Homeland Security’s (DHS’s) Cybersecurity and Infrastructure Security Agency (CISA) already has statutory authorities empowering it to issue binding cybersecurity directives to the rest of the civilian part of the executive branch. But when it comes to ensuring that agencies are fully motivated to comply with a directive, there’s just no substitute for the power of the presidential pen. Much of the new EO is an exercise of this sort of power.
In the same vein, EOs also are useful as a means to express the president’s will when it comes to shaping the procedures for interagency cooperation and coordination. We see a fair amount of this in the EO as well.
Presidents can also insist on certain terms being included in federal contracts and can use EOs to put such changes into motion. Depending on the terms—and depending on the desire of private-sector entities to compete for contracts under those conditions—this “procurement power” can be a significant lever to impact behavior outside the executive branch. As illustrated by the new EO, the procurement power not only can help the government protect itself but also can have spillover benefits for other entities (insofar as contract counterparties are motivated to alter products or services that are purchased by others).
2. Cybersecurity is a sprawling problem. Which specific parts of that problem does the EO aim to address?
The EO does not aim to address the entirety of the cybersecurity landscape. Instead, it focuses on a handful of important aspects of that larger challenge, with a clear (and expected) emphasis on those that were central to the SolarWinds/Sunburst and Microsoft Exchange messes. Here’s a shorthand overview of which parts of the order cover which general topics:
Stages in the Cybersecurity Cycle and Corresponding Contributions From the EO
Preventing intrusion
- Section 3 (cloud services; multi-factor authentication)
- Section 4 (software supply chain standards; “Internet of Things” (IoT) transparency)
Minimizing impact of intrusion (pre-detection)
- Section 3 (data encryption; zero trust environment)
Detecting and responding to intrusion
- Section 2 (notification requirements; enabled/required vendor cooperation)
- Section 3 (additional information sharing)
- Section 6 (uniform incident response playbook)
- Section 7 (endpoint detection and response; centralized threat-hunting)
- Section 8 (logging requirements)
Learning (and disseminating) lessons from intrusion
- Section 5 (Cyber Safety Review Board)
That’s the big picture. But remember: The EO largely is focused on the federal government’s own cybersecurity posture. This makes sense, given the limits on what the president can accomplish unilaterally and given the role that SolarWinds/Sunburst—an intrusion that made its way deep into government and private-sector systems—played in giving rise to this effort.
3. Let’s get to the specifics. What exactly is in the operative sections of the EO?
The operative parts of the EO are Sections 2-9. Here’s what you need to know about each.
Section 2: Ensuring that private entities supplying IT/OT services to federal agencies can and will share threat information with CISA, the FBI, etc.
Section 2 takes on an aspect of the information-sharing problem that was highlighted—not in a good way—by the SolarWinds/Sunburst mess. Many federal information systems are run or supported by a private-sector service provider. These arrangements are sometimes based on contracts that are written in ways that fail to require the private vendor to share threat and incident information with other federal agencies, like CISA or the FBI. Some such contracts may even prohibit such sharing. At any rate, the experience with SolarWinds/Sunburst demonstrates that such contract complications really do interfere with the ability of CISA and the FBI to obtain quick cooperation from outside vendors.
If contracts are the problem, then the procurement power is the solution. And that’s what Section 2 is all about. It puts into motion a two-month process to review the contracting rules known as the Federal Acquisition Regulation (FAR) and the Defense Federal Acquisition Regulation Supplement (DFARS) with an eye toward a host of changes meant to address the aforementioned issue. The disclosure rules cover software products as well as, critically, any “support system for a software product or service.” This covers a potential reporting gap for core technology systems like identity management (the kinds of systems repeatedly abused in the SolarWinds/Sunburst crisis).
Eventually, FAR and DFARS will oblige service providers to do what boils down to three things. First, they’ll have to collect and preserve a broad array of information (including information relating to “event prevention,” not just incident response). Second, they’ll have to share such information with the government when the information relates to an incident (or potential incident) relevant to the contract in question. Third, they’ll have to cooperate with the federal entities involved in addressing or investigating incidents or potential incidents.
Another part of Section 2 sets in motion a separate change to FAR that eventually will result in a requirement that such providers also must affirmatively and “promptly” report cybersecurity incidents involving a product or service provided to the government (or the support systems for such services/products). This, too, clearly reflects concerns brought to light by recent experience.
While incident data takes the starring role here, notice that the administration appears interested in more than that. There is language as well about information necessary for the government “to respond to cyber threats, incidents, and risks.” This implies a much broader potential range of data, and it may implicate companies that sell this kind of threat intelligence and cybersecurity data directly to the federal government. It is unclear whether the new rule will come with waivers to permit that kind of sale or whether such sales will be covered under these new disclosure guidelines. There is a wild card in Section 3(e), moreover, with an open-ended direction to DHS and the Office of Management and Budget (OMB) to ensure service providers share this data “to the greatest extent possible.” Much of the practical impact of this section will come out in the implementation of these rules, so stay tuned.
Section 3: Getting the government’s own house in order (aka Cloud, More Cloud, and Don’t Trust Anybody)
It’s official: Cloud is in. While policies to encourage and support cloud adoption have spread across federal information technology over the past decade, the governance of cloud security has remained relatively immature. “Section 3. Modernizing Federal Government Cybersecurity” takes direct aim at this problem, charging CISA with developing secure cloud adoption practices and guidelines, offering cloud incident response services to the .gov, and setting policy on how agencies should work with partners like CISA and the FBI in responding to cloud incidents. Per the executive order, OMB also will play an important role, drafting a federal cloud security strategy with associated guidance for agencies. The challenge for CISA (and OMB) is that the timeline to develop these complex policies is tight: 90 days or less in each case.
Section 3 also includes a small but significant set of actions to modernize FedRAMP (the federal government’s main security authorization program for cloud services). The text reads almost as if taken straight from the cloud service provider playbook, with its emphasis on more-automated, more-rapid and less-duplicative reviews of new cloud services. The devil is in the details (the word “automation” is plenty fungible), but the inclusion of the entire FedRAMP cycle (including continuous monitoring) pushes the scope of these prospective changes out a long way, into the full life cycle of cloud services. Most expansive is the possibility of mapping other security certifications and compliance frameworks to FedRAMP in order to speed cloud service providers’ authorization. That opens up significant opportunities for industry-led compliance frameworks, targeting, for example, only software as a service (SaaS) in order to supplant slower-moving FedRAMP processes.
Alongside the cloud lovefest is the “zero trust architecture” concept. Zero trust is a loose collection of technology concepts and security design philosophies rooted in the assumption that a breach is inevitable and that, accordingly, there should be strict limits on access and authorization within a network. On this model, every device and every interaction is presumptively suspicious and should be scrutinized accordingly, and access should be strictly limited against a set of conditions and “roles” (for example, CEO, system administrator, and guest). The practical effect of zero trust is to require new processes to set and enforce rules for nearly everything that takes place—from opening files on a share drive to signing up a new mobile phone for Wi-Fi. The EO emphasizes zero trust as a means of ensuring more robust defenses even where supply chains are compromised or organizations breached. Implementing this kind of design philosophy is tough and time intensive. There is no one-stop shop where an agency can “buy” zero trust right now, as it is as much about changing culture as about changing technology. Whether a set of 60-day sprints can successfully kick-start that transformation remains to be seen.
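The default-deny, role-and-condition logic at the heart of zero trust can be pictured with a minimal sketch. All roles, resources, and policy entries below are hypothetical illustrations, not drawn from the EO or any agency guidance:

```python
# Minimal illustration of zero-trust-style access evaluation:
# every request is denied unless an explicit policy grants it,
# and even a grant depends on the device passing a posture check.
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    role: str               # e.g., "system_administrator", "guest"
    resource: str           # e.g., "share_drive", "wifi"
    device_compliant: bool  # device passed its posture check (patched, managed)

# Hypothetical policy table: (role, resource) pairs that MAY be allowed.
POLICY = {
    ("system_administrator", "share_drive"),
    ("employee", "share_drive"),
    ("guest", "wifi"),
}

def authorize(req: Request) -> bool:
    """Default-deny: allow only when an explicit grant exists AND the
    device is compliant. Being 'inside' the network confers no trust."""
    return (req.role, req.resource) in POLICY and req.device_compliant

# A guest on a compliant device can join Wi-Fi but cannot open the
# share drive; an admin on a non-compliant device is also refused.
print(authorize(Request("guest", "wifi", True)))                          # True
print(authorize(Request("guest", "share_drive", True)))                   # False
print(authorize(Request("system_administrator", "share_drive", False)))   # False
```

Real zero trust deployments evaluate far richer signals (identity proofing, device telemetry, behavioral context), but the shape of the decision is the same: nothing is trusted by default.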
A final note on Section 3: It also sets a 180-day deadline for all Federal Civilian Executive Branch (FCEB) entities to adopt multi-factor authentication and data encryption practices, and they will owe progress reports to CISA (and help from CISA) along the way.
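The multi-factor authentication the EO mandates typically layers a time-based one-time password (TOTP, RFC 6238) on top of conventional credentials. A stdlib-only sketch of the code-generation step follows; the secret shown is the RFC's published test key, not a real credential:

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    counter = unix_time // step                       # 30-second time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: shared secret "12345678901234567890" at
# Unix time 59 yields the 6-digit code 287082.
print(totp(b"12345678901234567890", 59))  # 287082
```

The server and the user's authenticator app share the secret and compute the same code independently; possession of the current code is the "something you have" factor.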
Section 4: Getting serious about software supply chain security
If you came for the software security, grab your popcorn because “Section 4. Enhancing Software Supply Chain Security” does not disappoint. It takes an unusually specific technical view in places and runs nearly to the end of the alphabet in its sub-section headings. It boils down to four broad areas:
- Software supply chain security guidance from the National Institute of Standards and Technology (informed by a heavy emphasis on the security of the development environment and integrity/vulnerability checking), with eventual OMB enforcement to ensure that agencies insist on compliance.
- Inclusion of a Software Bill of Materials (SBOM) requirement in that guidance (and a directive to the Department of Commerce (NTIA this time) to produce minimum specifications for SBOMs).
- A definition of (and trailing set of actions based on) the concept of “critical software,” including enforcement by OMB, along with support and guidance from DHS and NIST.
- Encouragement of IoT security and labeling.
The core software supply chain security guidance is in the hands of NIST, which is a popular recurring character throughout the section. This is tricky territory for NIST, the Maryland-based boffins whose remit generally runs toward developing best practices and building broad consensus around voluntary guidelines and special publications. Software supply chain security has been a challenge for policymakers in part because of the gap between software development as described in standards and software development as actually practiced. Placing most, if not all, of the software supply chain security responsibility on a standards and technology research body may prove challenging.
Software development gets a lot of attention in this section, but the focus is largely on ensuring the integrity of code at the root of the software supply chain. There are also several requirements for vendors to attest to the development and security standards required of them, including a limited public disclosure. Injecting this kind of information about vendor security posture into the market is a good thing and a useful signal for potential customers of these companies outside the federal space. And SBOM is here at last; look for more in the form of a minimum viable specification from Commerce in the next 60 days.
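To make the SBOM idea concrete, here is a sketch of the kind of minimal component inventory an SBOM captures, loosely modeled on the CycloneDX JSON shape. The product, supplier, and version names are invented, and NTIA's actual minimum-elements specification may differ in its required fields:

```python
import json

# Hypothetical inventory for a fictional product "acme-agent 2.1.0":
# each entry records supplier, name, and version, plus dependency links --
# roughly the minimum information an SBOM is meant to convey.
sbom = {
    "bomFormat": "CycloneDX",   # illustrative; see the real spec for required fields
    "specVersion": "1.4",
    "components": [
        {"supplier": "Acme", "name": "acme-agent", "version": "2.1.0"},
        {"supplier": "Example Org", "name": "libcompress", "version": "0.9.2"},
    ],
    "dependencies": [
        {"ref": "acme-agent@2.1.0", "dependsOn": ["libcompress@0.9.2"]},
    ],
}

def vulnerable_products(sbom: dict, bad: tuple) -> list:
    """Given a (name, version) known to be vulnerable, list the products
    that depend on it -- the core SBOM use case after a disclosure."""
    bad_ref = f"{bad[0]}@{bad[1]}"
    return [d["ref"] for d in sbom["dependencies"] if bad_ref in d["dependsOn"]]

# When a vulnerability lands in libcompress 0.9.2, the SBOM answers
# "which of our products ship it?" without a code audit.
print(vulnerable_products(sbom, ("libcompress", "0.9.2")))  # ['acme-agent@2.1.0']
print(json.dumps(sbom["components"][0], sort_keys=True))
```

The policy payoff is exactly this query: when the next library-level flaw surfaces, agencies can discover their exposure from vendor-supplied inventories rather than waiting on each vendor's investigation.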
All of these supply chain guidelines and practices will be applied to vendors of “critical software,” a term the EO introduces but will need DHS, the Office of the Director of National Intelligence, and others to precisely define. Agencies will similarly have new restrictions and guidance on how they operate “critical software,” but it is unclear how this will interact with existing programs like the DHS/NIST High Value Asset (HVA) program.
Tucked into the later stages of Section 4 is a sequence of actions on IoT supply chain security: baseline security and development practices for vendors, with input from NIST, and a labeling program supported by the Federal Trade Commission. This includes a big role for NIST to identify and ingest all existing cybersecurity and secure software development labeling schemes and best practices and, in the case of software, possibly produce a “tiered software security rating system.” The prospect of movement on cybersecurity labeling for IoT is exciting, answering several calls for such a program in the United States and matching up with policies in the United Kingdom, Singapore and elsewhere. It may yet prove necessary for Congress to act in this space, however, before there is widespread uptake of such transparency measures.
Section 5: An NTSB for significant cyber incidents
Another much-anticipated idea breathed to life by the EO is the Cyber Safety Review Board. The board will be an interagency group convened by DHS and featuring representatives from the Defense Department, the Justice Department, CISA, the National Security Agency (NSA) and the FBI, as well as appointees from the private sector (tailored to the circumstances of the particular incident under review). The idea is to review “significant cyber incidents” (which might involve breach of federal systems but also might encompass certain private-sector incidents), in order to extract lessons learned and then disseminate those lessons appropriately. The model is inspired by the National Transportation Safety Board, which reviews major transportation accidents like airplane crashes and whose detailed investigative write-ups help inform risk assessments, vendor evaluations and operational practices.
The EO assigns a specific first job to the Cyber Safety Review Board: extracting lessons learned from SolarWinds/Sunburst. Of course, the EO itself shows that this learning process is already well in motion. That said, the EO also makes clear that the board in this first trot around the track should use the occasion to identify lessons learned about itself, enabling the board process to work better in the future.
Sections 6, 7, and 8: Getting the federal cyber house in order, part II
Sections 6, 7 and 8 return to the theme of compelling federal agencies to improve their cybersecurity posture.
The sequence starts with “Section 6. Standardizing the Federal Government’s Playbook for Responding to Cybersecurity Vulnerabilities and Incidents,” addressing the fact that the sprawling array of FCEB agencies apparently maintains an endless variety of incident response playbooks. No doubt the variation at least partially reflects tailoring to local circumstances. But the net result of the variety is to make it more difficult for CISA, as the centralized defensive lead, to ensure compliance effectively. So, Section 6 charges CISA to develop a standard incident response playbook (in consultation with a host of other agencies and organizations). It also tasks CISA to review and update the document annually with the NSA and clarifies CISA’s authority to review agency response plans to ensure conformity. In this way, Section 6 contributes to the ongoing process of moving CISA further out front as the central cybersecurity organization for FCEB agencies.
This is as good a time as any to recall that, in the words of one former House staffer, CISA is “overworked, understaffed and in one sense fighting half-blindfolded.” Increased authority is good, in other words, but it needs to be matched by increased resources. The next two years will be critical for CISA as Jen Easterly takes command and a prospective infusion of budget paired with a surge in hiring may reshape the agency for good.
On to “Section 7. Improving Detection of Cybersecurity Vulnerabilities and Incidents on Federal Government Networks,” which responds to issues that have loomed large in the SolarWinds/Sunburst postmortems: the lack of strong (and consistent) endpoint detection-and-response (EDR) capabilities across many FCEB entities, and limits that existed (until recently at least) on the ability of CISA to conduct threat-hunting across the systems of those entities (with or without their permission). Section 7 directs all agencies to launch initiatives to improve their EDR capabilities on a tight timeline, as well as other initiatives to enhance CISA’s centralized threat-hunting capabilities. Relatedly, Section 7 also obliges agencies to enter into agreements to provide CISA access to object-level data in support of CISA’s Continuous Diagnostics and Mitigation Program.
Threat-hunting aficionados will recall that Congress actually granted CISA expanded (and clarified) centralized threat-hunting authority in Section 1705 of the fiscal 2021 National Defense Authorization Act. In a sign of the importance of that fresh authority, the EO calls for CISA to produce a report in 90 days explaining how it is putting this authority to work and also requires additional quarterly reporting describing its ongoing use. Prediction: These reports will confirm that CISA is trying to make the most of Section 1705 but will need more resources to make full use of it.
Next up is “Section 8. Improving the Federal Government’s Investigative and Remediation Capabilities.” Section 8 is very much a low-hanging fruit sort of provision, as it zeroes in on the lack of network logs in many FCEB systems. As emphasized by some of the most pointed criticism of vendors in the aftermath of SolarWinds/Sunburst, the amount (and nature) of data recorded to network logs, and how that data is retained and accessed, can have a major influence on the speed and success of cyber incident response. DHS accordingly is charged with developing standard practices on logging, while OMB has responsibility to enforce those practices across civilian agencies.
Notably, some of the SolarWinds/Sunburst postmortems have suggested that FCEB logging issues sometimes result from financial constraints that led agencies to contract for vendor services at price levels that did not include robust logging. Perhaps in response to this, the EO specifically directs OMB to work with agency leaders to ensure they have the resources needed to meet the new logging requirements.
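The kind of log data at stake here can be pictured with a minimal structured-logging sketch. The field names and the `src_ip` attribute are illustrative choices, not a schema the EO or DHS mandates:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit one JSON object per log line so records can be retained,
    searched, and correlated across systems during incident response."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "ts": record.created,          # epoch seconds of the event
            "level": record.levelname,
            "event": record.getMessage(),
            "src_ip": getattr(record, "src_ip", None),  # illustrative field
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("audit")
log.addHandler(handler)
log.setLevel(logging.INFO)

# An authentication event recorded with enough context to investigate later.
# (203.0.113.7 is a documentation-reserved address, not a real host.)
log.info("login_failure", extra={"src_ip": "203.0.113.7"})
```

Whether events like this are captured at all, how long they are kept, and who can query them are exactly the variables Section 8 directs DHS to standardize.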
Section 9: Don’t forget about national security systems!
At this point it should be clear that most of the EO focuses on the .gov—that is, the civilian agencies and the characters you know and love, like the Internal Revenue Service, the Environmental Protection Agency, the Department of State and more. But the Department of Defense and the intelligence community like the cloud and computing too, thank you very much. So, what about them?
Scattered requirements in the main body of the EO, and especially in “Section 9. National Security Systems,” ensure that there are processes for the secretary of defense to incorporate all of the standards and guidelines developed in the order for national security systems where applicable and appropriate. National security systems (and the .mil environment more broadly) thus are not the focus of this EO, but they do receive some limited attention (with an emphasis on synchronizing with civilian agencies’ security controls and data-collection practices).
4. 8,000+ Words Later, Where Are We?
If you wanted a document that deals with critical infrastructure or taking the fight to foreign ransomware franchises, this EO is not for you. And that should come as no surprise. Despite the timing of the Colonial Pipeline incident last week, the EO is the culmination of an intensive period of work by the Biden cyber team sparked by the double-punch of SolarWinds/Sunburst and Microsoft Exchange vulnerabilities. The watchword that follows from those episodes is software security, with a heavy focus on federal incident response, investigation and remediation.
That’s the center of gravity of the new EO. It is not revolutionary, but that shouldn’t be the measure of success. The important question is whether it responds smartly to the important lessons learned from painful recent experience, and it does seem to do that. FCEB entities will be forced to take a variety of important steps (perhaps most notably, basic things like the new encryption and multi-factor authentication mandates). CISA will be pushed a bit further toward center stage, as the central cybersecurity organ of the .gov. Along the way, the EO covers a lot of ground on secure software development and more risk-aware treatment of “critical software” alongside new policies on cloud security and attempts to shift the .gov environment to a new model of trust. There’s even an IoT security-transparency nudge.
Plenty of questions and challenges remain, which is to be expected. For example, the EO does not wrestle with “org chart” challenges, such as the role in all of this for the Federal Acquisition Security Council or the federal chief information security officer, both of whom go unmentioned. Nor does it grapple directly with the future role of the national cyber director (NCD), a post soon to be filled by the esteemed Chris Inglis. It remains to be seen how (if at all) the new programs and reporting lines—many of which include reporting to, or decisions from, the assistant to the president for national security affairs—will change at that point (while the NCD is not mentioned in the main body of the EO, there is an option to allow that “portions of this order may be modified to enable the NCD to fully execute its duties and responsibilities”). That may influence the timelines of some of these activities as well.
Bottom line: The EO goes far toward picking the low-hanging fruit (and some not-so-low-hanging fruit) highlighted by recent challenges. In terms of systematic improvements for the nation’s cybersecurity, attention should now shift to Congress. The EO’s prescriptive elements may be a useful model for how Congress might take steps to bolster security in at least some critical infrastructure subsectors, for example. And only Congress can provide the increased resources that ultimately are needed to realize the full impact of these new policies.
Note: The authors will take questions on this article on Friday, May 21, at 1 p.m. ET.