A cyberattack last week that shut down a pipeline, which carries nearly half of the East Coast’s jet fuel and gasoline, raises the question for the zillionth time: When is the government going to start taking serious measures to prevent, or at least minimize, these debilitating—potentially catastrophic—incidents?
One possibly serious step is about to take place: the installation of a first-ever National Cyber Director, an official vested (at least on paper) with powers to order, coordinate, and enforce cybersecurity actions in the public and private sectors.
It’s pathetic that this step has taken so long. As far back as 1984, a national-security directive, signed by President Ronald Reagan, warned that computer networks, which were just then emerging, were “highly susceptible to interception, unauthorized electronic access, and related forms of technical exploitation” by “terrorist groups and criminal elements.” In 1997, a commission appointed by President Bill Clinton sounded the alarms: “The capability to do harm…through information networks…is real; it is growing at an alarming rate; and we have little defense against it.”
The year after that, real cyberattacks, mounted first by Russia and soon after by China, started happening. The Pentagon procured early-warning devices and created teams of specialists (there weren’t many back then) to run them. The National Security Agency switched from tapping analog phone lines to intercepting, and manipulating, digital signals to keep up with the global shift.
But these measures protected military channels of communication, and even then imperfectly. As the internet exploded, nearly 90 percent of traffic flowed through privately controlled networks. The Clinton administration defined sectors of “critical infrastructure,” which were already dependent on networks—electrical power, oil and gas, water supplies, banking and finance, telecommunications, emergency services, and continuity of government in case of disaster—and created Information Sharing and Analysis Centers (ISACs), where government agencies (mainly the NSA) could share “best practices” with the companies that controlled those sectors.
But no mandatory measures were imposed. Executives sent their specialists to ISAC meetings, but they weren’t required to do anything afterward. Banks took aggressive measures to fend off hackers, because security was at the heart of their business. Electrical power and pipeline companies didn’t, for the most part, because security was very expensive and no attacks had yet taken place.
Now, of course, attacks are commonplace. Hundreds of cybersecurity companies have grown up to handle them. But the government’s power to set and enforce security measures is still weak. Private companies aren’t even required to report intrusions when they take place; many of them are hesitant to do so, lest consumers stop buying their products or their stock prices tank.
Russia’s hack last year of SolarWinds, a network management system used by some 300,000 customers, including many government agencies, lent renewed urgency to the problem. But then-President Trump did nothing, and the incoming Biden administration, whose transition was obstructed by Trump officials refusing to offer briefings or access to classified material, was slow to take up the issue.
There was also a big dispute over how to handle cybersecurity generally. The Cyberspace Solarium Commission, which was established by Congress in 2019 to come up with “strategic” ideas on the subject, had recommended the appointment of a National Cyber Director. But several analysts, including some who had worked for President Obama and now for President Biden, preferred to strengthen and clarify the roles of existing agencies, placing them under the wing of the deputy national security adviser for cyber and emerging technology.
However, Solarium staffers countered, the National Security Council has never been very good at running operations—and it has few avenues for reaching out to private industry. A National Cyber Director, besides being the president’s chief adviser on the issue, would be focused on operations and would have powers to work with industry.
Key legislators sided with the Solarium’s approach. Biden agreed, and he nominated for the job Chris Inglis, a Solarium commissioner and, more to the point, a former career official in the NSA, a rare specimen in that he worked on both the offense and defense sides of the agency, and rose to deputy director. He left the job under pressure, along with the director, Gen. Keith Alexander, during the uproar over Edward Snowden’s revelations. But Inglis has remained widely respected, viewed as a straight shooter by most factions in the debate over government surveillance, and he is likely to be confirmed by the Senate—though it’s unclear when his confirmation hearings will take place.
Still, it remains to be seen whether Inglis will be able to do what the theoretical outlines of his post envision. One challenge will be how he coordinates with the deputy national security adviser, Anne Neuberger, another former NSA official who has worked well with Inglis in the past but is also known as a tenacious bureaucratic fighter. Her job is to set policy; Inglis’ will be to enforce and coordinate it with other agencies and with the private sector. Sometimes these sorts of power-sharing arrangements work out; sometimes they don’t.
A bigger challenge will be persuading or compelling private companies to step up their game in warding off attacks. Colonial Pipeline, the company hacked last week, halting the flow of nearly half the East Coast’s jet fuel and gasoline, is a private firm. According to journalist Kim Zetter (and subsequently confirmed by several other news outlets), the attack—which seems to have been mounted by a Russian criminal group called DarkSide—was aimed not at the pipeline’s valves or sensors directly but rather at the company’s IT system. However, the IT system provides a passageway to the valves and sensors, and the safeguards protecting that link are reportedly weak.
Colonial shut down its main pipelines as a precautionary measure, to prevent DarkSide from exploiting those weak links. Company spokesmen have been vague on when they can turn the valves back on, saying only that it will take them more than one or two days but less than six weeks.
One reason for this vagueness is that, as we’ve learned in the last few days, the company was hit with a ransomware attack. According to Bloomberg News, DarkSide stole nearly 100 gigabytes of data, then locked down some of Colonial’s computers and servers, demanding a ransom to turn them back on. It is not known whether Colonial is paying the ransom or what steps are being prepared if it doesn’t. Because Colonial is a private firm, it doesn’t have to reveal what it’s doing and not doing.
The National Cyber Director—whenever he is confirmed—will be able to assemble groups of industrialists, including the Colonial executives, to discuss common policies. Biden is also reportedly ready to issue emergency measures, requiring companies to report cyber intrusions to a new federal body—a cyber equivalent to the National Transportation Safety Board—and creating ways for the NSA to share information with private companies about existing threats, including more highly classified information than has previously been shared.
Bob Gourley, co-founder and chief technology officer of OODA, LLC, a cybersecurity firm, put the problem this way in an email to me: “There is one metric above all others that can tell you if an organization has a good cybersecurity program: Does the CEO really care? If the CEO does not really care, there is no hope for a risk-mitigating security program. If the leader does really care, it still requires hard work and vision, but at least there is hope.”
The question, for Inglis, Neuberger, and the others trying to put serious policies into place, is: Can they make CEOs, like those running Colonial Pipeline, care?