On Friday, Sept. 26, 2014, a telecommunications contractor named Brian Howard woke early and headed to Chicago Center, an air traffic control hub in Aurora, Illinois, where he had worked for eight years. He had decided to get stoned and kill himself, and as his final gesture he planned to take a chunk of the U.S. air traffic control system with him.
Court records say Howard entered Chicago Center at 5:06 a.m. and went to the basement, where he set a fire in the electronics bay, sliced cables beneath the floor, and cut his own throat. Paramedics saved Howard’s life, but Chicago Center, which controls air traffic above 10,000 feet for 91,000 square miles of the Midwest, went dark. Airlines canceled 6,600 flights; air traffic was interrupted for 17 days.
Howard had wanted to cause trouble, but he hadn’t anticipated a disruption of this magnitude. He had posted a message to Facebook saying that the sabotage “should not take a large toll on the air space as all comms should be switched to the alt location.” It’s not clear what alt location Howard was talking about, because there wasn’t one. Howard had worked at the center for nearly a decade, and even he didn’t know that.
At any given time, around 7,000 aircraft are flying over the United States. For the past 40 years, the same computer system has controlled all that high-altitude traffic—a relic of the 1970s known as Host. The core system predates the advent of the Global Positioning System, so Host uses point-to-point, ground-based radar. Every day, thousands of travelers switch their GPS-enabled smartphones to airplane mode while their flights are guided by technology that predates the Speak & Spell.
If you’re reading this at 30,000 feet, relax—Host is still safe, in terms of getting planes from point A to point B. But it’s unbelievably inefficient. It can handle a limited amount of traffic, and controllers can’t see anything outside of their own airspace—when they hand off a plane to an adjacent center’s airspace, it vanishes from their radar.
The FAA knows all that. For 11 years the agency has been limping toward a collection of upgrades called NextGen. At its core is a new computer system that will replace Host and allow any controller, anywhere, to see any plane in U.S. airspace. In theory, this would enable one air traffic control center to take over for another with the flip of a switch, as Howard seemed to believe was already possible.
NextGen isn’t vaporware; that core system was live in Chicago and the four adjacent centers when Howard attacked, and this spring it’ll go online in all 20 U.S. centers. But implementation has been a mess, with a cascade of delays, revisions, and unforeseen problems. Air traffic control can’t do anything as sophisticated as Howard thought, and unless something changes about the way the FAA is managing NextGen, it probably never will.
This technology is complicated and novel, but that isn’t the problem. The problem is that NextGen is a project of the FAA. The agency is primarily a regulatory body, responsible for keeping the national airspace safe, and yet it is also in charge of operating air traffic control, an inherent conflict that causes big issues when it comes to upgrades. Modernization, a struggle for any federal agency, is practically antithetical to the FAA’s operational culture, which is risk-averse, methodical, and bureaucratic. Paired with this is the lack of anything approximating market pressure. The FAA is the sole consumer of the product; it’s a closed loop.
The first phase of NextGen is to replace Host with the new computer system, the foundation for all future upgrades. The FAA will finish the job this spring, five years late and at least $500 million over budget. Lockheed Martin began developing the software for it in 2002, and the FAA projected that the transition from Host would be complete by late 2010. By 2007, the upgraded system was sailing through internal tests. But once installed, it was frighteningly buggy. It would link planes to flight data for the wrong aircraft, and sometimes planes disappeared from controllers’ screens altogether.
As timelines slipped and the project budget ballooned, Lockheed churned out new software builds, but unanticipated issues continued to pop up. As recently as April 2014, the system crashed at Los Angeles Center when a military U-2 jet entered its airspace—the spy plane cruises at 60,000 feet, twice the altitude of commercial airliners, and its flight plan caused a software glitch that overloaded the system.
Even when the software works, air traffic control infrastructure is not prepared to use it. Chicago Center and its four adjacent centers all had NextGen upgrades at the time of the fire, so nearby controllers could reconfigure their workstations to see Chicago airspace. But since those controllers weren’t FAA-certified to work that airspace, they couldn’t do anything. Chicago Center employees had to drive over to direct the planes. And when they arrived, there weren’t enough workstations for them to use, so the Chicago controllers could pick up only a portion of the traffic.
Meanwhile, the telecommunications systems were still a 1970s-era hardwired setup, so the FAA had to install new phone lines to transfer Chicago Center’s workload. The agency doesn’t anticipate switching to a digital system (based on the same voice over IP that became mainstream more than a decade ago) until 2018. Even in the best possible scenario, air traffic control will not be able to track every airplane with GPS before 2020. For the foreseeable future, if you purchase Wi-Fi in coach, you have better technology in your lap than the pilot has in the cockpit.
A big, high-risk infrastructure upgrade like NextGen will never move as fast as change associated with consumer technology, but the real hurdles are not technical, they’re regulatory. In the private sector, new technologies can be developed freely regardless of whether the law is ready for them. Think of Uber, Lyft, and Airbnb: Outdated regulations slowed them down, but consumer demand is forcing the law to evolve. This back-and-forth is what lets tech companies move fast and break things without risking our safety. But when the government upgrades its technologies, regulations intercede before a single line of code is written.
The government procurement process is knotted with rules and standards, and new technology has to conform to those rules whether or not they’re efficient or even relevant. These issues screwed up HealthCare.gov and are screwing up the Department of Veterans Affairs and a dozen other agencies that need computers and software that work. The current process stifles innovation from the start and mires infrastructures like NextGen, which need to carry us far into the future, in the rules of today.
The government needs to change its procurement process, and it’s got to let go of its stranglehold on air traffic control. Privatization isn’t necessarily the answer. Canada, the UK, Germany, Sweden, and Australia operate air traffic control through various separate entities, from semiprivate to nonprofit to government corporations, that help facilitate the necessary push and pull between technological risk-taking, regulatory caution, and pressure from end users.
The first real pressure on the FAA to show results came, ironically, from Howard. He forced what was essentially the first real-time operational test of the new system. When NextGen faltered, the program faced a level of widespread public scrutiny that it had previously evaded, and the FAA had to respond. The agency published a review of its contingency processes, including new plans to enable control centers to assist each other in emergencies. Brian Howard, hell-bent on destruction, was the best thing to happen to our air traffic control system in years.