Coal is getting killed in the U.S. That’s largely because its main customer, the electricity industry, is switching to fuels or sources that are either cheaper or cleaner (or both), like natural gas and renewables. In March, for example, coal accounted for only 24 percent of electricity generated in the U.S.—down from 33 percent in March 2015 and 41 percent in March 2014. So far this year, U.S. coal production is just two-thirds of what it was last year.
Coal is being massacred by a combination of market forces and policy—natural gas is really cheap and abundant, renewables are getting cheaper, and meanwhile state and federal regulations and mandates are pushing power producers toward fuels and power sources that create fewer emissions. In such an environment, coal has a hard time competing. (Natural gas produces about half as much carbon dioxide per British thermal unit as coal does.)
These anti-coal forces have gained real strength in the last several years, largely thanks to technological developments and the logic of industrial scale. But coal producers’ response has generally not been to innovate but to react defensively. They fight regulations and help fund lawsuits aimed at stopping policies that discourage coal use. They back politicians—mostly Republicans but some Democrats—who try to reduce subsidies and support for renewables, and who attempt to shape laws and regulations in ways more favorable to coal. But it hasn’t been working: Republican presidential candidates who run hard on coal keep losing and will likely lose again. In many of the biggest states—California, the entire Eastern seaboard—greens have far more lobbying clout than coal miners.
It’s easy to look back and conclude that it was obvious that the eventual rise of cheap natural gas and utility-scale renewables would spell doom for coal. But it didn’t have to turn out this way. The challengers to coal gained market share and social legitimacy because of unpredictable strides in technology, innovation, and investment. Fracking, continually improved upon and applied at immense scale, made cleaner-burning natural gas extraordinarily cheap, and hence attractive to utilities. Renewables were a laughably small segment of the country’s energy sources several years ago. But a decade of investments in small wind and solar farms, then in bigger ones, and then in huge ones, has likewise made those sources more appealing.
What if coal producers had taken a cue from their rivals and invested systematically in technologies and innovations to make their product greener? Could they have figured out a way to make their dirty rocks emerge as a low-carbon (or even a no-carbon) fuel source? Would coal still be in its precarious position?
Now, it’s not as if the coal industry has simply stood idle while other power sources ate its lunch. There are several large-scale efforts underway around the world aimed at allowing coal to burn more cleanly. The idea is generally to capture or reuse the carbon dioxide and other emissions released when coal is burned. The technology exists. But it currently costs a lot of money, in part because carbon capture is still in the experimental stage. Still, it’s happening: In Saskatchewan, Canada, the Boundary Dam Carbon Capture Project captures up to 90 percent of the carbon dioxide emitted in burning coal—which is then injected into nearby oil fields to improve production. The Kemper plant in Mississippi, supported in part by federal funds, is designed to capture up to 65 percent of emissions and send the carbon dioxide to oil fields. But it is running massively over budget. A similar project is in the works in Texas.
Carbon-free or carbon-reducing technologies, including clean coal technologies, are always really expensive and kludgy at first, especially compared with the established way of doing business. The first modern electric car had to be an expensive luxury vehicle. The first solar panel systems and wind farms built in this country were uneconomic—both the panels and turbines were exorbitantly expensive, as was the design and construction of the systems. Why? Engineers and project managers had to design processes on the fly. The supply chains serving them lacked the scale and volume of production to bring costs down.
In many industries, the 1.0 iteration is generally expensive and not particularly functional. But then you apply the lessons learned to make the 2.0 version more compelling and cheaper. In the meantime, the underlying technology improves. As the market grows, more competitors enter, which spurs price reduction and further breakthroughs. Higher volumes of orders lead to more scale, which turns expensive specialty products and devices into cheaper commodities. Iterate through that process a few times, and you get a functional industry. That’s precisely what has happened with solar panels, wind turbines, and fracking in the last several years.
This process—more so than any government war on coal—has made life difficult for coal producers. As rivals have made quantum leaps in efficiency and cost, coal has effectively stood still. Solar and wind and fracking had to leap. Coal didn’t have to, so while the market price of coal may be lower now than it was a few years ago, it doesn’t burn much cleaner. In contrast, in some parts of the country today the cheapest electricity a utility can buy is produced by solar.
Now, here’s a thought experiment. Between 2009 and 2015, the U.S. coal industry produced more than 7 billion tons of coal. What if the coal-producing industry in 2009 decided to tax itself—say, $2 for every ton mined—and then plowed that money into demonstration projects, carbon capture research, infrastructure, and subsidies for next-generation coal plants? What if it had invested $14 billion in efforts that would allow coal to be burned with no emissions, or with dramatically fewer emissions?
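The arithmetic behind the thought experiment is straightforward; a quick sketch, using the article’s figures (the $2-per-ton levy is hypothetical, as stated above):

```python
# Back-of-the-envelope check of the self-tax thought experiment.
# Figures from the article: roughly 7 billion tons of U.S. coal
# produced between 2009 and 2015, and a hypothetical levy of $2/ton.
tons_mined = 7_000_000_000   # total U.S. production, 2009-2015
levy_per_ton = 2             # hypothetical self-imposed tax, dollars

fund = tons_mined * levy_per_ton
print(f"${fund / 1e9:.0f} billion")  # → $14 billion
```

Spread over seven years, that works out to about $2 billion a year—real money, but a modest surcharge relative to the industry’s revenues in that period.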
Doing so would not have guaranteed success, or some miraculous breakthroughs. But I’m reasonably sure that we’d be on the third and fourth generation of carbon capture projects, instead of the first; that the costs of building new carbon-capture projects would be far lower than they are today; that researchers would have developed useful new procedures and equipment; that we’d have an industry that knows how to construct such projects on budget instead of incurring massive overruns.
Of course, this would have required the sort of collective action and foresight of which few industries are capable. But had the coal industry done so, it would have had the possibility of positioning itself as part of the solution to lower emissions, rather than as the bulk of the problem.
You could argue that it’s not the miners’ responsibility to ensure that the use of their resource doesn’t produce negative externalities. They’re not the ones who buy and burn the coal, after all. But cleaning up their product wouldn’t have been an act of charity—it would have been self-preservation. As we’ve seen, the industry’s sole customers—electric utilities—now have choices in how they meet demand for electricity. They can convert plants to run on natural gas, build large-scale renewables, or focus on efficiency and storage instead of production. They have plenty of levers to pull if they want to cut emissions.
The coal industry, which had the most to gain from investments in clean-coal technology, has run out of levers to pull.