When Seattle became the first major American city to pass a $15 minimum wage in 2014, a lot of temperamentally cautious wonks—myself included—saw the move as a wild, possibly ill-conceived experiment. Would it kill jobs? Would it turn the whole town into a misty no man’s land for entry-level service workers? Who could say?
Since then, of course, plenty of other states and cities have decided to join Seattle’s adventure. Last Tuesday, Illinois became the fifth state to enact a $15 pay floor—it will phase in by 2024—following California, New York, Massachusetts, and New Jersey, as well as Washington, D.C. Maryland is currently debating its own bill, while other states have passed significant hikes that don’t quite hit the $15 mark. (The federal minimum is still sitting at $7.25 an hour.)
It remains entirely possible that, one day, economists will look back on this burst of legislation as a series of well-intentioned mistakes. But thanks to some recent studies, I’m feeling a little less apprehensive about how it will play out.
First, some background. Business groups and conservatives have long insisted that forcing employers to pay their workers more would lead them to hire less. And for years, the economics literature more or less agreed. But starting in the mid-1990s, a new line of research emerged that suggested minimum wage increases actually caused few if any job losses at all. The academic debate about this issue has been fierce, extremely technical, and will probably continue right up until Antarctica melts and the rising seas swallow every last KFC. But by 2014, the conventional wisdom among center-left policy experts was that moderate increases in the pay floor were basically fine for the job market.1
Seattle’s move to $15 was anything but moderate. The new minimum, which was set to phase in over several years, was high—if not quite unprecedented—by both international and historical U.S. standards. And it gave economists a chance to see how the labor market would absorb such a dramatic increase.
The early results were not encouraging. In 2017, a group of economists from the University of Washington concluded that Seattle’s law had cost the city thousands of jobs. It also led businesses to pare back hours for their employees, and as a result, the researchers reported that the average low-wage worker earned $125 less per month than if the ordinance had never passed. While an updated version of their paper later lowered that estimate to a loss of $74 per month, it appeared that the ordinance had backfired. This was especially worrisome given that the authors’ data extended just through 2016, when the city’s minimum had only risen to $13 for large employers. It seemed possible that as wages continued to increase, the damage would deepen.
But as more information rolled in, the picture became a bit fuzzier and less discouraging. Late last year, the University of Washington team released a new paper showing that Seattle’s rising pay floor appeared to benefit low-wage workers, as long as they already had a job when it took effect. Those with relatively more experience worked fewer hours as a result of the law, but earned about $19 more per week on average—in part because they were paid more per hour, and because some appeared to pick up shifts outside of Seattle proper. Employees with relatively less experience worked fewer hours, but more or less broke even.
Most people would probably agree that working less and earning the same pay, or more, is a happy outcome. Some of these employees may have purposely cut back their hours at work so they could spend more time with family or otherwise tend to their lives. But the authors suggested that the new minimum still had a downside because it might have made finding a job harder for new low-wage workers (meaning people who hadn’t been employed in Washington during the previous five years). They showed that, until mid-2014, Seattle and the rest of Washington state outside of King County added new workers who earned less than $15 at roughly similar rates. After that, the two diverged, as Seattle’s pace slackened.
Between their two papers, the University of Washington economists arrive at the following conclusion: “Seattle’s minimum wage ordinance appears to have delivered higher pay to experienced workers at the cost of reduced opportunity for the inexperienced.” Critics, though, have attacked their methodology and suggested another straightforward explanation for why low-wage jobs are disappearing from Seattle: The city is in the midst of a tech boom, and workers are simply finding better, higher-paying opportunities. The University of Washington team says that reasoning makes little sense, however, because the decline in low-wage employment they observe begins precisely in early 2016, when the top minimum wage jumped to $13, and when seasonal employment in Seattle is typically at its winter nadir (meaning high-paying opportunities wouldn’t necessarily abound).
Like all academic debates about the minimum wage, the particulars of this one will probably go in circles for a while. But the bottom line is that the most pessimistic review so far of the Seattle experiment has changed from “this is a total flop” to “this policy presents tradeoffs that make life more bearable for experienced workers, at the expense of new folks just entering the workforce.” It’s gone from one-star to two-and-a-half or three.
Seattle is also, in the end, just a single city—and one with a unique economic situation that is currently making it hard to draw any meaningful conclusions from the labor market experiment it has decided to run. Meanwhile, a working paper released earlier this year, which looks at many more minimum wage laws over time in different parts of the country, offers more reasons for optimism. The authors used state-of-the-art methods to study the effects on total employment of 138 different state-level minimum wage hikes that took place between 1979 and 2016.2 They found that, overall, increases have little impact on jobs, except in tradable sectors such as manufacturing, where increases seem to hurt more. (A lot of American factory jobs pay fairly low wages these days, and unlike fast-food gigs, they can be moved pretty easily over state or international borders.)
Just as importantly, the paper suggests that it may be safe to hike the minimum wage further than some policy experts once thought. When economists try to tell whether a pay floor is reasonable for a city or state, they often compare it to what local workers typically earn in the region. After all, businesses in expensive, high-wage metropolitan areas like San Francisco or New York can probably afford to pay their workers more than businesses in Little Rock, Arkansas, or Dubuque, Iowa, if only because they can charge higher prices. In 2014, University of Massachusetts economist Arindrajit Dube, one of the leading researchers on this topic, took a look at international and historical U.S. averages and concluded that it was probably safe to set the minimum wage somewhere near half the median hourly pay of a full-time worker. It was just a rule of thumb, but a useful one.
This new study, which Dube co-authored, suggests that the rule of thumb could go a little higher. It looks at pay floors ranging from 37 percent of a state’s median wage to 59 percent. Of the 138 increases studied, 42 pushed the minimum above the 50 percent mark. These included laws passed in high-wage states like Washington, California, Oregon, and Vermont as well as low-wage states like Arkansas and West Virginia. And none of them seemed to do significant damage to the job market. “Overall, these findings suggest that the level of the minimum wage increases in the U.S. that we study have yet to reach a point where the employment effects become sizable,” the paper concludes.
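The rule of thumb above is just a ratio: the pay floor divided by the local median hourly wage. A minimal sketch of that arithmetic, using made-up wage figures rather than data from any of the studies, shows why the same $15 floor can look safe in one state and aggressive in another:

```python
# Toy illustration of the "ratio to median" rule of thumb discussed above.
# All wage figures here are hypothetical, not drawn from the studies.

def minimum_to_median_ratio(minimum_wage: float, median_wage: float) -> float:
    """Return the minimum wage as a fraction of the median hourly wage."""
    return minimum_wage / median_wage

# A $15 floor against a hypothetical $30 median sits right at the
# 50 percent rule of thumb.
print(f"{minimum_to_median_ratio(15.00, 30.00):.0%}")  # 50%

# The same floor against a hypothetical $22 median lands above the
# 37-to-59-percent range the new study examined.
print(f"{minimum_to_median_ratio(15.00, 22.00):.0%}")  # 68%
```

The hypothetical low-median state ends up outside the range of increases the paper actually studied, which is exactly the territory where the evidence gets thin.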
Again, it’s possible that Illinois, or New York, or New Jersey, is now pushing the limits of a sound minimum wage policy. But the latest results from Dube and his collaborators suggest those states might be taking less of a risk than it once seemed.
1 An analysis by the nonpartisan Congressional Budget Office illustrated one reason why. In line with mainstream conservative estimates, the CBO forecast that raising the federal minimum wage from $7.25 to $10.10 would reduce employment by 500,000 jobs. But even then, low- and middle-class families would still earn an additional $19 billion in wages. So, on net, working households would benefit.
2 The paper’s authors rely on a “bunching” method, in which they look at the change in the number of jobs just above or just below the new minimum wage, which lets them determine whether the patterns they’re observing in total employment are likely driven by the law or by other things happening in the economy.
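The intuition behind bunching can be sketched with a toy example. The wage lists below are invented for illustration, not data from the paper: if jobs that disappear from just below the new floor reappear as an excess just above it, workers likely kept their jobs at higher pay; if they simply vanish, the law likely cut employment.

```python
# A toy sketch of the bunching idea: count jobs paying just below and
# just above a new minimum, before and after the hike. Wages are made up.

def bunching_counts(wages, minimum, band=1.0):
    """Count jobs within `band` dollars below and above `minimum`."""
    below = sum(1 for w in wages if minimum - band <= w < minimum)
    above = sum(1 for w in wages if minimum <= w < minimum + band)
    return below, above

before = [7.50, 7.75, 7.90, 8.20, 9.00, 12.00]
after  = [8.25, 8.40, 8.50, 8.75, 9.00, 12.00]  # same jobs, wages shifted up

print(bunching_counts(before, 8.00))  # (3, 1): three jobs sit below the floor
print(bunching_counts(after, 8.00))   # (0, 4): they reappear just above it
```

Here the three jobs missing from below the $8 floor are matched by an excess just above it, the signature of wages being pushed up rather than jobs being destroyed; in the real study, a shortfall in that comparison would point to job losses.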