The idea that many Americans might be living on less than $2 a day once sounded almost impossible; that was supposed to be what poverty meant in the developing world, not in a rich nation like the United States. But in a wildly influential 2013 study, two poverty researchers upended that assumption: Kathryn Edin and Luke Shaefer showed that by 2011, there were 1.6 million U.S. households with children that, for at least part of the year, had cash incomes that put them below the $2-per-person mark. The number had risen from 636,000 in 1996—a 152.9 percent increase.
Edin and Shaefer later turned their paper into an acclaimed book, $2.00 a Day: Living on Almost Nothing in America, which documented the lives of families mired in this kind of extreme poverty. In a New York Times review, the famed sociologist William Julius Wilson called it “essential” and a “call to action.” Their work inspired follow-on research by Nobel Prize–winning economist Angus Deaton. Bernie Sanders took to quoting them on the campaign trail.
The pair’s findings on the growth of extreme poverty also became an important data point in the argument over the legacy of welfare reform. In the 1990s, President Bill Clinton teamed up with Republicans in Congress to end the federal government’s old, often vilified program that gave cash directly to poor families. They replaced it with a system designed to make unemployed parents go to work. Edin, today a professor at Princeton, and Shaefer, a professor at the University of Michigan, argued that this change destroyed a key part of the cash safety net, fueling the rise of $2-a-day destitution.
Were they right? Edin and Shaefer’s eye-popping results have always been somewhat controversial, with critics—typically on the political right—arguing that they’re mostly a statistical illusion created by bad data. The debate has gone on for years in the pages of academic journals and white papers, with Edin and Shaefer countering detractors and refining their findings along the way. This week, though, a group of rival researchers released the most serious challenge to their work yet. Their paper uses newly available data to essentially double-check and correct the government survey results Edin and Shaefer relied on. And after all of their adjustments, $2-a-day poverty nearly disappears.
It’s a convincing critique. But even as it makes a strong case that Edin and Shaefer’s most famous stats aren’t reliable, in some ways, the new study also supports the broader point of their book: The modern safety net seems to be fundamentally failing many families in America.
The U.S. Census Bureau considers a household to be living in “deep” poverty if their income drops below half the official poverty line—about $10,100 or less for a family of three last year. Edin and Shaefer chose to look at $2-a-day poverty in their research instead, because they sensed some families were suffering from even more crushing deprivation. Edin had spent her career studying the financial lives of the very poor—in the 1990s she carefully documented how mothers on welfare stayed afloat. While doing field work in 2010, she noticed that there seemed to be a growing number of shockingly impoverished households with no source of cash income at all. “These families weren’t just poor by American standards,” she and Shaefer wrote in their book. “They were the poorest of the poor.” The fact that they specifically lacked cash was important: Many parents might have received SNAP benefits, aka food stamps, that helped them put meals on the table. But that wouldn’t help them pay for gas to get to work or let them buy clothes for their children.
When Edin mentioned her observations to Shaefer, he decided to see if he could spot the trend of growing cashlessness in government data. But first, “he needed to determine what income threshold would capture people who were experiencing a level of destitution so deep as to be unthought-of in America.” He settled on $2 per person per day, the benchmark the World Bank had long used to delineate poverty in the developing world.
This did, in fact, lead to some previously unthinkable results. In their original paper, Edin and Shaefer found that in 1996, the year Clinton signed his welfare reform bill into law, 1.7 percent of U.S. households with children were subsisting on $2 per person per day or less of cash income for at least part of the year. By 2011, the share had swelled to 4.3 percent. Even when they counted government benefits that were similar to cash—meaning SNAP, housing assistance, and refundable tax credits—as income, the share of families experiencing $2-a-day poverty still rose, albeit more modestly, from 1.1 percent in 1996 to 1.6 percent in 2011. Cashless poverty really did seem to be on the rise.
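For readers who want to see the arithmetic, here is a minimal sketch, in Python, of the kind of threshold test described above. Everything in it (the record layout, the field names, the 30.4-day month, the toy household) is invented for illustration; it is not Edin and Shaefer’s actual SIPP analysis, which works with far messier survey data.

```python
# A hypothetical illustration of the $2-a-day test, not the authors' actual code.
from dataclasses import dataclass

THRESHOLD = 2.00        # dollars per person per day
DAYS_PER_MONTH = 30.4   # rough average; a real analysis would track calendar months

@dataclass
class HouseholdMonth:
    members: int
    cash_income: float         # monthly earnings plus any cash welfare
    near_cash_benefits: float  # monthly SNAP, housing assistance, refundable credits

def per_person_per_day(m: HouseholdMonth, include_near_cash: bool) -> float:
    total = m.cash_income + (m.near_cash_benefits if include_near_cash else 0.0)
    return total / (m.members * DAYS_PER_MONTH)

def two_dollar_poor(year: list[HouseholdMonth], include_near_cash: bool = False) -> bool:
    """True if the household fell below $2 per person per day in any month of the year."""
    return any(per_person_per_day(m, include_near_cash) < THRESHOLD for m in year)

# Toy example: three people, one month with $150 in cash income and $500 in SNAP.
month = HouseholdMonth(members=3, cash_income=150.0, near_cash_benefits=500.0)
print(two_dollar_poor([month]))                          # True: about $1.64 per person per day in cash
print(two_dollar_poor([month], include_near_cash=True))  # False: about $7.13 per person per day with SNAP counted
```

The two calls at the bottom mirror the two measures in the paragraph above: the same toy household counts as $2-a-day poor on cash income alone, but not once SNAP is treated as income.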
Now, these households were not experiencing the conditions of a Nigerian village or a Mumbai slum. But without dollars to spend, they were scraping the very bleak bottom of American life. In their book, published in 2015, Edin and Shaefer introduced readers to young adults who sold their own plasma to survive and flitted in and out of homelessness. “Rae moved in with her friend Danielle, herself a mother of three, and a group of other childhood friends who were sharing a house on a street where virtually every other property was burned-out,” they wrote of one single mother, who, by her 20s, had managed to lose all of her teeth. “When the water and power were shut off in that house and a woman was raped in the abandoned garage next door, she decided it wasn’t a safe place to raise a child.”
This all fed into an emerging narrative about how Clinton’s welfare reform may have hurt some of America’s most vulnerable. Studies suggested that while the legislation may have helped reduce overall poverty a bit by nudging many single mothers into the workforce, the decision to cut some families off from a monthly check seemed to have increased the number of Americans living in deep poverty. By showing that $2-a-day poverty rose after 1996, Edin and Shaefer’s research added to the evidence that ending welfare as we knew it had left more Americans in truly dire need.
Their work got attention. It also got pushback: Critics started questioning the data they were working from.
Most of our statistics on poverty come from surveys conducted by the Census Bureau. The kink in this system is that when the government asks, the very poorest Americans don’t always give accurate answers about their finances. They may forget that they receive SNAP benefits, or mix them up with another program. They might neglect to mention the extra money they make working odd jobs. The upshot of all this underreporting is that some people at the very bottom of the economic ladder may look worse off in the official numbers than they are in real life.
This is a well-known issue. Edin and Shaefer tried to address it in their initial research by basing their analysis on the Survey of Income and Program Participation, or SIPP, which appeared to have less of a problem with underreporting than other datasets. They also pointed to other signs of extreme hardship that seemed to buttress their results, such as the growing number of SNAP recipients reporting zero income, or the growing ranks of homeless children in public schools. More recently, they’ve produced an analysis showing that if you accounted for underreporting of benefits using commonly accepted techniques pioneered by the Urban Institute, $2-a-day poverty still seemed to have grown.
But critics have continued to suggest that their viral stat—that the share of families experiencing $2-a-day poverty grew by more than 150 percent after welfare reform—is really a product of the poor failing to report all of their income and government benefits. And the new paper makes the most persuasive case I’ve seen yet. It’s authored by a team led by Bruce Meyer, an economist at the University of Chicago’s Harris School of Public Policy. Meyer is an expert on poverty data who is known for his pioneering research on underreporting in federal surveys. Because of this focus, his work tends to be embraced by conservatives, including the Trump administration’s economic team, but he is widely respected in his field.
Here’s what makes Meyer’s new draft paper, produced with the Census Bureau, so attention-grabbing. It cross-checks the responses Americans gave when they participated in SIPP in 2011—the survey Edin and Shaefer relied on—against the government’s actual administrative records showing what those same people earned and the federal benefits they received. Essentially, Meyer and his colleagues use Social Security Administration and IRS data, among other sources, to fact-check people’s descriptions of their own financial lives.
What they find is that, for the most part, Americans who reported living on $2 a day of income or less seemed to have more resources at their disposal than they let on. Correcting their earnings to match the Social Security Administration’s records alone lifts 55 percent of them above the $2 level; it lifts 38 percent of them out of deep poverty; and it lifts a quarter of them out of poverty altogether. Add in Social Security benefits, other income reflected in their tax files, housing assistance, and SNAP, and nearly 80 percent make it out of $2-a-day poverty.

One frustrating thing about the paper is that, when it looks at families with children, it doesn’t define “income” the same way Shaefer and Edin did, which makes its results harder to compare directly. Edin and Shaefer reported two sets of figures: how many parents were $2-a-day poor counting only cash income, and how many were poor once food stamps and certain other government benefits were counted as well. Meyer and his team, by contrast, always count food stamps and other in-kind benefits as income. That choice makes it possible for them to reduce the $2-a-day poverty rate for parents to zero. Applying the same approach to all Americans, they conclude the real rate is 0.24 percent, or about 285,000 households, most of them made up of single, childless adults.

There is another important difference in how Meyer and his team define the poor in their study. Specifically, the researchers don’t consider any household to be in extreme poverty if it has significant assets, such as $5,000 in liquid savings or $25,000 in home equity. That’s not unreasonable—after all, it’s hard to argue someone is living on $2 a day if they have significant money in the bank—but it rules out an exact apples-to-apples comparison with Edin and Shaefer’s findings, which ignored savings.
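To make the mechanics of those corrections concrete, here is a hypothetical sketch, in Python, of a step-by-step reclassification in the spirit of what the new paper describes. The record layout, the annualized income figures, and the rule of taking the larger of the surveyed and administrative earnings numbers are simplifying assumptions of mine, not Meyer and his colleagues’ actual methods; only the $5,000 liquid-savings and $25,000 home-equity screens come from the description above.

```python
# A hypothetical reclassification cascade, invented for illustration.
from dataclasses import dataclass

THRESHOLD = 2.00      # dollars per person per day
DAYS_PER_YEAR = 365

@dataclass
class Household:
    members: int
    surveyed_cash_income: float  # annual cash income as reported to the survey
    admin_earnings: float        # annual earnings found in Social Security/IRS records
    admin_benefits: float        # annual benefits found in administrative records
    liquid_savings: float
    home_equity: float

def below_line(h: Household, annual_income: float) -> bool:
    return annual_income / (h.members * DAYS_PER_YEAR) < THRESHOLD

def classify(h: Household) -> str:
    """Report which correction, if any, lifts a surveyed $2-a-day household over the line."""
    if not below_line(h, h.surveyed_cash_income):
        return "not $2-a-day poor as surveyed"
    # Simplification: treat the larger of the surveyed and administrative figures as correct.
    corrected = max(h.surveyed_cash_income, h.admin_earnings)
    if not below_line(h, corrected):
        return "lifted by administrative earnings"
    if not below_line(h, corrected + h.admin_benefits):
        return "lifted by benefits in administrative records"
    if h.liquid_savings >= 5_000 or h.home_equity >= 25_000:
        return "screened out by assets"
    return "still below $2 a day after all corrections"

# Toy example: a household that reported almost no cash but shows $9,000 of earnings in tax records.
print(classify(Household(members=2, surveyed_cash_income=500.0, admin_earnings=9_000.0,
                         admin_benefits=0.0, liquid_savings=0.0, home_equity=0.0)))
# Prints "lifted by administrative earnings": $9,000 over 2 people and 365 days is about $12.33 a day.
```

Tallying those labels across many such records is, in broad strokes, how shares like the ones quoted above (55 percent lifted by earnings corrections, nearly 80 percent once benefits are added) would be produced, though the real exercise rests on careful record linkage rather than a tidy dataclass.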
Either way, the paper sharply undercuts the idea that $2-a-day poverty rose because of welfare reform. If there were virtually no $2-a-day poor families in 2011, their numbers couldn’t have increased much since 1996, because there just aren’t that many of them to begin with. (Meyer and his team ran a similar exercise using $4-a-day poverty as their benchmark and came to similar conclusions.) At the very least, the paper makes the case that the data Edin and Shaefer relied on had serious shortcomings that make it difficult to draw hard conclusions about the poorest of the poor. In their 2015 book, the two authors argued that the mere fact that a rising number of families were claiming to live without cash was a cause for concern. But given how much income government surveys seem to miss, it’s at best a guess whether that increase is a sign that the poor have less money or simply a sign that they’re reporting less of the money they have. (Meyer has shown that underreporting of government benefits has gotten significantly worse over time, but the trend is less clear with income.)
One upshot of all this is that Clinton’s welfare reform may have done less to harm the poorest of the poor than many commentators, myself included, previously thought. That’s not a definite conclusion: It’s possible that other studies showing that deep poverty rose after the law passed are still correct. But the evidence that the 1996 bill led to serious hardship is weaker than before.
Meyer’s new paper is consistent with a different story, supported by other recent research: As the old cash welfare program disappeared, other parts of the safety net, most importantly food stamps, have stepped in and helped keep truly extreme poverty from growing. That should be a cautionary tale for conservatives who would like to reform programs like food stamps by adding stricter limits and work requirements. The lesson here is that those programs are providing essential help.
It’s also important to keep in mind a caveat about Meyer’s paper. Just because $2-a-day poverty might be a statistical artifact, that does not mean there aren’t a large number of families in truly dire circumstances in this country. Meyer and his team themselves write that while their “paper demonstrates that the rate of extreme poverty in the United States is substantially lower than what has been reported, we do not contend that there is little deprivation in the United States.” Their paper also may miss some of the worst cases of need; it excludes homeless people, for instance, who don’t show up in government surveys. Meanwhile, the Department of Education believes 1.35 million public school children lack homes.
In some ways, it even confirms that Edin and Shaefer were on the right track when they argued that families without cash on hand are prone to a particular kind of suffering. Meyer still finds that the roughly 1 million households lifted out of extreme poverty by benefits like food stamps—a group that includes most parents reporting $2-a-day incomes—suffer material hardships at much, much higher rates than other poor Americans. They’re more likely to miss a rent payment or a bill, to have their power or phone service cut off, or to have difficulty getting enough to eat. In other words, there are many people who get some help from the government but still say they don’t have enough cash to live on. The safety net doesn’t seem to be doing enough to catch all of them—even if their suffering isn’t captured neatly in a viral stat.