On a typical day in 1844, the average adult Irishman ate about 13 pounds of potatoes. At five potatoes to the pound, that’s 65 potatoes a day. The average for all men, women, and children was a more modest 9 pounds, or 45 potatoes. If you want to understand the devastation wrought by the notorious fungus Phytophthora infestans, you must begin with those astonishing numbers—numbers that led one 19th-century traveler to observe that “the Englishman would find considerable difficulty in stowing away in his stomach this enormous quantity of vegetable food, and how an Irishman is able to manage it is beyond my ability to explain.”
The fungus arrived mysteriously in the fall of 1845. Within a year, potato output had fallen by half, and the newspapers were filled with accounts of gruesome starvation. In another year, output fell by another 80 percent and starvation was no longer newsworthy. By the end of the decade, roughly a million—maybe 12 percent of the population—had perished.
Tragically, the effects of the Great Blight were multiplied by a great (and entirely understandable) human error: It was widely assumed to be temporary. Throughout the first year, even as hunger became rampant, farmers continued to plant potatoes in anticipation of a better harvest. As it turned out, planting potatoes was tantamount to throwing them away: Although the 1845 harvest was half the norm, Irish farmers planted the same quantity of potatoes in 1846 as in 1845, a planting that consumed 30 percent of the halved 1845 crop instead of the usual 15 percent of a full harvest. Had the Irish eaten more of that seed crop, some of the 1846 starvation could have been alleviated. And the starvation of later years could have been alleviated had farmers begun switching to other crops.
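The 15 percent and 30 percent figures follow from simple arithmetic, sketched below with illustrative round numbers (a “normal” crop of 100 units, of which 15 are replanted); the quantities are hypothetical stand-ins, not historical data.

```python
# Illustrative arithmetic behind the 15% -> 30% planting figures.
# All quantities are made-up round numbers, not historical data.

normal_crop = 100.0        # a full harvest, in arbitrary units
seed = 0.15 * normal_crop  # planting normally consumes 15% of a full crop

blighted_crop = 0.5 * normal_crop  # the 1845 harvest was half the norm

# Farmers planted the same absolute quantity of seed in 1846 as in 1845,
# so that fixed quantity was twice as large a share of the smaller crop:
print(f"Seed as share of a normal crop: {seed / normal_crop:.0%}")    # 15%
print(f"Seed as share of the 1845 crop: {seed / blighted_crop:.0%}")  # 30%
```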
The Irish potato famine teaches us the importance of distinguishing between temporary and permanent catastrophes. These lessons are underscored by the Great Famine research of University of Chicago professor Sherwin Rosen. But Rosen’s work also demolishes one of the great fables of the economics classroom, passed down by generations of professors to their students. Here’s how the fable goes: Imagine a 19th-century Irishman who eats lots of potatoes and a little meat. As potatoes become scarcer and their price starts to rise, our Irishman’s budget is strained to the point where he cuts back on meat and demands even more potatoes. The heightened demand pushes the price up further, and we’re off on a vicious circle.
That’s a great parable for the economics classroom because it illustrates that contrary to most people’s expectations, rising prices can in principle lead to more demand, not less. (In the jargon of the profession, goods that behave that way are called “Giffen goods.”) The problem with the parable, Rosen points out, is that while it makes perfect logical sense as a possibility, that’s not the way things happened. In 19th-century Ireland, as in every other time and place we know of, rising prices dampened demand.
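The logic of the parable can be made concrete with a toy subsistence model (every price, budget, and calorie figure below is an illustrative invention, not Rosen’s data or an actual famine price): a consumer must reach a calorie floor, prefers meat, and fills the gap with cheap potato calories. When the potato price rises, less of the budget can go to meat, and more potatoes are needed to reach the floor, so measured potato demand rises with the price.

```python
# Toy Giffen-good sketch: a subsistence consumer with a calorie floor.
# All numbers are hypothetical illustrative values.

def demand(p_potato, p_meat=5.0, budget=20.0,
           kcal_needed=6000.0, kcal_potato=400.0, kcal_meat=1000.0):
    """Return (potatoes, meat) purchased, in pounds.

    The consumer buys as much meat as the budget allows while still
    meeting the calorie floor, then spends the rest on potatoes.
    Assumes potatoes are the cheaper source of calories.
    """
    # Largest meat quantity consistent with both the budget constraint
    # and the calorie floor (derived by solving the two equations):
    meat = ((kcal_potato * budget - kcal_needed * p_potato)
            / (kcal_potato * p_meat - kcal_meat * p_potato))
    meat = max(0.0, meat)
    potatoes = (budget - p_meat * meat) / p_potato
    return potatoes, meat

cheap = demand(p_potato=1.0)  # about (10.0, 2.0): 10 lb potatoes, 2 lb meat
dear = demand(p_potato=1.2)   # about (12.5, 1.0): price up, MORE potatoes
```

The Giffen behavior shows up in the comparison: raising the potato price from 1.0 to 1.2 pushes potato purchases from 10 to 12.5 pounds, because the calorie floor forces the consumer to substitute away from meat. Rosen’s point, of course, is that however tidy this logic is, the famine-era data don’t show it happening.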
Rosen’s observation isn’t new. When telling this Great Famine story in classrooms over the last 20 years, I’ve always expressed the caveat that it probably isn’t true—although there are important theoretical lessons to be learned from the fact that it could have been true. But Rosen’s new empirical evidence pretty much removes all doubt.
If there’s another economic lesson to be learned from the Great Famine, it’s that it pays to diversify. Near-total reliance on a single crop—whether for production or for consumption or, as in the Irish case, for both—invites near-total disaster. The likelihood of that disaster might be small (and surely nothing in prior Irish experience suggested the possibility of anything like Phytophthora infestans), but its potential magnitude makes it worth planning for.
Of course, nowadays we’ve learned that lesson. We all know that we should avert disaster through diversification—right? Well, maybe. But plenty of modern Americans choose to invest in the companies they work for—so that a single downturn in that one company’s fortunes can cost them both their savings and their jobs. And plenty of others invest in companies located in their own hometowns—so that a localized recession can hit both their stock portfolios and their real-estate values simultaneously.
Of course no modern American with a stock portfolio lives as close to the edge as a 19th-century Irish potato farmer, but still the past reaches out to teach us new ways of seeing the present. According to last year’s Nobel laureate James Heckman (together with his students James Cawley, Lance Lochner, and Ed Vytlacil), American workers in the 1980s appear to have made the same mistake as Irish farmers in the 1840s—they thought a permanent change was only temporary and therefore responded inappropriately.
The permanent change that began in the 1980s was a widening wage gap between skilled and unskilled workers. Many skilled workers, thinking the change was temporary, saw a brief window of opportunity to earn premium wages and threw themselves into their work. At the same time, many unskilled workers, making exactly the same mistake, figured that if they were ever going to take some time off, they’d better hurry to do it before their wages went back up. As a result, the gap in earnings widened even further.
The analogy is imperfect: Nineteenth-century Irish potato farmers thought the famine was temporary and therefore underreacted (continuing to plant as before), while 1980s American manufacturing workers thought the widening wage gap was temporary and therefore overreacted. In both cases, inappropriate reactions magnified the effects of the initial surprise.
One moral is that it’s better not to make mistakes than to make mistakes. But of course we already knew that. The deeper moral is that one particular kind of mistake—a confusion between what is temporary and what is permanent—can explain a lot of human history.