Future Tense

What’s Stopping Human Capital From Becoming a Security?

A response to Mark Stasenko’s short story “Overvalued.”

Illustration by Doris Liou: two ceramic heads, one with money in its coin slot, the other cracked open with a hammer, revealing dollar bills.

An investor and writer responds to Mark Stasenko’s “Overvalued.”

During the great financial crisis of 2008–09, millions of people became painfully aware of financial derivatives, which had until then occupied a highly lucrative but obscure corner of finance. Starting in the 1990s and expanding exponentially in the early 2000s, derivatives were supposed to decrease risk by allowing traders to buy and sell the future movements not just of stocks but also of mortgages, commodity contracts, bonds, and just about anything that could be traded on electronic exchanges. Instead of spreading out risk and making markets less volatile, however, the explosion of derivatives had the opposite effect, turning them into “financial weapons of mass destruction,” as Warren Buffett famously called them in 2002. In the worst moments 10 years ago, it seemed as if his warning would prove all too true.

Since then, financial derivatives have once again receded from public consciousness, but they certainly haven’t gone away. As the number of derivatives has proliferated, so too have software programs that dictate trades. The massive increases in computing power and now artificial intelligence mean that on any given day, algorithms are trading with algorithms more than humans are trading with humans. New electronic exchanges have popped up to facilitate both the volume and speed that these derivatives and algorithms demand.

With financial markets once again roiling in recent weeks, it is perhaps an opportune moment to ask whether the calm of the past few years has lulled us into complacency. At the height of the financial crisis and for several years after, many investors and companies paid close attention to how technology might be distorting markets and making them more vulnerable to manipulation. Years of steady gains shunted those questions to the back burner, but it’s time to focus on them again. Perhaps the future holds not just more computer-driven flux but new products that will allow people to trade and sell derivatives not just of financial instruments but of various aspects of life as well.

That’s the specter raised by Mark Stasenko’s macabre short story of a not-too-distant future in which the potential of an individual has been turned into a tradeable security via a Prodigy Market in which investors can buy, sell, or short promising people.

Elements of the story are already real. Insurance companies have long insured vital aspects of individual talent and worth—Lloyd’s of London famously insured Betty Grable’s legs and Bruce Springsteen’s voice. Enron (remember it?) briefly became a multibillion-dollar company by cornering energy markets and aggressively trading derivatives while attempting to rig future prices to its advantage. In the past few years, a number of Silicon Valley startups have formed to invest in the loans of promising graduates and undergraduates, including SoFi (which began from the premise that a Stanford student is a better credit risk than the general pool of students and so should have a higher credit rating, with a different risk profile and price) and Upstart (which assigns better credit scores to students from better schools). A slew of other startups have attempted similar forms of financing, pooling assets to invest in select baskets of individuals by school or career. Most have failed, but if past is prologue, they are harbingers of things to come.

And so the conceit behind “Overvalued” is hardly science fiction. If anything, we are closer than we think to a world where human capital becomes a security to sell, package, and even short. Already, sports betting odds shift dynamically as promising recruits sign and as those prospects get injured. It would hardly be a great leap to extend that principle to promising law students or nascent MBAs. Kickstarter already allows individuals to invest in other people and their dreams, so how much of a leap would it be to invest in someone’s future earning potential and then “price” people in real time based on their grades, test scores, or performance evaluations?

Stasenko’s dystopian vision of a Prodigy Market is, in fact, akin to what the Chinese government is currently attempting to institute on a mass scale, with every citizen assigned a “social score” that will determine everything from credit lines to job interviews to travel privileges. The metrics a state might use to ensure conformity and continued control differ somewhat from those a hedge fund might use to gauge profitability, but they have more in common than not. Both are made possible by the vast amounts of data each of us leaves about our daily lives on social media or as a byproduct of electronic transactions, from banking to Amazon purchases to booking travel and paying bills. We—all of us who have smartphones and bank accounts—leave a data trail that easily translates into a score, one that could be used for a range of purposes from the benign (points and perks) to the alarming (having our scores bought, sold, and driven lower by financial intermediaries).

But while we are closer than we might think, we are still further than we fear. For one, social norms are not there yet. We might be willing to slice and dice all sorts of financial instruments and drive businesses to failure in order to make a buck by betting against them, but we are not there yet with human lives, at least not explicitly. China’s social score is getting lots of attention, but there remains a considerable gap between what the government might dream of imposing and what it can actually do. Those gaps, between what is possible and what is desirable, between what is acceptable and what is not yet, matter more than the technologies that shrink them.

The fear today is that technology is soon to dehumanize us and rob us of agency. That has been the fear of new technologies for generations, and this time, it may finally come true. Yet until it has, it hasn’t, and for now and for a considerable time ahead, there are apparently some lines that most humans will not cross even if they can.