The Associated Press reported on Tuesday that Cass Sunstein, the legal scholar and former White House regulatory czar, is writing a book about Star Wars. Details are scarce: The article notes only that the book will be an “exploration” of Star Wars and that “Sunstein will touch upon everything from history to politics to fatherhood.” But looking at Sunstein’s interests, as well as some of the most Sunstein-esque mysteries from the original trilogy—since everyone knows the second trilogy never happened—offers some hints at what could be in there, or what should be in there, at least.
Sunstein, a law professor at Harvard, has written articles and books on just about everything during his prolific academic career. During his time in the White House and in published work from the last few years, though, he’s taken a keen interest in the insights of behavioral economics, a field concerned with better understanding human decision-making and the biases that can lead it to unfortunate results. Sunstein is a big proponent of “nudges”—unobtrusive, behavioral-econ-informed interventions that can help encourage people to make better decisions without forcing the issue (putting the desserts in a slightly harder-to-reach place in a cafeteria, for example, and laying the fruit out in front of them)—and he brought this enthusiasm with him to the White House. In books like Nudge: Improving Decisions About Health, Wealth, and Happiness, which he co-authored with the pioneering behavioral economist Richard Thaler, he’s dug deep into the science behind these issues. Sunstein is also very interested in the related question of how governments and other large organizations can function better, more efficiently, and with a smarter approach to cost-benefit analyses, subjects he’s tackled in Simpler: The Future of Government and Wiser: Getting Beyond Groupthink to Make Groups Smarter.
To me, all of this points in a clear direction. What’s the one big government entity in the original Star Wars trilogy? The Empire. And does the Empire seem to fall into some potentially preventable traps of poor decision-making? Yes, indeed! I’d argue, then, that Sunstein should look at the following three questions from the original trilogy.
1. Did the sunk-cost fallacy lead the Empire to wrongly double down on Death Star technology?
We all know the basic story line: In the first entry in the trilogy, Luke’s “one-in-a-million” shot destroys the Death Star, the Empire’s giant, moonlike “orbital battle station.” In Return of the Jedi, it’s revealed the Empire is working on constructing a second Death Star … only to have the Rebels destroy that one, too—Lando Calrissian gets the kill-shot this time—in the film’s climactic moment.
Clearly, something went wrong here. Given the size and scale of the Death Stars, the Empire spent untold trillions of credits on not one but two massively complex pieces of military hardware that weren’t up to the task of crushing the rebellion, both of them suffering from critical design flaws that left them vulnerable to even a much smaller, under-supplied adversary. The results were catastrophic, leading not only to embarrassing, strategically disastrous military defeats, but to significant loss of life among those who staffed both battle stations (including the numerous independent contractors likely onboard the second time around).
How to explain such a colossal, galactic screw-up? Sunstein might want to view this scandal through the lens of what’s known as the sunk-cost fallacy, a key theory in economics that behavioral economics has helped shed a lot of light on. The basic argument is that once money (or any other resource) is gone, it’s gone—we shouldn’t let it affect our future decision-making. And yet, due to a quirk of human psychology, we tend to do just that. Let’s say you spend $100 on a ticket for a concert you can only get to by driving, for example, but a terrible snowstorm hits the day of the show. All things being equal, you’ll be more likely to brave the weather than you would be if the ticket had been gifted to you. But the money’s gone either way and shouldn’t factor into your evaluation of the risk of skidding off the interstate. The sunk-cost fallacy comes up in policy-making arguments all the time, perhaps most frequently in military concerns: You’ll frequently see people argue that because of the amount that has already been spent on a given fighter jet in development, or the lives already lost in a given war, it would be wrong to withdraw from these endeavors.
So if the Empire spent some ungodly sum on the technology undergirding the first Death Star, only to find it blown into a million pieces by Skywalker, it’s hard not to imagine what would happen in the subsequent boardroom meetings: There would be huge pressure to stay the course on this whole Death Star thing, despite the fact that the project had, in the very meaningful sense of “not existing anymore because it got blown up,” failed entirely.
Heading in another direction, in other words, would be a very reasonable option. But try to picture a mid-level bureaucrat—one who really only wants what’s best for the Empire—piping up during one of these meetings: “I’m not trying to upset the apple cart here, guys,” he might say, “but are we sure we want to build another Death Star? Have we even fully costed this out? By my back-of-the-envelope calculations, for the same number of credits as building Death Star 2.0, we could build an entire new fleet of Star Destroyers.” The response would be furious: “So you’re saying we should just turn our backs on a project that already cost us trillions of credits? Puh-lease.”
To which the correct, Sunstein-informed answer might very well be: “Yes, that’s what I’m saying.” But it’s a hard case to make because of the powerful sway of the sunk-cost fallacy.
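The bureaucrat’s logic can be boiled down to a toy decision rule. Everything in this sketch is invented for illustration—the option names, the credit figures, the trillion-credit sunk cost—but it captures the point: a rational chooser compares only future costs and benefits, so the money already spent on Death Star 1.0 never enters the calculation.

```python
# Toy illustration of ignoring sunk costs. All figures (in credits,
# arbitrary units) are invented for this example.

def best_option(options, sunk_cost=0):
    """Pick the option with the highest future value minus future cost.
    sunk_cost is accepted but deliberately ignored -- that's the whole
    lesson of the sunk-cost fallacy."""
    return max(options, key=lambda o: o["value"] - o["future_cost"])

options = [
    {"name": "Death Star 2.0", "future_cost": 900, "value": 1000},
    {"name": "Star Destroyer fleet", "future_cost": 400, "value": 800},
]

# Whether the first Death Star cost nothing or a trillion credits,
# the rational choice is the same:
print(best_option(options, sunk_cost=0)["name"])       # → Star Destroyer fleet
print(best_option(options, sunk_cost=10**12)["name"])  # → Star Destroyer fleet
```

The fallacy, in code terms, is sneaking `sunk_cost` back into the comparison—which is exactly what the furious boardroom response amounts to.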
2. Does the threat of a Force-choke make Imperial employees more or less effective?
Behavioral economists are big into the idea that our decisions are shaped by forces largely invisible to us, that all too often we accept the status quo of a given office or school’s established procedures and rules without investigating the subtle ways those procedures and rules might lead to subpar outcomes. One example of this that could be highly pertinent to the Empire is Force-choking.
As anyone who has seen the original trilogy knows, Darth Vader was not shy about Force-choking Imperial soldiers and employees who screwed up on the job. To take likely the best-known example, he killed Admiral Kendal Ozzel after Ozzel committed a strategic blunder that complicated Vader’s plans in the Hoth system. Surely, news of this got around, and surely, given the frequency with which Vader Force-choked his underlings, it’s hard to imagine that this threat didn’t affect the way the Imperials under Vader went about their work.
But was it an effective workplace tactic? There’s some reason to think it might not have been. As any behavioral economist will tell you, humans are naturally loss-averse—we focus more on potential losses than gains. It’s hard to imagine a bigger loss than dying as a result of a Force-choke administered in front of your helpless friends and co-workers, so there’s a possibility that this fear could have rippled through the organization, leading Imperial employees to become less daring and more risk-averse in general—focusing less on doing a good job, in other words, and more on not doing a bad job. In an organization where there’s stifling incentive to not be the fall guy, everyone will be racing to blend in and keep their heads down rather than offer up innovative ideas on how to crush the latest Rebel uprising or deploy the Conqueror more ingeniously.
This is just one possibility, though. As Sunstein would undoubtedly point out, the only real way to gauge the effectiveness of Force-choking punishments on the broader organization would be to run what’s known as a randomized controlled trial (RCT)—this is seen as the gold-standard way to test whether a given behavioral intervention actually works, and Sunstein is a fan.
Here’s one possibility: Take two groups of Imperial employees whose day-to-day work is similar, and tell one that for the next six months no one will be getting Force-choked—rather, if they screw up they’ll get a reprimand or some other form of milder, non-life-threatening punishment. The other group is the control group: For them, it’s business as usual, Force-chokes and all. Over those six months, track both groups’ performances and test for statistically significant differences, and maybe pull them into a conference room once or twice to ask them to come up with new ideas, and voilà—you have evidence (albeit not dispositive evidence, since no single RCT can tell the whole story) about Force-chokes’ effects on employees’ performance, both in terms of their day-to-day effectiveness and their ability to come up with new ideas. (Note: An experiment like this would not get past a real-world ethics board, but for the sake of this exercise we can assume the Empire takes a more laissez-faire approach to human research.)
3. What led to the Empire’s disastrously overconfident approach to the Forest Moon of Endor and its native Ewoks?
The most important domino to fall in the run-up to the destruction of the second Death Star took place on the Forest Moon of Endor, where a small group of Rebels teamed up with Ewoks to fend off Imperial forces and blow up the shield generator that protected the Death Star. After that, all that separated the Rebels from a giant celebratory jungle party with their furry friends was some of Lando’s sharpshooting.
Whereas the organizational failures that led to the construction of the second Death Star in the first place may well have been, as I’ve argued, of the opaquely bureaucratic variety, here the Empire’s hubris is on full display. Given the vital importance of protecting the Death Star, why would the Empire have built a base with an easily accessible back door that granted access to the shield generator (a weakness easily exploited by C-3PO)? Why would it have left that base so lightly guarded? Why would it not have sought a better understanding of the Ewoks’ considerable insurgent capabilities, especially given that they enjoyed the advantage of familiar, densely forested terrain?
Examined retrospectively, it all seems insane. But as students of organizational failure like Sunstein know all too well, these sorts of bungle cascades can occur quite easily in institutional settings because of another unfortunate quirk of human decision-making, one specifically cited by Sunstein and Thaler in Nudge: an “optimism and overconfidence” heuristic. That is, all things being equal, we overestimate our abilities and are more confident than we should be that we’ll pass that exam or get that promotion or protect that shield generator.
What likely happened, and what Sunstein will hopefully investigate in his book, is that the few dissenters who pointed out the security vulnerabilities of the Endor-moon setup were shouted down by louder, more confident voices. We’re drawn to confident figures more than to worrywarts and those who readily embrace ambiguity, and in institutional settings groupthink only exacerbates the problem: As Sunstein notes in Wiser, which he co-authored with Reid Hastie, group members “show more unrealistic overconfidence” than do individuals acting and thinking alone. So the Empire likely failed to protect the shield generator because everyone knew—or everyone whose voice really mattered knew—that it was completely safe on the Forest Moon of Endor with a small regiment of Stormtroopers attached to it. Those with doubts had little incentive to pipe up given the optimistic folks sitting to their left and right (though it’s also possible that this was a case of so-called pluralistic ignorance—everyone thought everyone else was confident and conformed their own publicly expressed views accordingly, when in reality almost everyone knew the plan was dumb but didn’t want to stick their necks out).
What’s particularly tragic about this incident—if you side with the Empire, at least—is how easily it could have been prevented. There’s a decent base of research suggesting that a simple task can go a long way toward defusing the worst excesses of our overconfidence and groupthink: Ask people to sit down and specifically discuss all the ways a given initiative could go wrong. All Vader or Emperor Palpatine had to do was take a group of their brightest minds, sit them in a room, and say, “Look, guys—we know it’s unlikely the Rebels are gonna swoop in and knock out the shield generator. But for the sake of argument, take the next two hours and come up with as many scenarios as you can imagine in which it does happen. Really think outside the box. There are no wrong answers, and regardless of what you come up with, no one will be Force-choked.”
This one simple exercise could have given rise to a better understanding of the second Death Star’s vulnerabilities. Alas, there was no equivalent to Cass Sunstein on the Imperial payroll.