Why Quitting Is Underrated

And grit is not always a virtue.

Siobhan O’Keeffe, one of tens of thousands of runners in the 2019 London Marathon, felt her ankle start to hurt four miles into the race. According to a news report at the time, she kept running despite the worsening pain. Four miles later, her fibula snapped. Medics bandaged her leg and advised her to quit, but O’Keeffe refused. She actually finished the marathon, running the last 18 miles in nearly unbearable pain and risking permanent injury.

Running 18 miles on a broken leg stretches the limits of believability. Even her orthopedic surgeon said as much. But what might be more unbelievable is that stories like hers are not uncommon. In fact, that same day, at the same point in the race, another runner, Steven Quayle, broke his foot. He, too, kept running, through pain so bad that he had to stop several times for medical assistance over the final 10 miles. But like O’Keeffe, he finished the race.

In professional poker—my former field—knowing when to quit is a survival skill that separates elite players from the rest of the pack. Yet, despite the obvious virtues of folding a bad hand, in most areas of life human beings tend to extol perseverance, so much so that a quick Google search turns up many other stories of distance runners around the world suffering horrifying injuries mid-race but refusing to give up. We look at these types of stories and think, I wish I had that kind of grit.

But is grit a virtue when we stay too long in bad relationships, bad jobs, and bad careers? Much of the commentary on the COVID-era Great Resignation seemed to judge the workers who were quitting in droves—as if millions of people were losers for walking away, during a global health crisis, from jobs that they didn’t want to do. Meanwhile, workers who are “quiet quitting”—that is, staying in a job they no longer like while doing the minimum necessary to hold on to it—get a sympathetic hearing in many quarters.

The misguided urge to persevere—even when that perseverance is half-hearted at best—isn’t restricted to individuals. Businesses stick with high-profile hires who aren’t working out and continue offering products that are clearly failing. Nations spend years, sometimes decades, throwing money and human life into unwinnable wars.

This is the downside of grit. Though grit can get you to stick to hard things that are worthwhile, grit can also get you to stick to hard things that just aren’t worth sticking to—such as the remainder of a marathon after your fibula snaps at mile eight.

In 2013, the economist Steven Levitt, a co-author of the best seller Freakonomics, put up a website inviting users who were agonizing over a decision to flip a virtual coin: heads meant make the change; tails meant keep the status quo. You might be skeptical that anyone would use such a tool to help them decide anything. But over the course of a year, more than 20,000 people actually did, including about 6,000 who were weighing a serious matter such as quitting their job, retiring from the workforce, or ending a relationship.

Levitt reasoned that, if these were truly such close calls that relying on a coin flip seemed like a good option, the people who stuck with the status quo were likely to be as happy as those who left their job or their partner.

But when he followed up with the coin flippers two and six months later, he found that the quitters were happier, on average, than those who persevered. While the decisions may have felt close to the people making them, they weren’t actually close at all. As judged by the participants’ happiness, quitting was the clear winner. That meant that people were getting to the decision too late, long after it had stopped being a close call.

Nearly half a century of scientific research has identified a host of cognitive forces that make us put off quitting. The best known is the sunk-cost fallacy, first identified as a general phenomenon by the economist Richard Thaler in 1980. It’s a systematic cognitive error in which people weigh the money, time, effort, or other resources they have already sunk into an endeavor when deciding whether to continue and spend more, throwing good money after bad. The fear of wasting what we’ve already put into something causes us to invest even more in a cause that’s no longer worthwhile. (Thaler later won a Nobel Prize for his research in behavioral economics.)

Another commonly known error that keeps people from quitting is status quo bias, introduced in 1988 by the economists Richard Zeckhauser and William Samuelson. When comparing two options, both individuals and companies overwhelmingly stick with the one representing the status quo, even when it is demonstrably inferior to the option representing change. An employer is more likely to keep a middling performer on the roster for too long than risk hiring a worse replacement. Likewise, an employee will stay at a miserable job because it’s the status quo, rather than quit to find a better one. We prefer the devil we know.

Decision makers in professional sports get a lot of continuous, quick, and clear feedback on player productivity. There are objective measures of player performance, and the data are constantly updated. Coaches and team management are highly motivated, both financially and by their own competitive drive, to field the best players and win. Yet even NBA owners and coaches stick with their own bad decisions.

In 1995, the social psychologists Barry M. Staw and Ha Hoang looked at the results of the NBA drafts from 1980 to 1986. They asked a simple question: Does a basketball player’s draft order—independent of their subsequent performance on the court—affect their playing time, likelihood of being traded, and career length?

Staw and Hoang concluded that “teams granted more playing time to their most highly drafted players and retained them longer, even after controlling for players’ on-court performance, injuries, trade status, and position played.”

As a competitive strategy, this makes no sense; a high draft pick who plays no better than a lower-round pick deserves no more time on the court. But this is where you can clearly see the effect of cognitive errors like the sunk-cost fallacy. Spending a high draft pick to acquire a player burns a valuable, limited resource. Benching or trading or releasing such a player, despite performance data justifying it, feels tantamount to wasting that resource, so those players get a lot more chances than players drafted lower who are playing as well or better.

These findings can’t be dismissed as a relic of the pre-Moneyball era. The economist Quinn Keefer has conducted several field studies since the mid-2010s on the effects of draft order and player compensation on playing time in the NFL and the NBA. Although the effect sizes were somewhat smaller than in the 1995 study, they were still significant.

If professional sports teams, with their armies of analysts and constant pressure to win, keep clinging to their own misjudgments, what’s happening in our everyday lives? Which relationships are we staying in too long? Why are runners finishing a race on a broken leg? Why are employees “quiet quitting” instead of just quitting?

We fear that when we quit we are admitting failure—that we have wasted our energy. But we need to start thinking about waste as a forward-looking problem, not a backward-looking one. That means realizing that spending another minute or another dollar on something that is no longer worthwhile is a far bigger waste than whatever we have already invested.

Contrary to popular belief, winners quit a lot. In fact, that’s how they win.


This article has been excerpted from Annie Duke’s new book, Quit: The Power of Knowing When to Walk Away.



Annie Duke, a former poker player, is the author of Quit: The Power of Knowing When to Walk Away.