Young Daredevils

Children are just as willing as teenagers to take risks.

[Photo: two boys climbing on a tree. Caption: Some studies have found that children take more risks than teenagers and adults. Credit: Fuse/Thinkstock]

When people learn that I study risk-taking in children and adolescents, they often want to tell me about the crazy things they did when they were teenagers. I’ve heard about speeding tickets for driving nearly 100 miles per hour, wild parties while parents were out of town, and several instances in which various objects were set on fire. So far, no one has felt compelled to share stories about dumb stuff they did when they were children.

The notion of adolescence as a time of peak risk-taking is deeply ingrained in our culture. Yet many research studies, including my own, have found that children are also quite willing to take risks. In fact, a recent meta-analysis of more than two dozen laboratory studies found that children between the ages of 5 and 10 take risks at rates indistinguishable from those of adolescents between the ages of 11 and 19.

Some studies have even found that children take more risks than teenagers and adults. In one roulette-inspired task, participants bet on a “Wheel of Fortune” that was 70 percent blue and 30 percent pink. If participants correctly bet that the wheel would land on blue, they won $1. If they correctly bet that the wheel would land on pink, they won $2. Researchers found that risk-taking decreased with age from 9 to 40: Children more strongly preferred to bet on pink, the outcome that was less likely but had the bigger potential payoff.
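To make the arithmetic concrete, here is a quick back-of-the-envelope check of the two bets (the probabilities and payoffs are the ones described above; the variable names are mine). Betting pink is not just the riskier choice, it is also the worse one on average:

```python
# Expected value of each bet on the "Wheel of Fortune" task:
# the wheel is 70% blue and 30% pink; a correct blue bet pays $1,
# a correct pink bet pays $2 (figures as described in the study).
p_blue, p_pink = 0.70, 0.30
payoff_blue, payoff_pink = 1.00, 2.00

ev_blue = p_blue * payoff_blue  # $0.70 per spin, on average
ev_pink = p_pink * payoff_pink  # $0.60 per spin, on average

print(f"blue: ${ev_blue:.2f}, pink: ${ev_pink:.2f}")
```

So a participant maximizing average winnings should always bet blue; preferring pink means accepting a lower expected payoff for a shot at the bigger prize.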

In another gambling study, participants tried to win as many points as possible by making choices between sure wins (a guaranteed four points) and 50–50 gambles (win five points or win three points). The average payoff of the 50–50 gambles was always held equal to the sure win, but the spread of the gamble varied. For instance, win five or win three has a smaller spread than win eight or win zero, but both pay out four points on average. Increasing the spread decreased gambling in adults and teenagers: They were okay with gambling when they had an equal chance of winning five or three, but they preferred a sure four to gambling on eight or zero. Children, in contrast, increased their gambling rates as the spread of the gamble increased: They moderately liked gambling on five or three, but they really liked gambling on eight or zero. In other words, children were attracted to gambling on the possibility of bigger wins, while adults and teenagers were put off by the possibility of bigger losses.
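The key manipulation here is that every gamble has the same average payoff as the sure win; only the spread changes. A minimal sketch of that setup (the point values come from the study described above; the helper names are mine):

```python
# Sure win vs. two 50-50 gambles with equal average payoff but
# different spreads, as in the gambling study described above.
sure_win = 4
narrow_gamble = (5, 3)  # win 5 or win 3
wide_gamble = (8, 0)    # win 8 or win 0

def expected_value(gamble):
    """Average payoff of a 50-50 gamble."""
    return sum(gamble) / 2

def spread(gamble):
    """Gap between the best and worst outcome."""
    return max(gamble) - min(gamble)

for g in (narrow_gamble, wide_gamble):
    print(g, "average:", expected_value(g), "spread:", spread(g))
```

Both gambles average exactly four points, matching the sure win, so a purely average-maximizing player should be indifferent; what separated children from teenagers and adults was how they reacted as the spread grew from 2 to 8.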

In my own work, I have found that children are also surprisingly willing to take chances on the unknown. When offered the choice between a gamble with known probabilities (such as a 50 percent chance of winning $10) or a gamble with unknown probabilities (such as some unknown chance of winning $10), the majority of adults prefer the gamble that tells them their exact chance of winning. Adults’ aversion to taking ambiguous, unknown gambles has been repeatedly found in hundreds of studies. Yet when I offered 8- and 9-year-olds the same choice between gambles with known and unknown probabilities of winning, the children showed no discernible preference for either case. As Donald Rumsfeld might say, adults would rather face known unknowns than unknown unknowns, but children don’t mind the unknown unknowns.

These findings raise the question of why teenagers are known for their risk-taking, if children have similar or sometimes even greater propensities for risk. Unfortunately for researchers, behavior in the laboratory can be different from behavior in everyday life. Laboratory decision-making tasks are usually performed alone, while the real-world risky behaviors that teenagers are known for tend to happen in social settings. Adolescent binge drinking is generally a social activity, and unprotected sex requires at least one co-conspirator. One influential study found that teenagers’ behavior is especially sensitive to changes in social context. Compared to adults, teenagers ran more yellow lights and got into more car crashes during a simulated driving game—but only if their friends were watching them. When teenagers played the driving game alone, they took no more risks than adults did. In the real world, since graduated driver licensing laws have restricted how many passengers teen drivers can carry, the number of fatal teen car accidents has fallen.

Another key factor is likely the difference in access to dangerous activities. The free-range parenting movement notwithstanding, children today are, in many cases, more supervised than ever. Thus, children generally have fewer opportunities to get into trouble. As a recent New York Times Magazine article highlighted, however, there are plenty of children who will happily hop on motorcycles, clamber up cliffs, or skateboard off a nine-story MegaRamp if their parents let them participate in such extreme sports. And when children are left unsupervised, they are quite capable of taking risks, as evidenced by tragically familiar headlines about children taking the family car for a joyride or children who decide to play with guns, with devastating consequences.

It is important to note that taking risks is not always a bad thing. As researchers, we often jump to risk-taking’s most dire consequences to justify the value of our work, but the world would be quite static and stifling if everyone always played it safe. Sure, children who skateboard off giant ramps probably suffer more broken bones than children who stick to Tony Hawk video games. But children who ride real skateboards also learn to test their limits, conquer their fears, and pick themselves up after falling down. When children take risks, they experience both the potential rewards and the consequences and, in the process, learn about themselves and the world.