The Supreme Court heard oral arguments this week in Roper v. Simmons, a juvenile death-penalty case in which scientific evidence highlighting differences between adolescent and adult brains may play a decisive role. As the court considers whether Christopher Simmons—who, in 1993 at the age of 17, abducted and killed 46-year-old Shirley Crook—should receive the death penalty, numerous medical and mental health associations, including the American Medical Association, the American Psychiatric Association, and the National Association of Social Workers, have filed a brief on behalf of Simmons suggesting that older adolescents may not yet have the ability to exercise adult impulse control because their brains have not fully matured. To put 16- or 17-year-olds to death, the brief argues, would be “to hold them accountable … for the immaturity of their neural anatomy and psychological development.”
This claim, based largely on research conducted since Simmons’ original conviction, has been widely discussed—with little skepticism—in the popular press, making it difficult to assess its significance. What exactly are the differences between adolescent and adult brains? What neurological evidence has been marshaled on behalf of Simmons—and what are the limitations of this evidence?
First, let’s consider how the brain matures during adolescence. While significant growth occurs early on—the brain reaches 90 percent of its adult size by the age of 6—a second wave takes place in the years before puberty. During this time, gray matter—areas of the brain responsible for processing information and storing memories—increases in size, particularly in the frontal lobe of the brain, as a result of an increase in the number of synaptic connections between nerve cells. Around puberty, however, a winnowing process begins in which connections that are not used or reinforced begin to wither (hence the “use-it-or-lose-it” hypothesis). This pruning, which begins around age 11 in girls and 12 in boys, continues into the early or mid-20s, particularly in the prefrontal cortex, an area associated with “higher” functions such as planning, reasoning, judgment, and impulse control. As Dr. Jay Giedd of the National Institute of Mental Health has said, the real cognitive advances come with paring down or reducing the number of synaptic connections. During adolescence, the amount of myelin, a fatty, insulating material that coats the axons of nerve cells—similar to the way insulation coats a wire—also increases, improving the nerve cells’ ability to conduct electrical signals and to function efficiently; this too continues into adulthood and occurs later in “higher” regions of the brain, such as the prefrontal cortex.
The most detailed evidence for these maturation processes comes from magnetic resonance imaging studies, which underscore the changing arrangement of gray matter and extent of myelination in adolescents and adults. Such research makes the case, for instance, that marked differences do exist between the brain of a 13-year-old and that of a 25-year-old. (A widely cited series of studies comes from Elizabeth Sowell, a UCLA professor of neurology specializing in neuroimaging.) But while the legal system draws specific (and often arbitrary) age distinctions, Sowell’s studies and others like them explore average features among groups of different ages (for instance, a 12- to 16-year-old cohort and a 23- to 30-year-old cohort). Such studies are not designed to parse finer age-based distinctions—they do not differentiate between the neural maturity of a 17-year-old and, say, an 18- or a 19-year-old. In fact, most scientific work reflects the reality that the transition from adolescence to adulthood is a gradual process. (While a great deal of development occurs by early adulthood, the brain’s total myelination may not actually reach its maximum until roughly age 45.)
Another key argument advanced by the AMA and others on behalf of Simmons is that adolescents rely more heavily on the amygdala—an evolutionarily older area of the brain associated with “primitive impulses of aggression, anger and fear”—than adults do. This proposition seems logical, since the prefrontal cortex, which interacts with and in effect “reins in” the amygdala to temper impulsiveness and gut reaction with reasoning, does not fully develop until the early 20s.
However, at least one critical piece of research—cited by the AMA, the American Bar Association, and numerous media outlets—is considerably weaker (or at least less relevant) than it at first appears. The work is by Deborah Yurgelun-Todd, director of cognitive neuroimaging at McLean Hospital in Massachusetts, who asked groups of adolescents and adults to view a series of black-and-white images of fearful faces. Using functional MRI, a technique that maps areas of brain activation, she observed that when studying the images, adults displayed more activity in the prefrontal region, while in adolescents the amygdala lit up more noticeably. Research subjects were later asked to describe what emotions the photographs had conveyed, and adults, far more than adolescents, correctly identified fear. Of her findings, Dr. Yurgelun-Todd told PBS’s Frontline: “[W]ith emotional information, the teenager’s brain may be responding with more of a gut reaction than an executive or thinking kind of response. And if that’s the case … you’ll have more of an impulsive behavioral response.”
On closer examination, however, Dr. Yurgelun-Todd’s work does not entirely substantiate this point. As another brain expert, Dartmouth professor Abigail Baird, told me, teens tend not to be engaged by black-and-white photos from a different era (in this case, the ’70s), and so their prefrontal cortices are not as active. When Baird performed a similar experiment using contemporary color photographs of younger people, the adolescents’ “frontal lobes [went] bananas.” In other words, she said, they were “able to be more analytical when they [cared].”
Perhaps more important, Baird stressed distinctions between various parts of the prefrontal cortex: The areas most activated by the photographs of faces, she said, are located near the upper midline and upper outside parts of the head, near the temple (in neurospeak, the medial dorsal and dorsolateral regions); the area most closely associated with impulse control, on the other hand, is found directly over the eyes (it’s called the orbital region). Baird argued that photographic studies, while useful for showing how adolescents read faces, were “not really relevant to the question of impulse control” because they involved an area of the frontal lobe with a different function. (She also added that at least in her studies, the activation of the amygdala did not appear to differ at all between adolescents and adults.)
So what does this mean for the case of Roper v. Simmons? Ultimately, there are a host of reasons, which are beyond the scope of this article, not to execute a convicted juvenile, just as there are probably good reasons to eliminate the death penalty entirely. (Indeed, using brain-imaging techniques, it may be possible to show that many people who commit violent crimes have aberrant prefrontal cortical activity, or other brain features that differ from those of the average adult.) Still, in weighing the arguments in Roper, the court should not rely primarily on the neuroscience presented for the defense. Most of it is solid, reputable work, and some of it is relevant to the case. But overall, the research can’t make the fine, age-related distinctions that the court must consider; nor does it offer as clear or simple a message about adolescent impulse control as the popular press would have us believe. As Dr. Sowell told Science in May (before deciding not to speak to the media): “The scientific data aren’t ready to be used by the judicial system. The hardest thing … is to bring brain research into real-life contexts.” A decision to spare an immature defendant, then, should not turn on science that is itself not wholly mature.