No more plopping preschoolers in front of videos to “zone out.” With the emergence of touchscreen tablets and e-readers, screen time has become interactive—and thus less guilt-inducing for parents who need a short break. Every purposeful swipe of our children’s fingers seems to offer a reassuring signal that their minds are at work, contemplating what to do next.
Are interactive media really better, though? As Screen-Free Week carries on, and as electronic games for young children flood the market—72 percent of iTunes’ top-selling “education” apps are designed for preschoolers and elementary school children, according to a recent report from the Joan Ganz Cooney Center at Sesame Workshop—we need to find out.
Few studies looking specifically at tablets and touchscreen phones have been released yet. But research from Georgetown University, published in 2010, can give us a hint of whether interactive screen time has any potential for teaching children as young as 2½ years old. The short answer is yes, but the caveats are many.
The experiment at Georgetown engaged children in an electronic game that required no joysticks, no mouse work—nothing more than a single touch. Researchers randomly assigned children, ages 30 to 36 months, to one of three groups. Each was treated to a different version of a show that took place in a laundry room, where puppets would pop out from baskets or from behind pajamas hanging on the clothesline. In one version, the children watched the show play out on video. In another, they viewed the action on a computer screen and had to touch the keyboard’s space bar whenever they wanted to find out where the puppets were hiding. The third—the live version—asked children to watch an enactment of the show in a room set up to look exactly like what their peers were seeing on-screen. They watched through a windowlike opening the same size as a TV or computer monitor.
After watching or playing, each child was unleashed into that room to find the puppets. Which of these children would use what they had seen just minutes before to help them find the puppets?
Researchers found that the video-watchers went through a process of trial and error before they succeeded. It was as if they weren’t sure where to look. But the kids who had played the interactive game or watched the live demonstration did quite well, with most of them heading straight for the right place. Even the younger children—the 30-month-olds—made a beeline for the right hiding places, according to Alexis Lauricella, the lead author of the study. (She’s now a post-doc at Northwestern University.) Something about interacting with the content—about pressing that space bar to make puppets appear from their hiding places—seemed to improve their ability to learn from the screen.
In earlier studies, slightly younger children—24 months—struggled with these “seek and find” tasks after watching non-interactive video, unless they had a guide on-screen, a person or character, whom they felt compelled to respond to or communicate with. Even easier tasks, such as pointing to an object introduced a few minutes before, are more difficult for very young children after watching video compared with being taught face-to-face. It is this “video deficit,” which has cropped up in numerous other studies with infants and toddlers, that partially informed the American Academy of Pediatrics’ recommendation against screen time among children younger than 2. (The AAP has other concerns, too, such as whether parents are replacing human-to-human connections with screen time.) But the pediatricians who wrote the AAP guidance, which Farhad Manjoo has criticized in Slate for its narrowness, were focused only on what is typically called “passive” media, like TV and videos, not interactive media. Peer-reviewed research on under-2s using iPads has yet to appear.
But other research suggests that just because something is interactive does not mean it is the perfect learning tool, even among children over 24 months old. Some e-book studies at labs at Temple University and the University of California at Riverside—as well as a forthcoming report from the Cooney Center—show that the wow factor of the device and the presence of interactive “hotspots” on e-book pages may interfere with children’s ability to recall the story line of the book. This isn’t just a problem of electronics. Even traditional print-and-cardboard pop-up books can lead children at 2½ and 3 years old to learn less from the story than they would have otherwise, according to research at the University of Virginia conducted by Cynthia Chiong. Is all this interactivity more about attracting a child’s attention than teaching her?
The questions remind me of a moment several years back. My oldest daughter had just turned 4, and I bought some JumpStart software to see how the experience compared with her favorite preschool TV shows, puzzles and memory games. At my desk, with my daughter on my lap, we popped in the Language Club CD-ROM. In the game, children were supposed to dress a puppet by clicking on various clothing items, which the software identified out loud in whatever language was being taught. Once the puppet was dressed, it would dance.
The computer told my daughter that she could choose a hat, shirt, pants, and shoes. “I want to pick a dress,” she said. “That’s not an option, kiddo. Pick one of the four on the screen,” I told her. My husband came in, peered over our shoulders, playfully snagged the mouse and chose a black top hat. “No!” my daughter hollered. “I don’t want her to wear a hat.” The hat came off.
“Now I want to see it dance,” she said. But the software had something else in mind. “You’re doing great,” it said. “You only need one more type of clothing to complete the outfit.” Reluctantly, my daughter selected the top hat, launching the puppet into a little dance as a song labeled its arms, legs, and head. Though my daughter quickly forgave the software for its stubbornness and begged for more, this was not the learning experience I had hoped for. Granted, it was supposed to be a vocab game, not a child-directed fashion show. But the interactivity seemed like an add-on—narrow, arbitrary, and overly directive. Another drawback: When the song came to the word “arms,” it was the hands that moved.
Child development specialists say young children learn best when they are fully engaged and imbued with a feeling of control. They encourage parents to seek out more open-ended games and toys in which children can explore and create at their own pace. Yet at the moment, not many apps are built with this approach in mind. A recent Australian study showed that only 2 percent of “education” apps in the iTunes Store allow for open-ended discovery and exploration. (However, I have seen some recent products that favor creation, including DoodleCast, ItzaBitza and in-development computer programming software for preschoolers called Scratch Jr.)
My family never went back to the JumpStart software (and my kids have since grown out of preschool games). Instead, the research led me to come up with a mantra for when and how to use screen media with young kids. It’s the Three C’s: content, context, and your child. (OK, so I fudged the last C a little.) Be picky about the content of what children see on-screen, and when choosing interactive titles, seek out those that put children in control without so many dead-ends and distractions. (Common Sense Media, a nonprofit children’s advocacy organization, is making this a little easier with its just-released website that rates apps for their learning potential.) Focus on context by being aware of what is happening before, during, and after children play their games or watch their shows, taking time to talk about what they’ve seen, and playing some games together. And to accomplish that last C, tune in to which games and shows really interest your kids, what piques their curiosity and helps them relate to people and things around them.
Our households may not be Georgetown labs, but we have to keep testing interactivity’s value: Is that touchscreen triggering actions and ways of thinking that could come in handy in the real world—or merely leading our kids to touch another button?
This article is adapted from Screen Time: How Electronic Media—From Baby Videos to Educational Software—Affects Your Young Child. Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate, explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.