GoGuardian is a software company that makes, essentially, spyware: software that helps teachers and schools block and monitor what kids are doing online. When a student is using a school-issued Chromebook that has GoGuardian on it, the teacher can see just about everything they’re doing. These technologies have been embraced by teachers and state Departments of Education alike, but students are less enthralled with having their online lives constantly surveilled.
On Friday’s episode of What Next: TBD, I spoke with Priya Anand, a tech reporter for Bloomberg who wrote a story on GoGuardian, about the rise of the school surveillance state and the implications of this technology for students’ mental health and privacy.
Lizzie O’Leary: You wrote for Bloomberg about Pekin Community High School in Illinois, which has been using GoGuardian for three years. Tell me about how the technology is being used there.
Priya Anand: Teachers can, at the start of class, start what’s called a session. They can set the rules for what everyone who’s in their class is allowed to do on the computer in a class. Then at the end of class, they also get a report on what everyone was actually doing online while they were in class. That way, they don’t have to go around and actually peek behind every single screen.
It goes steps further than just saying, “OK, YouTube is not allowed.” For example, if a student types something into a Google Doc and then quickly deletes it, the software can flag for an administrator exactly what the student typed.
GoGuardian has been around since 2014, but the pandemic seems to have really supercharged its use by schools and school districts. How does the pandemic fit into this story?
Schools were already giving kids laptops, and the Obama administration had made a push for schools to get into the digital era. But during the pandemic, so many schools that were holdouts, or maybe didn’t give every single kid a laptop, were suddenly thrusting devices into kids’ hands and saying, “Just keep them and do your stuff all through your computer.”
What was the pitch that GoGuardian made to schools and to districts about why this software should be something that they should invest in?
Administrators told us that for them, it was like sending a teacher home with a kid. With GoGuardian—since teachers can see what’s going on on a kid’s screen, and so can administrators—they felt like it was akin to having a teacher walk behind a student. They could then see, “Johnny is just playing video games all day on a school computer. These three assignments for these classes have not been done. What’s going on? Now we know that this kid is maybe having a hard time because of the pandemic.” That was the argument.
A lot of schools have returned to in-person learning. How are they using these tools now?
What I found really interesting when I visited that school in Pekin, Illinois, was teachers told me that they find GoGuardian more useful when they are in the classroom. For example, they had A days and B days, where every kid came in every other day last year. When kids were at home, they didn’t tell them, “Sit at your computer eight hours straight. You have math class at 8 a.m., English at 9 a.m.” They gave kids assignments. Students had to log in for attendance by a certain time, but the assignments were very much complete-them-as-you-wish. Teachers also knew that some of their students had taken on jobs to help support their families.
Since kids weren’t necessarily in math class at 8 a.m., math teachers didn’t want to block things like YouTube, for example, because a kid in a language class might need to watch a Spanish video to do that assignment. But in the classroom, teachers feel like it’s a great tool for them to be able to say, “Nobody is going to open Netflix during my class. Nobody is going to open Minecraft during my class.”
What type of reach does this company have? How many kids, how many schools, what kind of numbers are we talking about?
GoGuardian told us that their potential reach is more than 23 million students, which is a pretty sizable portion of the K-12 population in the U.S. In Delaware and West Virginia, for example, the state Departments of Education signed contracts to offer GoGuardian to all their schools.
When you talked to parents in Pekin, a lot of them were more than fine with this kind of technology being used on their kids. They compared it with the parental controls that they have at home.
By and large, it seems like across the country there’s not a huge groundswell of parents saying, “Stop this now. We don’t like this. Don’t track our kids online.” But there are parents in some pockets of the country, like Montclair, New Jersey, for example, who did protest GoGuardian’s implementation. The school district sent out a note earlier this year saying, “OK, we’re testing this out.” And parents there felt like, “Where does the tracking end?” Because GoGuardian does have a feature you can turn on that’ll track kids on personal devices or family computers at home, if they’re logged into their school account. One of the parents I spoke with questioned whether kids even have the right anymore to have space to themselves. He said, “When I was a teenager, I just wanted to shut the door sometimes and have some time to myself.”
The response from GoGuardian is that they serve a more important purpose than just keeping students on task. The company says its algorithms can detect troubling searches, like content about suicide. The software can then alert administrators about kids whose online behavior might mean they’re experiencing a mental health crisis.
GoGuardian really pitches itself as a tool that can help schools understand kids’ mental health, including whether kids might be slipping to the point of self-harm or harm to others, by tracking whether they search for something that might indicate they need help.
Is there any evidence that that’s true? That these digital red flags have helped schools step in and help kids who are in crisis?
You’d be hard-pressed to find hard empirical evidence from a third-party researcher. But Pekin Community High School, for example, shared anecdotes about how they feel. Even if they catch one kid who might be slipping, they feel it’s worth it.
The algorithms that GoGuardian uses are proprietary. If you are using a proprietary software tool in a public school to trigger alerts about some child to parents, administrators, and teachers, it seems tough to scrutinize those decisions if the algorithm itself is a black box.
Critics do say that these companies’ algorithms operate in a black box, and that if they’re heavily influencing how public schools are handling decisions about children in those schools, there should be more oversight. It’s an open question. We don’t know how these algorithms actually make these decisions. Sen. Elizabeth Warren, along with two other senators, sent a letter to GoGuardian and a couple of its competitors, asking for more of an explanation of how the algorithms work: Have the companies considered whether their algorithms account for potential bias? Have they considered whether their algorithms could compound racial disparities in school discipline?
The letter from Sen. Warren made me wonder a couple of different things. Let’s say you’re an LGBTQ kid, and you’re looking for help online. Or you are Googling some stuff to kind of work through your sexuality, but you haven’t discussed that with your parents or anyone at school. Then, that gets flagged by the school. It seems like that could put children in a very difficult and uncomfortable position.
I’ve talked to privacy experts who’ve said, “What if a kid is Googling something about their identity that they’re not ready to share yet, and administrators and teachers see that? It could either influence their perception of the kid, or their behavior toward the kid, or put the kid in the position of having to explain themselves in a way they might not be ready for.”
You have this line in your story that really stuck with me after I read it. It’s, “But no one actually knows how well, or even if, these technologies work.” What does it mean for this stuff to work? What’s the yardstick?
Research has found that more than 80 percent of teachers say their school uses some kind of monitoring to track what kids are doing online. Among students, the same research found that at least 26 percent are not comfortable with it, but the interesting thing is more than 80 percent reported being “more careful about what I search online when I know what I do online is being monitored.” Six in 10 students agreed with the statement, “I do not share my true thoughts or ideas, because I know what I do online is being monitored.” So kids know they’re being monitored, and six out of 10 say they’re not sharing how they actually feel. If the true point is to catch what they’re actually feeling, are you getting there? Is it actually working?
Are there other equity implications with this technology?
Well, are there schools that don’t turn on the extended monitoring that allows them to see what students are doing even on their personal computers? Are there kids out there who are just using their personal computers instead of their school computers, and therefore aren’t being monitored, because their families are well off enough to get around the fact that their school device is being tracked? Kids whose families can’t afford another device for them to do their schoolwork on are using a school laptop, and thus being tracked in a way that the richer kids aren’t. Do rich kids get more privacy?
When the pandemic began, a big question that lots of people asked was, “What happens to learning if we close the schools?” We know some answers to that now, but I wonder about the technological legacies of this pandemic. Could you ever see schools going back to a world where this kind of software isn’t used?
It’s hard to see a world in which schools would abandon this kind of technology. By all indications, schools seem to find it useful for making sure students are on track, even in non-remote times when kids are in school in real life, butts in chairs in the classroom. So it’s hard to imagine a world where the school surveillance state is put back in a box.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.