Future Tense

How Exactly Will SAT Test Takers’ “Adversity Score” Be Calculated?

The College Board’s move to even the playing field is another algorithmic black box.


On a scale of 1 to 100, how much adversity have you faced? The College Board, the nonprofit that administers the SAT and other standardized tests, is rolling out an algorithm to evaluate that as part of a larger suite of scores it calls the Environmental Context Dashboard. The ECD is designed to provide admissions officers with information about where a student comes from—but students may be left guessing about how much adversity the College Board thinks they’ve faced.

According to the College Board, the ECD uses census data and includes metrics like the median family income and poverty rate in a student’s neighborhood and high school. All of that info is crunched into a single number between 1 and 100, with 50 as the average, presented as the student’s “overall disadvantage level” compared with peers nationwide. In addition to the overall number, the dashboard shows what percentile the student’s SAT score falls in at their high school, as well as the percentages of seniors taking AP classes and students eligible for free or reduced-price lunch at that school.
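The College Board has not disclosed its actual inputs, weights, or formula, but the general shape of such a composite is easy to illustrate. The sketch below is purely hypothetical: the metric names, values, and equal weighting are invented for demonstration. It shows one common approach, expressing each metric as a national percentile (so higher means more disadvantage) and averaging them, which naturally puts a nationally average student near the dashboard's stated midpoint of 50.

```python
# Hypothetical metrics for one student, expressed as national
# percentiles (0-100), where higher = more disadvantage.
# These names and values are invented for illustration; the College
# Board has not published its real inputs or formula.
metrics = {
    "neighborhood_poverty_rate": 72.0,
    "median_family_income_inverted": 65.0,
    "school_free_reduced_lunch": 58.0,
}

def composite_score(percentiles, weights=None):
    """Combine weighted percentile metrics into a single 1-100 score.

    With percentile inputs, a student who is average on every metric
    scores 50, matching the dashboard's stated average.
    """
    names = list(percentiles)
    if weights is None:
        weights = {name: 1.0 for name in names}  # equal weighting by default
    total_weight = sum(weights[name] for name in names)
    raw = sum(percentiles[name] * weights[name] for name in names) / total_weight
    # Clamp to the dashboard's stated 1-100 range.
    return max(1.0, min(100.0, round(raw, 1)))

print(composite_score(metrics))  # (72 + 65 + 58) / 3 -> 65.0
```

Note how much the output depends on choices hidden from students: which metrics are included, how each is converted to a percentile, and how they are weighted. That is exactly the information the College Board has declined to release.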

But the College Board has been mum about exactly how the overall score is computed. (It is, however, emphasizing that the term “adversity score” is inaccurate.) And while all this information is available to admissions officers, applicants are not able to see their own score. Though these metrics appear to be a well-intentioned effort to guide schools toward more holistic admissions standards, it’s unsettling that a student’s fate could be determined by an opaque algorithm.

The SAT has been roundly criticized for decades, with critics asserting that scores correlate more with income, coaching, and race than with scholastic aptitude. In recent years, a growing movement has deprioritized standardized test scores: More than 1,000 schools no longer require students to submit SAT or ACT scores as part of their admissions applications, and research has found virtually no difference in the grades and degree-completion rates of students who do and don’t submit standardized test scores. By introducing the ECD, the College Board seems to be acknowledging that many external factors shape students’ performance on standardized tests. It’s also creating an opportunity for its services to remain relevant in the college admissions game: The ECD could be useful even to schools that are moving away from requiring test scores.

The ECD is just the latest attempt to outsource nuanced decision-making to an algorithm at the risk of exacerbating societal biases. Many facets of our lives are ruled by such algorithms—which people charged with crimes must await trial in jail, whether police keep special watch over our neighborhood, which Facebook ads we see, whether our children are taken from us if someone calls child services—yet we have limited insight into how they work. Even the people who code these algorithms cannot necessarily predict what their creation learns as it’s fed data; often, models pick up the same biases that pervade our society. Amazon’s hiring model was scrapped after engineers discovered it discriminated against women; a model that taught itself English ended up with biases against women and black people.

Given the racist history underlying the development of standardized tests and the gatekeeping role the College Board has played over decades of college admissions, many are skeptical of the nonprofit’s motivation and ability to craft an equitable admissions tool. “Who gets to decide whether these students have faced adversity or not?” asks Blanca Vega, an assistant professor of higher education at Montclair State University. Students won’t know which factors go into the score or how each component is weighted—and “that’s part of the inequity,” says Vega. On Twitter, mathematician Cathy O’Neil pointed out that the opacity of the algorithm has the power to harm students, who will remain in the dark about how this mysterious score affects their shot at admissions. “Categories of students that are not well understood by the scores—and they will exist—will see their entire college application experience get worse,” she wrote.

Universities that use the ECD for admissions are essentially trusting the researchers at the College Board to be the arbiters of what it means to face adversity, and whether the metrics they’ve chosen are a good representation of that. To the College Board’s credit, it has consulted with education researchers while creating the ECD, but by keeping the algorithm’s parameters a secret, it’s missing out on opportunities for feedback from a broader swath of experts and the students affected by these scores.

After all, judging an applicant by their statistics from their high school or neighborhood leaves out a lot of nuance. Imagine a gentrifying neighborhood, where wealthy young white folks are displacing generations of people of color. On paper, all those families’ kids could have similar stats according to the ECD—we just don’t know, given what little the College Board has released about the process.

And it’s worth opening up the conversation about what adversity really means, and whether the metrics the College Board has chosen measure it well. Should adversity be a measure of race or of social class? One of the statistics the ECD uses to calculate a student’s overall score, for instance, is “probability of being a victim of a crime,” which certainly could represent a kind of adversity. But given what we know about black and Hispanic neighborhoods being disproportionately targeted by police, this metric may well be more a measure of the applicant’s race. As two researchers write in a 2016 paper on predictive policing: “If police focus attention on certain ethnic groups and certain neighborhoods, it is likely that police records will systematically over-represent those groups and neighbourhoods. That is, crimes that occur in locations frequented by police are more likely to appear in the database simply because that is where the police are patrolling.”

According to the Wall Street Journal, Florida State University, one of the 50 or so schools that used the ECD in their most recent round of admissions, reports that its freshman enrollment of nonwhite students grew from 37 percent to 42 percent this year. Though it is certainly true that people of color face barriers in higher education and are more likely to face poverty, that does not mean “adversity” and “nonwhite” are synonymous. (Nor does this change in enrollment establish cause and effect: The ECD may have driven some of it, but these numbers alone can’t tell us how much.)

While transparency from the College Board could improve the product and students’ experiences, it’s not hard to see why the organization may not want to reveal more than it already has. A proprietary algorithm could be profitable, after all, and divulging its inner workings means being vulnerable to scrutiny—and having to find ways to address criticism. Releasing scores to students could have similar consequences; Vega told me she could see angry students, especially those from well-off families, suing if they felt their adversity score wasn’t fair. It’s also possible that releasing either model details or student scores could allow people to game the system. After all, people (especially the wealthy) are highly motivated to find a way to better their kids’ shot at a quality education; Chicago families have faked addresses in less tony neighborhoods to get their children a better shot at getting into competitive schools.

I asked the College Board why it’s not making ECD scores available to students or details about the algorithm available to the public. Apparently this has been a popular question: “We have received questions about whether students and schools can see the content of the Dashboard, and we’re looking into how we might make it available to them,” a representative told me.

It’s easy to hate on any attempt to flatten a person into a series of numbers, but the information presented in the ECD, especially if offered transparently, could be a net positive in college applications. Nadirah Foley, an education Ph.D. candidate at Harvard University who worked as an admissions officer at the University of Pennsylvania for a year, says that the ECD “may very well be a useful tool.” Currently, information about schools, like the number of AP classes they offer and their graduation rates, is not standardized, so the ECD could give admissions officers an easy, consistent way to access it. And preliminary results from education researchers suggest that accessing the ECD could lead admissions officers to admit students with “more adverse” ratings.

Given the College Board’s decadeslong monopoly on the college admissions game, the ECD and its other products are unlikely to go away any time soon. It’s heartening that the organization is working with researchers to study the effects of the dashboard on admissions, but what it has done so far only tells us what happens when admissions officers get their hands on what the black box spits out. The real question remains: What’s inside?

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.