“Your face is your ticket,” goes the motto of A.I. startup Wicket. “Your face is your credential,” says Alcatraz AI, another vendor.
Both these companies sell facial recognition technology to sports stadiums across the country. Citi Field, home of the Mets, contracted with Wicket in 2022 to add facial recognition ticket kiosks to all stadium gates. BMO Stadium, home of the Los Angeles Football Club, began using Alcatraz AI technology the year before.
But their promise isn’t absolute: Even with a purchased ticket, your face can leave you ticketless. In recent months, Madison Square Garden earned headlines for using facial recognition technology to ban or kick out people holding tickets to its events. A Long Island attorney was removed from a Knicks game in December after getting flagged by the software. In January, a loyal Rangers fan was barred from watching his beloved team.
The use of facial recognition technology at sports stadiums goes far beyond MSG. I’ve tracked at least 20 other venues and stadiums across the country—including college sports venues—that have used this technology on their attendees, usually to admit them through the gates, though it’s unclear just how broadly venues could deploy it if they were so inclined. The venues include:
● Mercedes-Benz Stadium in Atlanta, which announced in August 2022 that it was testing facial recognition technology for gates and concession stands.
● FirstEnergy Stadium in Cleveland, which offers “Express Access” with facial recognition technology.
● Citi Field in New York City, which has face-ID ticket kiosks at stadium gates.
● Pechanga Arena in San Diego, which installed facial recognition for entry scanning and payment verification.
● Save Mart Center at California State University, Fresno, which enables entry and payment with facial recognition tech.
● Lower.com Field in Columbus, Ohio, which has express entry with face-ID ticketing.
● FedEx Field in Landover, Maryland, which uses facial recognition for entry.
● Caesars Superdome in New Orleans, which uses facial recognition tech for entry into training facilities.
● Toyota Arena in Ontario, California, which announced in 2022 that it was installing facial recognition for ticketing and concessions.
● Sun Devil Stadium at Arizona State University in Tempe, which has been used as a “living lab” for facial recognition technology that analyzes how fans feel “based on their facial expressions.” (Disclosure: ASU is a partner with Slate and New America in Future Tense.)
● Hard Rock Stadium in Miami Gardens, Florida, which uses facial recognition for ticketing.
● BMO Stadium in Los Angeles, which began using facial recognition technology for entry into training facilities but wants to “move everything to face.”
● The Rose Bowl in Pasadena, California, which used facial recognition on 30,000 attendees without their knowledge in 2020.
● And many stadiums use TendedBar machines, which scan your face in order to serve you alcohol.
There are almost certainly many more, according to experts, who say the lack of transparency about the use of the technology has obscured its spread. It extends the surveillance network into private spaces, amplifying the power of law enforcement.
James Dolan, owner of the New York Knicks and Rangers and CEO of the company that operates Madison Square Garden, is apparently a man of many enemies. His company used the facial recognition software—which MSG has had since 2018—to scan for attorneys from an estimated 90 law firms with active litigation against the company, and ban them. A spokesman for MSG Entertainment said the technology does not retain images of individuals, “with the exception of those who were previously advised they are prohibited from entering our venues” or whose previous misconduct in their venues “identified them as a security risk.”
In a way, some say, Dolan, who recently defended his use of the technology, was at least honest about it. “I’m actually slightly grateful to him,” Albert Cahn, executive director of the advocacy group Surveillance Technology Oversight Project, said. “He was willing to explicitly admit that he was using the technology this way.”
Facial recognition technology is in high demand among sports teams. A 2021 survey of 40 venue directors representing teams from Major League Baseball, Major League Soccer, the National Basketball Association, the National Football League, and the National Hockey League indicated that the software was at the top of venues’ wish lists.
Christian Lau, chief technology officer of the Los Angeles Football Club and BMO Stadium, told the Wall Street Journal in 2020: “Our plan is to move everything to face.” The following year, BMO began using facial recognition technology from California-based Alcatraz AI. The company’s “Rock” system can also be used to ascertain whether certain people should be allowed to enter specific spaces, including medical facilities.
Other providers—like Trueface, which claims to offer the “fastest face recognition in the world”—have moved into enabling payments and verifying customers’ ages in stadiums. Its software is used by TendedBar, an “automated cocktail bar,” which touts the efficiency of scanning faces over checking ID cards, while giving “insight and analytics from anywhere.” The cashless opt-in machines—a combination of a soda dispensary and self-checkout system—have been installed at venues including Circuit of the Americas in Austin and TIAA Bank Field in Jacksonville, Florida. A TendedBar representative said that 10 stadiums across the country use its machines. Once a customer signs up at one location and their age is verified, they have access to the machines in all other stadiums.
But in such an unregulated space, experts have little faith that the software popping up from different companies is necessarily legitimate. “There’s so much snake oil around and in the sector,” said Daniel Schwarz, privacy and technology strategist at the New York Civil Liberties Union.
And the software is most dangerous when it doesn’t work. Such technology is well documented to misidentify people of color at disproportionately high rates. Paired with exclusion lists, like the kind at MSG, this can lead to people being wrongly denied access to public accommodations. In 2019 the ACLU conducted an experiment with photos of high-profile professional athletes from Massachusetts sports teams, such as former New England Patriot Duron Harmon. Facial recognition software erroneously matched the official headshots of 27 athletes to mug shots in a law enforcement database.
With the technology proliferating at stadiums, there is a lot of opportunity for error. Despite this, NHL Commissioner Gary Bettman recently said that the use of the technology at MSG, home of the Rangers, is “not anything that concerns us.”
Privacy experts are also worried about the way data can be shared with law enforcement and the expanding surveillance network it creates. “It’s harder [for law enforcement] to set up in private locations, but the companies are kind of doing it for them,” said Katie Kinsey, chief of staff of the Policing Project at NYU Law. “Oftentimes, law enforcement only needs to ask these companies to hand it over; there’s no process that is required.”
Currently, only Illinois, Texas, and Washington have enacted laws regulating the use of facial recognition technology, and only the Illinois law gives private individuals the right to sue over violations. The state’s Biometric Information Privacy Act also imposes requirements such as obtaining written informed consent before collecting biometric data like face scans.
This means that the Chicago Theatre, owned by MSG, cannot use facial recognition technology. And while New York does have a biometric disclosure law—passed in 2021, it requires businesses to post formal notices about the use of the technology near all physical entrances of the building—experts argue that the system isn’t enough.
The notices are “somewhat akin to restaurant grading posters,” said Schwarz.
Lawmakers in New York have been vocal about the MSG fiasco but have so far pushed for little legislation that actually impacts facial recognition in the private sector. It keeps being “slow rolled” in the New York City Council, according to Cahn of the Surveillance Technology Oversight Project. “The sort of political opposition to actually having a public debate by lawmakers is truly, truly stunning.”
There has been federal pushback on one particular facial recognition company. Clearview AI is known as the “notorious bad boys of facial recognition,” according to Conor Healy, surveillance expert at IPVM, a surveillance industry research group. Back in 2020, BuzzFeed reported that MSG, alongside 200 other private entities, had contracted with Clearview AI, a facial recognition startup with a database of billions of photos involuntarily scraped from social media and the internet. (Clearview said that MSG briefly tested Clearview AI’s technology in 2019 for the purposes of “after-the-fact investigations,” not for use in “real-time situations.”)
Clearview AI’s thousands of customers have included agencies such as U.S. Immigration and Customs Enforcement, the Federal Bureau of Investigation, and the Justice Department. In May 2022, facing a lawsuit by the ACLU and other nonprofits for violating states’ facial recognition laws, Clearview agreed to restrict U.S. sales of facial recognition mostly to law enforcement. The company told me that its database is used only by government and law enforcement agencies.
It is unclear what company MSG contracts with now, but Clearview’s retreat from the private market clearly left a void that many other vendors have sought to fill.
And despite the criticism of such surveillance, the industry continues to expand, pitching stadiums on the safety and security it claims to provide. For example, Oosto, an Israel-based facial recognition company previously known as AnyVision, advertises its technology to stadiums in the face of what it terms an eruption of stadium violence, with fans who “have forgotten how to behave.” The New Orleans Saints bought Oosto’s technology to use at the Caesars Superdome in 2021.
Oosto said that many professional sports teams use facial recognition technology to “enable players and staff to quickly and conveniently access locker rooms or training facilities without having to stop or slow down.”
According to a page on preventing stadium violence on Oosto’s website, “The big difference between Roman times and today is that technologies now exist that can help mitigate the problem and lead to better outcomes and accountability.” But to Kinsey at the Policing Project, the example of MSG shows that the technology isn’t benign. “It wasn’t about security at all,” she said. “It was about revenge and retribution.”