Future Tense

Why You Shouldn’t Let a Startup Scan Your Eyeball in Exchange for Crypto

Or anyone else who wants to, for that matter.

[Photo caption: Don’t give Worldcoin a photo like this. Karen Bleier/Getty Images]

Some of the most powerful investors in Silicon Valley want to scan your eyeball. You almost certainly shouldn’t let them.

OpenAI CEO Sam Altman, LinkedIn co-founder Reid Hoffman, and the major venture capital firm Andreessen Horowitz are all backing a recently revealed plan by a company called Worldcoin, which mashes up three big ideas: It’s a cryptocurrency company, it’s a universal basic income project, and it’s a biometric-scanning company. Worldcoin: It’s a coin for the world. If, first, the world will share its irises.

According to a recent report by Bloomberg, Worldcoin’s goal is to use cryptocurrency as a way to spread money more equitably around the world, in a setup similar to a universal basic income. A UBI is a system in which every member of a society receives a regular government payment to cover their essential needs, potentially eliminating the need to work as automation increases. To make its proposed system of redistribution and authentication work, the company intends to use eye-scanning orbs (fewer than 20 of which are currently being tested) to confirm users’ identities. In essence, the test subjects are getting paid in cryptocurrency in exchange for sharing access to their irises.

Worldcoin’s job postings promise to create “a new global digital currency that will launch by giving a share to every single person on earth.” Bloomberg explains that the company wants to use dedicated hardware devices (the mysterious orbs) to ensure the humanness and uniqueness of users while also protecting privacy and the transparency that blockchain is founded upon. According to the head of the new startup, a former theoretical physicist named Alexander Blania, the scanner will produce a unique numerical code for each person’s iris and then delete the image. Beyond that, the company has provided very few details about how the orbs work or how that numerical data will be stored and secured.
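
Worldcoin hasn’t published how that image-to-code step actually works, but the general idea it describes (reduce a scan to an irreversible number, then throw the picture away) can be sketched in a few lines. The snippet below is purely illustrative and assumes a salted cryptographic hash stands in for whatever proprietary template Worldcoin really computes; the function name, salt, and placeholder image bytes are all invented for the example, and a real iris system would need a fuzzier matching scheme than a plain hash.

```python
import hashlib
import os

def iris_to_code(iris_image_bytes: bytes, salt: bytes) -> str:
    """Illustrative only: reduce an iris image to an irreversible code.

    Real systems first extract a feature template so that two scans of the
    same eye produce matchable data; a raw hash like this would not tolerate
    the slight variation between scans. It only illustrates the
    "store a code, delete the image" idea.
    """
    return hashlib.sha256(salt + iris_image_bytes).hexdigest()

# Hypothetical flow: scan -> code -> discard the image.
salt = os.urandom(16)              # a per-deployment secret (an assumption)
image = b"raw sensor bytes here"   # placeholder for a captured scan
code = iris_to_code(image, salt)
del image                          # the claim: only the code is retained
print(code)
```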

Digital privacy experts have raised significant alarms over the use of such an authentication method, and over the use of biometric identification in general. As Cindy Cohn, the executive director of the Electronic Frontier Foundation, explains: “Biometric identifiers are extremely dangerous, they can’t be revoked, they can’t be reissued.” You’re pretty much stuck with your iris, fingerprint, and DNA forever; this isn’t a password that can be changed whenever it’s compromised. Given the slew of data breaches at companies like Equifax, TransUnion, and LinkedIn in recent years, Cohn argues that “the idea that [biometric information] would go to one company and never go anywhere else really ought to be laughed off the face of any presentation about any of these kinds of identifiers.” While the exact privacy and security consequences of leaked iris scans may be limited for now, submitting to the mass collection of biometric identifiers by private companies opens a Pandora’s box that may never be closed. That alone makes me scared enough to never give Worldcoin my eyeball.

Now, that’s not to say that biometric identifiers are anything new, or that they’re even bad on their face. You probably already use some every day, unlocking your iPhone or accessing an app with your face. Those uses have raised some privacy experts’ alarms, but they are also very convenient and have therefore seen wide adoption. Ross Schulman, senior counsel and policy technologist at New America’s Open Technology Institute, points out that “there’s nothing inherently bad or evil about biometric identification.” (New America is a partner with Slate and Arizona State University in Future Tense.) Biometric identifiers can operate as “good means of authentication,” and you “can’t forget them.” Furthermore, even if the data were leaked from a company like Worldcoin, Schulman explains that it would be useless without the algorithm used to evaluate it; it’s not like “they just take a photo of your iris and store that somewhere,” he says. Whatever security scheme Worldcoin is using is almost certainly a lot more complicated. But if a hacker obtained your biometric information and got access to the company’s algorithm and encryption process (that is, the means of interpreting the numerical code generated by your iris), that could be an entirely different story. Plus, Worldcoin will have to maintain the data from your iris scan, whereas with Apple’s Face ID, your face data is stored locally on the device, not in the cloud.

But the topic that concerns both Schulman and Cohn is what Worldcoin plans to do with the data. Nobody is 100 percent sure, as the full details of the crypto company’s operation have yet to be publicly released. Of the biometric data, Schulman asks, “Can it be used somewhere else?” Cohn worries that “you can’t just look at what Sam Altman wants to do with this thing, right now. You have to think about what future uses could be made for it?” News of Worldcoin leaked before the company had planned to announce it; in fact, the top website when you google “Worldcoin” belongs to a completely different cryptocurrency called Worldcoin Global. And Worldcoin Global has made it clear that iris scanners are not in its future.

Even if Worldcoin has the purest intentions and never sells any of the data or personal information, there’s still a big player in the tech scene that might want access to your iris scans and crypto transactions: governments. Cohn points out that if you think that when you provide biometric data “you’re just giving it to a private company, you’re probably wrong, because governments really want this information.” And while Worldcoin promises to delete the photo of your iris, there are legitimate concerns over how governments might influence the company in the future. The United States has a well-documented history of getting big tech companies to hand over their users’ personal data. China already uses facial recognition and biometric technology to identify people on camera systems, and recently worked with Tencent, a major gaming and software company, to use facial scans to catch Chinese children playing video games after a government-imposed gaming curfew.

There are also some concerns to be raised about how effective this iris scan system will be. As Schulman points out, we don’t yet know “what the false positive rate is” for Worldcoin’s system. No system is going to identify people correctly every single time, and in a matter as sensitive as authentication for financial transactions, there is a huge difference between 99 percent accuracy and 99.99999 percent. In fact, Matthew Green, a professor at the Johns Hopkins Information Security Institute, told Recode that “Nobody has any idea how to build an affordable iris scanner that isn’t vulnerable to some kind of spoofing.”
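
To make that gap concrete, here is some back-of-the-envelope arithmetic using a hypothetical billion enrollees and error rates chosen for illustration rather than taken from Worldcoin: even a 1 percent false-match rate would produce millions of bad matches at that scale, while a 0.00001 percent rate would produce about a hundred.

```python
# Back-of-the-envelope arithmetic with illustrative numbers (not Worldcoin's):
# expected wrongly accepted scans at different false-positive rates.
users = 1_000_000_000  # assume roughly one scan per person for a billion people

for accuracy in (0.99, 0.9999999):
    false_positive_rate = 1 - accuracy
    expected_false_matches = users * false_positive_rate
    print(f"{accuracy:.7%} accuracy -> ~{expected_false_matches:,.0f} bad matches")
```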

Even assuming the iris scan system put in place by Worldcoin is overwhelmingly effective, this seems like an especially risky move given that there are effective, cheap, and already existing alternatives to biometric scans for improving user security and authentication. “Biometric identification, while it is better on one dimension, is really, really bad in another…you have to look at these things holistically,” Cohn explains. In other words, biometrics might be a great way of authenticating identity, but when considered holistically, the privacy concerns likely far outweigh the benefits. Cohn and Schulman both point out that we just don’t need to use biometric information to confirm people’s digital identities. First, it’s important to keep your apps and software fully updated. There are also simple security solutions like two-factor authentication. Plus, there are hardware options like a YubiKey, a small physical device that you carry with you and plug into a computer to authenticate yourself, and that can generate one-time passwords for access to accounts and systems.
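
Those one-time passwords are not exotic, either. The rotating codes that two-factor apps and some hardware keys generate follow a public standard, RFC 6238 (time-based one-time passwords), and can be sketched with nothing but Python’s standard library. The shared secret below is a made-up example, not one tied to any real account.

```python
import base64
import hmac
import struct
import time

def totp(secret_base32: str, interval: int = 30, digits: int = 6) -> str:
    """Generate a time-based one-time password (RFC 6238) with HMAC-SHA1."""
    key = base64.b32decode(secret_base32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    message = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(key, message, "sha1").digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    number = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(number % 10 ** digits).zfill(digits)

# Made-up shared secret; a real one comes from the QR code a service shows you.
print(totp("JBSWY3DPEHPK3PXP"))
```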

And even if more companies follow Worldcoin’s move toward biometric authentication, activists and regulators are beginning to push back against the use of the technology. Maine recently passed the strictest state-level facial recognition ban in the country, prohibiting the vast majority of uses of facial recognition technology by law enforcement. But it’s unclear whether information obtained by technology like Worldcoin’s would fall directly under such legislation if police in the state sought it. One of the gold standards in regulatory protections against improper uses of biometric information is Illinois’ Biometric Information Privacy Act. Under BIPA, companies must tell users in writing what data is being collected and how long it will be collected and stored, and must also obtain the user’s written consent. The law covers a slew of biometric information, from iris scans to voiceprints and DNA. Extending this law nationally could go a long way toward fighting unwanted uses of our data and biometric information domestically. (In August 2020, Sens. Jeff Merkley and Bernie Sanders introduced exactly that, the National Biometric Information Privacy Act, though the legislation never made it out of committee.)

It’s unclear how Worldcoin’s iris-scanning system will make money, so it would be premature to assume its product will see wide adoption the way the iPhone furthered facial-identification technology. At least to me, this is one case where ease of use doesn’t justify an unneeded technical solution whose negative externalities we’re likely underestimating. While we all grapple with how best to secure ourselves and our biometric data, in situations where you have the option to consent to the collection of sensitive information, the easiest solution is to say no.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
