The leak of the draft decision overturning Roe v. Wade has prompted numerous stories about the privacy of health data. For instance, Vice reported that data broker SafeGraph was collecting and selling the GPS locations of people who visited abortion clinics—in many cases, likely without their knowledge. There have also been numerous concerns raised about period tracking apps and other technologies that could enable surveillance of and even violence against those seeking medical care.
Most Americans assume that the Health Insurance Portability and Accountability Act, known as HIPAA, and other health privacy laws prevent entities like their doctor from sharing sensitive personal information—and that’s true. Health care providers are covered under HIPAA’s privacy rules.
But companies outside the narrow scope of HIPAA, from data brokers to period tracking apps, can legally sell Americans’ health-related information, and they do, from a list of your surgical procedures to your mental health conditions.
A few years ago, the more than 100 million users of Flo, a period and ovulation tracking app, learned something startling: The company was sharing their data with Facebook. “When a user was having her period or informed the app of an intention to get pregnant,” the Wall Street Journal reported, Flo would tell the social media company—which could then use the data for all kinds of activities including advertising. In 2021, Flo settled with the Federal Trade Commission for lying to its users about secretly sharing their data—and in the complaint, the FTC revealed the company had gone even further than was previously reported, sharing data with Google, Fabric (the then-name of a Google marketing service), marketing company AppsFlyer, and analytics company Flurry. Flo, in the FTC complaint’s words, “took no action to limit what these companies could do with users’ information.”
While this case may seem outlandish, it fits into a broader pattern. The data brokerage ecosystem—broadly, companies collecting, inferring, buying, selling, licensing, and otherwise sharing people’s data—is a multibillion-dollar industry in the United States. Companies most of us have never heard of, like Acxiom or Babel Street, transact in everything from individuals’ demographic information (race, sex, gender, and much more) to locations to political preferences and beliefs. They compile datasets on individuals, package them, and make them available on the open market, for sale, advertising, or other purposes. Just some of their dataset titles include “Suffering Seniors,” “Rural and Barely Making It,” “Ethnic Second-City Strugglers,” “Rough Start: Young Single Parents,” and “Credit Crunched: City Families.”
This matters as conservative Supreme Court justices prepare to overturn Roe v. Wade, because HIPAA does not stop many companies from selling people’s health information. As University of California Davis law professor Elizabeth Joh wrote for Slate, “the end of Roe v. Wade should be understood in the context of our vast and underregulated surveillance economy, and the reliance of law enforcement on it.” Even though law enforcement agencies may be prohibited from accessing certain company-held data without a warrant, they can typically circumvent legal and policy controls by purchasing that data. Many others, from angry individuals to right-wing organizations, can similarly weaponize the open data market against women and other people seeking reproductive healthcare.
HIPAA is limited in scope to certain “covered” health entities, like your doctor’s office or a university hospital. It does not cover companies outside that scope—which means period tracking apps, virtual therapy services, and other digital health-related tools and services can typically buy and sell Americans’ health data legally. State laws are of little help, either, as the data broker laws that do exist in California and Vermont do not limit companies’ ability to buy, sell, license, and otherwise share Americans’ data. For their part, the five state consumer privacy laws on the books—in California, Connecticut, Colorado, Utah, and Virginia—still permit companies to widely buy and sell individuals’ data, and their “right to opt out” provisions place the burden on individuals to look up data brokers and submit do-not-sell-my-data forms of their own volition.
The health information you can acquire on the open market is broad and detailed, and this problem has existed for years. In 2014, the Federal Trade Commission published a study on data brokers and found them compiling data on such health-related issues as “Expectant Parent,” “Diabetes Interest,” “Smoker in Household,” “Buy Disability Insurance,” and “Cholesterol Focus.” In 2013, the nonprofit World Privacy Forum released a report detailing how “data brokers sell lists of people suffering from mental health diseases, cancer, HIV/AIDS, and hundreds of other illnesses.” It also detailed data brokers advertising a list of domestic violence shelters and a list of rape sufferers—what the organization described as “an unjustifiable outrage that sacrifices a rape victim’s privacy for 7.9 cents per name.” In the recent case, data broker SafeGraph was collecting the GPS locations of people visiting family planning centers and abortion clinics and then selling that data online (the company claims to have stopped selling it, even though it still seemingly collects it).
All of this data can be linked to individuals. To be clear: Buyers can’t request health data on a specific person as far as I can tell, though the brokerage industry is opaque and needs more study. But some data brokers straight-up provide information with individuals’ names attached. For those data brokers that remove names from datasets before sharing them, there are still grave risks of reidentification. More than two decades ago, Harvard professor Latanya Sweeney, author of numerous groundbreaking studies in this area, linked supposedly “deidentified” medical information to a “population register (e.g., a voter list) to re-identify patients by name.” More recently, a 2019 paper in Nature found that “99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes,” such as ZIP code, gender, and date of birth. It does not require much work to link health data to individuals—and with much of this available on the open market, the opportunities for harm are grave.
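The linkage technique Sweeney demonstrated can be sketched in a few lines of Python. Every record and name below is invented for illustration; real re-identification works the same way, just by joining a “de-identified” commercial dataset to a public register—such as a voter list—on shared quasi-identifiers like ZIP code, gender, and date of birth:

```python
# Hypothetical sketch of a Sweeney-style linkage attack. All data here is
# fabricated; the point is that no special access is needed, only a join.

# "De-identified" health records: names stripped, quasi-identifiers kept.
health_records = [
    {"zip": "02138", "gender": "F", "dob": "1965-07-31", "condition": "diabetes"},
    {"zip": "60614", "gender": "M", "dob": "1980-02-14", "condition": "asthma"},
]

# Public voter list: the same quasi-identifiers, with names attached.
voter_list = [
    {"name": "Jane Doe", "zip": "02138", "gender": "F", "dob": "1965-07-31"},
    {"name": "John Roe", "zip": "60614", "gender": "M", "dob": "1980-02-14"},
]

def reidentify(health, voters):
    """Attach names to health records whose (zip, gender, dob) triple
    matches exactly one voter."""
    index = {}
    for v in voters:
        index.setdefault((v["zip"], v["gender"], v["dob"]), []).append(v["name"])
    matched = []
    for h in health:
        names = index.get((h["zip"], h["gender"], h["dob"]), [])
        if len(names) == 1:  # a unique match means the record is re-identified
            matched.append({"name": names[0], "condition": h["condition"]})
    return matched

print(reidentify(health_records, voter_list))
```

With only three attributes, both fabricated records link to a unique name—which is why the Nature study’s finding that 15 attributes suffice for 99.98 percent of Americans is so alarming.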
Take the case of Flo, the period and ovulation tracking app. After it was caught sending people’s data to Facebook, without their knowledge, Flo alleged that it “depersonalized” the information. However, individuals’ period and ovulation data was linked to an advertising identifier, which companies use to track people across devices and services—in other words, a string that can be linked to a specific person.
Law enforcement and right-wing organizations could attempt to acquire sensitive health data on women and other people to enforce anti-choice laws. They could buy information from data brokers on individuals who have visited family planning clinics, or they could even purchase data from period tracking apps or other companies directly. All of this can enable surveillance, targeted profiling, harassment, stalking, and violence. Even beyond abortion restrictions, the sale of this health data is incredibly dangerous, whether it enables foreign governments to hack and steal pre-consolidated spreadsheets of Americans’ mental health conditions or scammers to purchase datasets on seniors with Alzheimer’s and dementia to steal their life savings. It also fundamentally undermines a basic assumption made by many Americans: that the government safeguards your medical information.
Even though protecting consumers’ privacy should be a bipartisan issue, Congress still has not passed a comprehensive privacy law. But it can still act in the meantime. Placing strong controls on the sale of health data would protect millions of Americans from the current threats to their privacy, bodily autonomy, physical safety, and even economic and national security. Predatory companies, law enforcement agencies acting without warrants, violent extremists, and other dangerous entities can purchase your sensitive health information off the open market, whether you like it or not—and it’s time for policymakers to act.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.