Earlier this week, BuzzFeed News reported that Grindr had been sharing user information with third-party app-testing companies. This revelation unleashed a wave of criticism against the gay dating app for potentially putting its users at risk of having information from their profiles, including HIV status, released more widely than they had intended or understood when they consented to using the app. The company has since ended the practice.
In the wake of Facebook’s scandal over Cambridge Analytica’s harvesting of user information, the BuzzFeed News report felt to many like another grievance in a long line of grievances regarding the misuse of social media users’ information. For a lot of LGBTQ people, who use apps like Grindr with the expectation of a higher degree of privacy and security, the report felt particularly personal and egregious.
As an HIV advocate working in HIV-related health care, I felt the tug to join the tide of criticism, but after learning more about the technical aspects of the situation and considering the real stakes for users, I’ve settled into a more nuanced position. I don’t think the sharing of user information with app-testing companies poses a significant threat of abuse, even for information as sensitive and stigmatized as HIV status, but I do think there are necessary steps companies like Grindr can and should take to ensure user privacy and safety. Fortunately, the health care field has already set such a standard in the form of the strict protocols of the Health Insurance Portability and Accountability Act (HIPAA).
Before looking at how HIPAA could help Grindr, we should first consider what Grindr did and did not actually do. Late Monday, Grindr CTO Scott Chen took to the internet to defend the company, explaining that Grindr “has never sold” and will never sell user HIV status to advertisers. He added that confidential user information was shared with (rather than sold to) app-testing vendors Apptimize and Localytics, simply for the sake of app optimization and the development of novel features, such as Grindr’s newly released HIV Test Reminders. Chen assured users that user HIV status was encrypted, never shared with advertisers, and only came from information users willingly shared on their profiles.
This seems reassuring, but there’s still a problem: Chen’s assurances really only apply to HIV status, not necessarily to other confidential information. According to the BuzzFeed News report, Grindr not only shared other user information, such as location, sexuality, ethnicity, and phone ID, with third-party advertising companies, but it did so without encryption.
This raises two ethical questions. The first specifically concerns Grindr’s handling of information as sensitive to its user base as HIV status; the second concerns the larger question of whether any user data should be shared with third parties without explicit and informed user consent.
As the coordinator of an HIV viral suppression program at a community health center in Harlem, New York, I work daily with patients’ protected health information. My position makes me privy to the HIV status and viral-load test results of a couple hundred residents of Upper Manhattan and the Bronx, and responsible for the safe, confidential reporting of that information to our funders so they can track program efficacy and ensure proper coordination of care.
But such information has to be shared in compliance with very strict rules set by HIPAA. When it is shared outside of a clinical setting, whether with funders, for research, or to evaluate and improve program efficacy, client information must first be de-identified, meaning all specific personal identifiers, such as name or address, are removed so that the data can’t be traced back to any individual person. As the coordinator of a program that deals specifically with HIV status, I am held responsible if I don’t follow these guidelines.
This is where I have concerns about Grindr’s decisions. After reading Chen’s post, I do not believe Grindr acted with malicious intent in sharing user HIV-status data with Apptimize or Localytics, and I don’t believe the sharing represents as significant a breach of user privacy as it has been portrayed. But Grindr’s handling of user information, and its policies regarding user consent, seem sloppy enough that fears about the potential for abuse are not without merit.
It’s on this point that I believe Bryce Case, Grindr’s head of information security, gets things wrong. In a statement on Monday, Case attributed the criticism to the public’s “misunderstanding of technology.” There is some truth to this, as the inner workings of the digital world are still largely esoteric to most social media users, including myself. But Case’s argument comes across as crass and unsympathetic at a time when many people’s worst fears about privacy and social media are being realized, if not surpassed.
While the Grindr and Cambridge Analytica scandals are vastly different in scope, both should serve as examples of how regulations governing the use and misuse of user information have not kept pace with social media’s evolution. In my opinion, HIPAA-style rules for the handling of sensitive information should be seen as the standard for the future of data sharing, not just for Grindr but for all social media applications.
Yes, users decide what information to include on their profiles. But for that information to be shared with outside entities, even for something as innocuous as app development, companies should be required by law to obtain informed consent and to share data only in de-identified form. Grindr’s use of user information may fall into a grayer ethical area than it seemed when BuzzFeed News first published its report, but its consumers have the right to know not just what the company plans to do with their information, but what it has the potential to do with it, how that use can affect them, and how much agency each individual user has.