The conventional wisdom around mental health goes something like this: If you’re feeling upset, triggered, anxious, or depressed, put down your phone, lest the steady stream of group pics, beach vacations, and “authentic” selfies make you feel even worse. But Snapchat’s latest initiative offers a different solution. Now, if you type a word related to mental health, such as “anxiety” or “bullying,” into the app’s search bar, original content from what the company calls “local experts” will appear.
This is part of Snapchat’s new feature called “Here for You,” which the company announced in honor of Safer Internet Day. Here for You launched on Tuesday and will continue to roll out over the next few months. The goal is to provide proactive resources in the app for those “who may be experiencing a mental health or emotional crisis.” The mental health content will appear on the results page in 10-second clips that users tap through, Fast Company reported, and it will also include curated shows on wellness-related topics.
While Snapchat hasn’t shared any data on how many users have searched these topics, Jen Stout, vice president of global policy at Snapchat, told Fast Company that sometimes users seek out this kind of content, but previously, they hadn’t been able to find any helpful results.
“We feel a real responsibility to try to make a positive impact with some of our youngest, sometimes most vulnerable users on our platform,” Stout said. “We know this is the first step of a lot of work we want to do to provide the right resources to our users.”
Snapchat’s move follows a broader shift in the social media landscape, as companies have started to invest in mental health initiatives for their users over the past year. Instagram, which is generally considered the worst social media platform for mental well-being, has increased efforts to combat bullying through its “Restrict” feature and has extended its ban on images related to self-harm. In a move similar to Snapchat’s, Pinterest last summer developed a tool tied to the search bar called “compassionate search,” which encourages exercises that use dialectical behavior therapy techniques to combat stress, anxiety, and thoughts of self-harm.
From a cynical perspective, these initiatives look like PR moves. But what’s interesting here is that the relationship between social media and mental distress isn’t as clear-cut as many believe it to be. While a number of highly publicized studies link the two, others show that users can actually benefit from social media—particularly when they’re mindful about how they interact with it.
John Torous, the director of the digital psychiatry division at Beth Israel Deaconess Medical Center, told me that all studies of social media use are somewhat incomplete because they are correlational and cross-sectional. “No one’s actually shown the causal evidence that says, ‘If you have X exposure to social media, you have Y detriment to mental health,’ ” he said. Our best understanding of the issue so far may be the “Goldilocks hypothesis,” he explained, which posits that a moderate amount of exposure to social media is ideal for one’s health.
Still, Torous said, certain individuals—his patients included—report increased levels of stress and anxiety after social media use. For some of those users, he’s inclined to believe Snapchat’s new resources are positive, or at least neutral. While some may balk at the idea of staying on an app to receive any kind of mental or emotional support, Torous’ main concerns have to do with data privacy. What happens to the data on individuals’ mental health that’s collected, and can it be protected?
“Because so many of these services keep the data to themselves,” he said, “there’s not really very much transparency on what’s happening on these platforms.”
Torous used the example of Facebook to illustrate the problems inherent in social media companies keeping tabs on their users’ mental health. In 2018, the New York Times reported that Facebook was running natural language processing to look for keywords related to suicide, which on some occasions led to police being sent to users’ homes. In Facebook’s case, there was no informed consent, and the company was essentially running its suicide intervention unchecked. Where that data goes—and what it means to be “flagged” as someone at risk of self-harm—is unknown. “The Facebook example has made me wary,” Torous said. “When a company says it’s doing stuff for mental health, what are they actually doing?”
As for user experience, we don’t have enough information right now to know how effective Snapchat’s interventions will be. But even if the results are positive, Torous cautioned, “[W]e don’t want to conflate this with [saying that] the potential issues of social media for some people and its negative mental impact are now going to be ameliorated, fixed, or addressed. I think it’s adding another kind of unknown into a very complex puzzle.”