You would think fitness apps would have learned from Strava’s example, but that doesn’t seem to be the case. Six months after Strava was found to be inadvertently revealing sensitive military-base locations through its user activity–sourced Global Heat Map, Polar’s fitness app came under fire for a similar inherent “vulnerability.” Because of how the Polar Flow app’s privacy and activity-sharing settings interacted, researchers were able to pinpoint users’ home locations, including those of military and intelligence personnel, through its map of user activities. The incident is another instance where mobile apps, particularly ones with a social dimension, either aren’t taking enough security precautions to protect location information from unwanted access, or users aren’t being educated on how to adequately use the provided safeguards, or both.
The issue with Polar’s app is slightly different from Strava’s flaw. Researchers with De Correspondent, an Amsterdam-based news site, and Bellingcat, an online investigative outlet, discovered that Polar’s privacy settings and search features combined to leave holes that allowed them to learn the locations of military staff, intelligence operatives, and employees at nuclear weapons sites. Using information originally gleaned from the app, the investigators were able to find the names and addresses of personnel at intelligence agencies such as the NSA and the Secret Service, the U.K.’s GCHQ and MI6, and the Netherlands’ MIVD.
“We found this information not through hacking or some other technological wizardry, but through a little clever searching in the online map that Polar makes available to anyone with an account,” the researchers wrote for De Correspondent. They tested other apps such as Strava, Runkeeper, and Endomondo as well; identifying users’ home addresses with those apps was still possible but far more difficult, and impossible for users whose profiles were set to private.
According to Polar, user accounts are set to private by default, and only 2 percent of its user base shares workouts to its activity map. However, some nominally private users still shared individual workouts publicly, and each of those activities carried a user ID that could be linked across multiple publicly uploaded activities. Using the app’s search feature, the researchers could match a user ID to a name and location, then pull up that person’s entire activity history, which offered clues to their home address even if they had set their profile to private. Working in reverse, the researchers could also start from a specific location, identify users whose activities typically began there, and then deduce their identities.
Once made aware of these revelations, Polar temporarily suspended its Explore API and the Flow Explore feature, the tools in its app that allowed the researchers to gain insight into users’ locations. Going forward, the company explained in a statement, it is evaluating “the best options that will allow Polar customers to continue using the Explore feature while taking additional measures to remind customers to avoid publicly sharing GPS files of sensitive locations.”
Strava, too, made changes to its app following its privacy debacle. The app removed an Activity Search function, which let users browse activities posted by others.* The tool had been particularly useful for those visiting a new area, allowing them to see routes others had run or ridden before. The app has also taken pains to ensure the public data that populates its Global Heat Map is accurate, and it updates the heat map monthly to account for changes users have made to their activity privacy settings.
Both recent incidents point to some glaring issues with activity-sharing apps. The first is that their fundamental utility and usability are tied to user data being open. To learn from, comment on, or like others’ activities, you typically need to relinquish some level of privacy. If your workouts start from home, that can mean revealing your home address, or at least its general area, to friends and strangers alike. And while that may seem innocuous, it has made some activity-sharing cyclists targets of bicycle theft. The second is that despite the numerous settings in place to give users better control over their privacy and data sharing, many don’t take advantage of those tools. Whether that’s due to a lack of digital literacy, a lack of instruction from the apps, or simply user apathy, it’s clear that there are inherent security problems with posting location-based activities to social networks.
Users of these kinds of apps have a few options. They can keep sharing their activities publicly, knowing that home or work addresses may be discoverable by anyone with basic internet-sleuthing skills. If they share details such as what equipment they use, they risk making themselves targets for theft. If they share activities privately, either with no one or with an approved group of contacts, their data could still inadvertently be exposed through screen grabs or general mishandling of user data. Privacy-conscious users could choose an app without a social-sharing angle, such as TrainingPeaks, Google Fit, or Apple Health. But in nearly any case, uploading activities to a cloud-connected service leaves the information vulnerable to a data breach.
Handling activity data in the context of a social app becomes a complex issue when apps decide to use that information to provide additional services such as user searching, activity searching, or activity heat maps. Minor-seeming oversights can turn into state-level security fiascos. If you’re a top-secret intelligence operative, you simply shouldn’t use one of these services. The rest of us should only use them with full knowledge of the privacy risks.
Correction, July 24: This story misidentified the steps Strava has taken to update its app’s privacy settings.