Future Tense

Microsoft Contractors Have Access to Some Skype Call Recordings and Cortana Requests

A webcam in front of the Skype logo. What service isn’t watching or listening to you? Reuters/Dado Ruvic

You might want to think twice about using Skype’s automated translation software to have a private conversation—or, say, a more intimate chat. One of the contractors working for Microsoft, which owns Skype, could be eavesdropping on snippets of what you presumed was a private call. Add Skype and Cortana to the ever-growing list of voice recognition services that might be (and most likely are) recording you.

According to a report from Vice’s Motherboard, Microsoft contractors working on Skype have access to parts of some users’ personal calls. The third-party employees are reportedly able to listen to five- to 10-second segments of calls, though some of the available sound bites are longer. The contractors can access recordings gathered through Skype’s automated translator and Microsoft’s Cortana voice assistant. Motherboard even obtained a cache of those audio recordings, and that feat alone, the whistleblower who provided them pointed out, should worry anyone concerned about their privacy. Some of the recorded conversations are intimate in nature, covering topics like sex, weight loss, and relationship issues. Other recordings collected through Cortana include addresses—even though Microsoft said it removes personally identifiable information—and search queries about porn. (OK, no judgment at all—but I must ask: Who uses Cortana to look up porn? I suppose if any tool can search the internet, some users will use it for porn. It’s the law.)


Skype and Microsoft’s terms say the company uses some recordings to analyze translations and improve the service. Skype’s translation feature and Cortana are both powered by artificial intelligence. Contractors receive a sample recording and are asked either to select the correct automated translation of it or to provide a better one, feedback that is used to train the system to improve. In a statement to Vice, Microsoft said it “gets customers’ permission before collecting and using their voice data.” But the app’s user agreement does not state that analysis of touchy conversations—some of which were described as “phone sex”—could be performed by third-party human contractors, some of whom work from home and joke with friends about what they hear. And, as the tech website Lifehacker pointed out, there doesn’t appear to be a way to opt out of the Skype translator improvement program. Users can, however, opt out of having their Cortana commands reviewed. That said, the privacy concern appears limited to Skype calls that use the app’s automated translation.


Microsoft’s practices aren’t unique, and this isn’t the first report of this kind. Many apps that use our voices to provide convenient services have to be trained somehow, and most of the time computers can’t train themselves. There are typically humans involved somewhere along the way. Apple, Amazon, and Google faced similar criticisms after it was reported that each used human reviewers and transcribers to improve their respective Siri, Alexa, and Google Assistant services, prompting concerns that recordings of users’ personal requests and private conversations—some including confidential medical information—were being listened to by human ears, too. Apple and Google temporarily suspended their contractor review and analysis practices. Amazon changed its policy to explicitly say that humans might review Alexa recordings and to provide a way for users to opt out. The revelation about Skype, though, is particularly concerning because person-to-person Skype calls are far more likely to be sensitive than an Alexa or Siri request—though those can clearly be sensitive, too.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
