The Guardian revealed that Apple’s contractors ‘regularly’ hear private information on Siri, including drug deals, medical details and people having sex. Photograph: Oli Scarff/Getty Images

Apple halts practice of contractors listening in to users on Siri


Tech firm to review virtual assistant ‘grading’ programme after Guardian revelations

Apple has suspended its practice of having human contractors listen to users’ Siri recordings to “grade” them, following a Guardian report revealing the practice.

The company said it would not restart the programme until it had conducted a thorough review of the practice. It has also committed to adding the ability for users to opt out of the quality assurance scheme altogether in a future software update.

Apple said: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Contractors working for Apple in Ireland said they were not told about the decision when they arrived for work on Friday morning, but were sent home for the weekend after being told the system they used for the grading “was not working” globally. Only managers were asked to stay on site, the contractors said, adding that they had not been told what the suspension means for their future employment.

The suspension was prompted by a report in the Guardian last week that revealed the company’s contractors “regularly” hear confidential and private information while carrying out the grading process, including in-progress drug deals, medical details and people having sex.

The bulk of that confidential information was recorded through accidental triggers of the Siri digital assistant, a whistleblower told the Guardian. The Apple Watch was particularly susceptible to such accidental triggers, they said. “The regularity of accidental triggers on the watch is incredibly high … The watch can record some snippets that will be 30 seconds – not that long, but you can gather a good idea of what’s going on.”

Sometimes, the Apple contractor said, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

Although Apple told users that Siri data may be used “to help Siri … understand you better and recognise what you say”, the company did not explicitly disclose that this entailed human contractors listening to a random selection of Siri recordings, including those triggered accidentally.

“Too often we see that so-called ‘smart assistants’ are in fact eavesdropping,” said Silkie Carlo, the director of the UK campaign group Big Brother Watch. “We also see that they often collect and use people’s personal information in ways that people do not know about and cannot control.”

She added: “Apple’s record on privacy is really slipping. The current iOS does not allow users to opt out of face recognition on photos, and this revelation about Siri means our iPhones were listening to us without our knowledge.”

The company is not alone in taking flak for its undisclosed quality assurance programmes. Both Amazon and Google also use contractors to check the quality of their voice assistants, according to reports in Bloomberg and on the Belgian TV channel VRT, and contractors from both companies have expressed discomfort at the nature of overheard recordings.

The revelation of Google’s programme caused particular concern because the news report was accompanied by a leak of more than 1,000 audio clips, at least one in 10 of which had been captured accidentally.

That prompted the data protection commissioner for Hamburg in Germany to ban Google from carrying out those checks for three months, citing a likely breach of the general data protection regulation (GDPR). The commissioner cited an “urgent need to protect the rights and freedoms” of Google Home users in enacting the temporary ban, which would otherwise be the responsibility of the Irish data protection commissioner, the lead regulator for both Google and Apple in the EU. Google said it had already ceased the practice on 10 July, when it learned about the leak of audio clips.

Amazon and Apple escaped the ban, Hamburg’s commissioner said, because those companies have their German head offices in Munich and are thus covered by a different commissioner under that country’s privacy laws. But the commissioner called on other regulators to “quickly check for other providers of language assistance systems, such as Apple or Amazon, to implement appropriate measures”.

Amazon is the only major provider of voice assistant technology still using humans to check recordings in the EU.
