
Apple Workers ‘Regularly Hear Confidential Details’ Via Siri Recordings

Dominique Adams


Contractors hear couples having sex, medical details and drug deals as part of their job providing quality control for Apple’s voice assistant Siri, a whistleblower says.

An Apple contractor has anonymously raised concerns over a lack of disclosure regarding the company’s use of the data it collects via its voice assistant, Siri.

But Apple says the data helps improve Siri’s dictation and its ability to understand users.

The gathered data is reviewed by contractors working for the company around the globe, who grade Siri’s responses on a variety of factors, such as whether the activation was accidental or deliberate, whether the query was something Siri could be expected to help with, and whether its response was appropriate.

Speaking to The Guardian, the whistleblower voiced concern that Apple does not make it explicitly clear that humans listen to the pseudonymised recordings.

Apple told The Guardian: “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”


The company says that less than 1% of daily Siri activations are used for grading, and that the recordings used are generally only a few seconds long.

According to the whistleblower: “The sound of a zip, Siri often hears as a trigger”. The service can also be activated in other ways; for example, when an Apple Watch is raised and then detects speech, the voice assistant activates automatically.

The whistleblower said: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Apple says the Siri data “is not linked to other data that Apple may have from your use of other Apple services”, which, it argues, makes the recordings anonymous and hard to link to other records.

Accidental activations are responsible for most mistaken recordings of highly sensitive data, and according to the anonymous source the Apple Watch and HomePod smart speaker are often the source of these clips. “The regularity of accidental triggers on the watch is incredibly high,” the whistleblower said.

“The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on. You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal…you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

The contractor said that staff were encouraged to report such incidents, “but only as a technical problem”, as there are no specific procedures in place to handle them. “We’re encouraged to hit targets, and get through work as fast as possible. The only function for reporting what you’re listening to seems to be for technical problems. There’s nothing about reporting the content.”

Aside from the discomfort the workers felt listening to the clips, the whistleblower felt the need to go public over fears the data could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” they said. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The contractor felt that, due to the high number of accidental recordings, the company should be more explicit with its customers that human oversight exists. They also said the company should stop publicising some of Siri’s jokier responses, as they are misleading. For example, Siri responds to “are you always listening?” with “I only listen when you’re talking to me.”

Amazon and Google both have similar practices and have been criticised for allowing staff to listen to recordings of their customers. However, unlike Apple, the other two allow customers to opt out of some uses of their recordings; Apple offers no comparable opt-out short of disabling Siri entirely. This lack of clarity has been called hypocritical by some, since one of Apple’s market differentiators is that it protects its users’ privacy more than its competitors do.


Dominique Adams

Staff Writer, DIGIT
