
Back on July 7th, we reported on the revelation that Amazon’s Alexa devices were recording everything that was being said, often storing the data until it was deleted manually. Only a week later, we highlighted how Google was recording private conversations, often without a ‘wake word’ being given to wake the smart speaker up first.

Well, now Apple’s voice assistant Siri has joined the controversy. So, if you’re using a smart speaker in the home, your conversations may be recorded, and even stored, without you realising.

Siri ‘listening in’ on voice commands

According to reports in The Guardian newspaper, Apple has been found to listen in on voice commands issued via Siri, its voice assistant.

According to The Guardian, short snippets of audio recordings are forwarded to contractors tasked with reviewing the audio and grading it for accuracy. Supposedly, this involves determining whether the activation was accidental, as well as whether Siri successfully fulfilled the request.

‘There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad (…) It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.’

One of the sources for The Guardian
An Apple Watch, with built-in Siri voice assistant

But much like the issues discovered with both Google and Amazon’s devices, it’s said that Apple’s data collection has also unintentionally scooped up audio recordings that some would consider ‘private’. According to an unnamed source in the report, these recordings include sexual discussions, criminal activity, and confidential doctor-patient conversations.

‘A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID,’ said the company (…) ‘Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.’

Apple responds to The Guardian

One of the sources that provided information to The Guardian reportedly said that the most common offenders are the Apple Watch and the Apple HomePod. To quote this source directly, the “regularity of accidental triggers on the Watch is incredibly high.” This is likely due to the way the microphone is activated on the Watch, which can happen simply by raising the device towards your face.

Revelations like these are bound to come up from time to time given the explosion in smart home tech. It’s unlikely that there are any sinister motives behind this data capture, especially when it’s linked to three of the biggest hardware and software companies around. But it highlights how careful we should be about the security risks involved with tech that hasn’t existed for long.