Whistleblower claims Apple contractors hear 'sensitive' Siri recordings

Most consumers are familiar with the AI-powered nature of personal assistants like Alexa and Siri, but few are aware of one particular privacy concern: audio recordings made available to humans. The latest report on this topic claims Apple contractors tasked with reviewing Siri recordings have heard a variety of sensitive content, including drug deals and personal health information.

As with Google and Amazon, Apple uses human contractors to review a small percentage of user recordings captured by its personal assistant — Siri, in this case. According to The Guardian, those contractors are exposed to 'confidential' data as a result.

The information is said to come from a whistleblower working as a contractor for Apple. This unnamed individual claimed to The Guardian that human contractors are tasked with reviewing a small percentage of recordings in order to determine whether Siri was accidentally activated, among other things.

These manual reviews help improve the quality of the assistant, but users may not be aware that they're taking place. That reality reportedly concerns the whistleblower, who claims the job may expose 'extremely sensitive personal information' about users to human contractors.

This sensitive information is reportedly recorded when Siri is accidentally activated. Ordinarily, users must say the wake phrase 'Hey Siri' to summon the assistant, but if an iPhone or other Siri-enabled device picks up a phrase that sounds similar enough, the assistant may start listening without the user's knowledge.

The contractor claims there are a number of ways Siri can be accidentally activated, including hearing the sound of 'a zip' or an Apple Watch being raised followed by the sound of someone talking. As a result, the report claims Apple contractors have heard recordings of drug deals, 'sexual encounters,' business arrangements, medical discussions with doctors, and more.

The HomePod smart speaker and Apple Watch are allegedly the biggest sources of accidental Siri activations and recordings.

Apple provided the following statement to The Guardian:

A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements.