First Amazon, then Google and now Apple: it seems that every company with a digital assistant to its name uses humans to review a selection of the interactions users are having with their speakers and smartphones.
After a whistleblower informed the Guardian about the practice regarding Siri, Apple confirmed that "a small portion of Siri requests" are reviewed by contract workers, although the recordings are not linked to the user's Apple ID.
Information such as location, contact details, and other app data is captured and included with the recordings, according to the anonymous informant who contacted the Guardian.
"Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," says Apple, adding that less than 1% of daily Siri requests are reviewed this way.
The Guardian's source says many of the recordings come from accidental activations: conversations involving medical details, drug deals and sexual encounters were apparently captured and reviewed.
Just as with Amazon's Alexa and Google Assistant, the purpose of this review process is to improve accuracy, says Apple: workers have to grade the clips, usually just a few seconds long, to see whether Siri handled the interaction properly.
It is still a little disconcerting that real-life humans may be listening in on your daily chatter whenever Siri is within earshot. There is currently no way to opt out of having your recordings reviewed this way.
Given Apple's focus on user privacy, the company may need to take steps to further anonymize recordings before they are reviewed, or to give users more choice over how their recordings are processed. As is so often the case, though, it took investigative reporting to bring the practice to light.