Apple apologized this Wednesday for the privacy issues of Siri, its smart assistant. It did so hours after the dismissal of 300 workers hired in Cork, Ireland, to listen to anonymous audio recordings that often contain personal data.
"We know that customers have had concerns about recent reports of people listening to Siri audio recordings as part of our Siri quality evaluation process (in English, 'grading'). In response to those concerns, we immediately suspended human evaluation of Siri requests and began a thorough review of our practices and policies. As a result, we have decided to make some changes to Siri."
"We have begun a thorough review of our practices and policies"
"For that, we apologize"
In the published statement, Apple reviews at length the privacy protections it says it applies to Siri and explains how the data that users provide to the assistant makes it better.
They point out, for example, that when Siri data is stored on their servers, they do not use it "to create a marketing profile and we never sell it to anyone." In addition, they say, Siri uses "the least amount of data possible" to deliver an accurate result, and for greater protection, after six months "the device's data is dissociated from the random identifier."
"We have not fully lived up to our high ideals, and for that we apologize"
However, they emphasize that for the assistant to carry out some of its personalized tasks, it "collects and stores certain information" from users' devices. "Siri also relies on data from your interactions with it," they add, which includes "the audio of your request and a computer-generated transcript of that audio."
And they continue: "Apple sometimes uses the audio recording of a request, as well as its transcript, in a machine learning process that 'trains' Siri to improve."
Despite the explanations, Apple acknowledges that, as a result of its review, it has concluded that it has not "fully lived up to our high ideals, and for that we apologize." And it announces that the Siri evaluation program, currently suspended, will resume in the fall.
What will change at Apple?
Before resuming the so-called Siri quality evaluation program in a few months, Apple announces that it will make three changes.
First, by default, "we will no longer keep audio recordings of interactions with Siri." They will, however, continue using computer-generated transcripts to help improve Siri.
Second, they point out that users will be able to opt in or out of helping improve Siri by letting it learn from audio samples of their requests. "We hope that many people will choose to help Siri improve, knowing that Apple respects their data and has strong privacy controls in place," they note.
Third, they note that if a user opts in to help improve Siri, only Apple employees will be able to listen to the audio samples of Siri interactions. Subcontractors such as those dismissed in Ireland will therefore not have access to the recordings. "Our team will delete any recording determined to have been an inadvertent trigger of Siri," they say.