A few weeks after the leak that exposed the Siri improvement program, new information has emerged regarding the rules behind Apple's voice assistant. Documents provided to The Guardian by the same external contractor indicate that Siri is designed to stay neutral on questions that could be controversial.
The documents reveal that developers follow three rules for responses: don't engage, deflect, and inform. When you ask Siri what it thinks of feminism, its answer is "I believe in equality. All voices are worthy of the same respect." Other topics, such as white supremacy or harassment, are deflected with responses like "I was thinking of fire trucks."
The assistant's responses have changed over time, and many of those on controversial topics have been rewritten to address the question without engaging directly.
The leaked document says that Siri has no point of view, is not human, and has no gender. Its prime directive is to be helpful at all times, and it aspires to follow Isaac Asimov's three laws of robotics, as adapted by Apple along the following lines:
- An artificial being must not represent itself as human, nor through omission allow the user to believe that it is one.
- An artificial being must not violate the human ethical and moral standards commonly held in its region of operation.
- An artificial being must not impose its own principles, values, or opinions on a human.
In response to this new leak, Apple replied that it works hard to ensure Siri's answers are relevant to all its customers: "Our approach is to be factual with inclusive responses rather than offer opinions."
The information is part of a compendium of documents sent to the British newspaper by a contractor who worked on the Siri improvement program.
After it was revealed that employees listened to audio clips collected during interactions with the assistant, Apple decided to suspend the program and apologized for not being transparent about its evaluation and improvement processes.