A week after a report in The Guardian revealed that human contractors reviewing Siri recordings might be listening in on private and even illegal activity, Apple announced that it will suspend the program for a review and is already working on a software update that will let users completely disable this "grading". The news comes from The Verge.
“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading,” Apple explained in a statement.
What happens is that every time we say “Hey Siri”, the voice command is processed on the device, but it is also semi-anonymized and sent to the cloud. A small percentage of this data is used to help train the neural network that lets Siri (and Apple's dictation feature) accurately understand what we are saying. Someone, somewhere in the world, listens to some of these "Hey Siri" commands and notes whether the assistant understood the person correctly or not.
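The sampling-and-grading flow described above can be sketched roughly like this. This is purely illustrative: the function names, the sampling rate, and the record format are assumptions for the sketch, not Apple's actual pipeline.

```python
import hashlib
import random

# Hypothetical sampling rate: only a small fraction of requests is graded.
GRADING_SAMPLE_RATE = 0.002

def semi_anonymize(request):
    # Strip direct identifiers, keeping only an opaque device ID
    # (illustrative of "semi-anonymized": not tied to a name, but
    # requests from the same device could still be linked together).
    device_id = hashlib.sha256(request["user"].encode()).hexdigest()[:8]
    return {"device_id": device_id,
            "audio": request["audio"],
            "transcript": request["transcript"]}

def should_sample(rng):
    # Only a small percentage of requests is routed to human graders.
    return rng.random() < GRADING_SAMPLE_RATE

def grade(sample, human_transcript):
    # A human listens to the audio and notes whether the assistant
    # understood correctly, producing a labeled training example.
    return {"sample": sample,
            "correct": sample["transcript"] == human_transcript}
```

The graded examples produced this way are what the next retraining round is measured against.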
The machine learning network is then adjusted and readjusted through millions of permutations. Changes are automatically tested against these “graded” samples until a new ML algorithm produces more accurate results. That neural network then becomes the new baseline, and the process repeats.
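The retrain-and-compare loop could be sketched like this. Again, this is a minimal assumption-laden illustration: `accuracy`, `improve`, and the graded-sample format are made up for the sketch, and the "models" are stand-ins for real speech recognizers.

```python
def accuracy(model, graded_samples):
    # Fraction of human-graded samples the model transcribes correctly.
    hits = sum(model(s["audio"]) == s["truth"] for s in graded_samples)
    return hits / len(graded_samples)

def improve(baseline, graded_samples, candidates):
    # Each candidate model is scored against the graded samples;
    # a candidate replaces the baseline only if it is more accurate.
    # The winner becomes the new baseline and the cycle repeats.
    best, best_acc = baseline, accuracy(baseline, graded_samples)
    for candidate in candidates:
        acc = accuracy(candidate, graded_samples)
        if acc > best_acc:
            best, best_acc = candidate, acc
    return best
```

The key point the article makes is in `accuracy`: the yardstick for every automated retraining round is a set of samples a human has already listened to and labeled.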
Apple, Google, Amazon, Microsoft, and every other company that builds AI assistants use machine learning to recognize speech or detect objects in photos and videos, and they all rely on our smartphones' cameras and microphones in this process.
Human-assisted AI training is a common practice. Tesla's self-driving capabilities are being built by humans training a neural network, reviewing camera data from customers' cars and labeling signs, lanes, other cars, bicycles, and pedestrians. It is not possible to train a high-quality machine learning algorithm without humans overseeing the data.
Source: The Verge