Apple announced today that it plans to make some changes to its Siri voice-controlled intelligent assistant for iPhone, iPad, Mac, Apple TV, and Apple Watch devices to protect users’ privacy.
If you’ve been following the news lately, you may know that Apple recently came under fire over reports that contractors were listening to users’ private conversations through Siri audio recordings sent to Apple as part of its quality evaluation process for Siri, which the Cupertino, California-based company calls grading. In response, Apple immediately suspended human grading of Siri requests and apologized for the mishap.
“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result,” said Apple in a press release.
The grading system, designed to make the Siri intelligent assistant more reliable, involved Apple employees and employees of a third-party contractor reviewing a small sample of audio from Siri requests, as well as their computer-generated transcripts. Apple says the data stored on its servers was never used to build a marketing profile, nor was it sold to third parties.
How Apple is making Siri more privacy-aware
Apple is always keen to develop new technologies that improve its software and devices, and later this fall, when the upcoming iOS 13, iPadOS 13, macOS Catalina 10.15, watchOS 6, and tvOS 13 operating systems are released, the tech giant aims to resume its grading program for improving Siri with new privacy-aware changes that put the customer at the center.
These include no longer retaining audio recordings of Siri interactions by default, giving users the ability to opt in to sending audio samples of their requests to Apple to help improve the intelligent assistant, as well as the ability to opt out whenever they want, and allowing only Apple employees to listen to audio samples of Siri interactions, which will be deleted when they’re determined to be an unintended trigger of Siri.