After coming under fire for admitting that human reviewers listen to “snippets” of Google Assistant recordings, the Mountain View tech giant has stopped the potentially invasive practice of saving voice recordings by default. Instead, Google Assistant now uses an opt-in system in which users must agree to the recording before Google can save it.
Now, users must opt in to the Voice & Audio Activity (VAA) program when they set up Google Assistant before Google can save recordings of their interactions with the assistant. The opt-in flow also includes a disclaimer stating that saved recordings may be reviewed by human listeners.
Users who have already set up Google Assistant and are existing Voice Match users will be prompted to reconfirm their agreement to have their recordings saved by Google. They can also opt out of the program when asked to reconfirm.
The Voice & Audio Activity (VAA) setting, when enabled, “helps improve the Assistant for everyone by allowing us to use small samples of audio to understand more languages and accents.” After the controversy surfaced, Google said that users could review their VAA settings and adjust their privacy preferences accordingly.
“If you’re an existing Assistant user, you’ll have the option to review your VAA setting and confirm your preference before any human review process resumes. We won’t include your audio in the human review process unless you’ve reconfirmed your VAA setting as on,” Google said.
“One of the principles we strive toward is minimizing the amount of data we store, and we’re applying this to the Google Assistant as well. We’re also updating our policy to reduce the amount of audio data we store vastly.”
The changes to Google Assistant’s human review policy come after the company, along with other tech giants like Amazon and Apple, came under fire when it was revealed that these companies had hired human contractors to listen to recordings made by their AI assistants without users’ knowledge.
The companies defended their policies by saying that reviewing the recordings is necessary for the artificial intelligence to improve. The human reviewers’ role is to rate the assistants’ performance in areas such as the accuracy of responses and the ability to recognize the device owner’s voice.
Amid the controversy surrounding tech companies and their handling of consumer data, Google clarified that it had anonymized the recordings and that reviewers listen only to “snippets,” which it noted were also randomly selected.
“We partner with language experts around the world to improve speech technology by transcribing a small set of queries – this work is critical to developing technology that powers products like the Google Assistant,” Google said.
“Language experts only review around 0.2% of all audio snippets, and these snippets are not associated with user accounts as part of the review process.”
The issue came to light after one of these reviewers exposed the operation by leaking consumer audio data from several European countries.
“We just learned that one of these reviewers had violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again,” they added.
After the issue blew up, Google initially halted the operation and stopped saving recordings made by users through Google Assistant. The company also announced that it would reduce the assistant’s sensitivity to the “Hey Google” command to prevent unintended recordings.