Real humans listen to what you say to Google Assistant


Your Google Home Mini might not be the only one hearing what you say to it.

When you talk to a Google Home smart speaker, you might have an unexpected eavesdropper. Google admitted today that it uses humans to review a small percentage of voice interactions.

The same is true of other voice-control systems, such as Amazon Alexa and even Siri. But Apple’s system has some crucial differences.

Someone is listening

The world just got a glimpse into the inner workings of Google Assistant. Google said today:

“We partner with language experts around the world to improve speech technology by transcribing a small set of queries — this work is critical to developing technology that powers products like the Google Assistant. Language experts only review around 0.2% of all audio snippets, and these snippets are not associated with user accounts as part of the review process.”

Google had to make this statement after one of those language experts in the Netherlands leaked some recordings. The company says it’s taking steps to prevent a recurrence.

Rival Amazon reportedly employs its own surreptitious eavesdroppers. A report this spring claimed the company uses a team of thousands to listen to recordings made by Echo devices. These reviewers also help improve the service’s ability to transform spoken commands into a usable form.

Google and Amazon listen a lot

All voice recognition systems have an activation code phrase, like “Alexa.” Smart speakers made by Google and Amazon aren’t actually very smart. They have no built-in way to recognize their code phrase, so all audio is routed through remote servers to handle this job.

Apple uses much more on-device intelligence. iPhone, HomePod and even AirPods recognize “Hey Siri” on their own. Only after that activation phrase has been recognized are commands sent to servers to handle the rest of the speech recognition task.
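The difference between the two architectures comes down to where the activation phrase is detected. A minimal sketch of the on-device approach might look like the following; the function names and the string-matching "detector" are purely illustrative stand-ins for a real local keyword-spotting model, not any vendor's actual implementation.

```python
# Illustrative sketch: on-device wake-phrase gating.
# Audio "frames" are checked locally, and only speech heard AFTER the
# wake phrase would ever be sent to a server for full recognition.

WAKE_PHRASE = "hey siri"


def detect_wake_phrase(frame: str) -> bool:
    """Stand-in for a local keyword-spotting model."""
    return WAKE_PHRASE in frame.lower()


def process_audio(frames):
    """Gate audio locally; return only the frames that would be uploaded."""
    uploaded = []
    awake = False
    for frame in frames:
        if not awake:
            # Everything before the wake phrase stays on the device.
            if detect_wake_phrase(frame):
                awake = True
        else:
            # Only post-activation speech goes to the server.
            uploaded.append(frame)
    return uploaded
```

For example, `process_audio(["background chatter", "Hey Siri", "what's the weather"])` returns only `["what's the weather"]`. In the server-side model described above for Google and Amazon, by contrast, the gating step happens after the audio has already left the device.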

Sometimes recordings are saved, but not under the user’s Apple ID. A random device identifier (RDI) is used instead.

All it takes to erase these recordings is to toggle Siri and Dictation off, then on again. That also resets the RDI.

This process was described in a letter on privacy policies Apple sent to Congress last fall.

Via: Yahoo

 


