This is how to protect your privacy when using smart speaker devices

Fragments of conversations said to smart speaker devices are heard and read by thousands of Amazon employees (Photo: Shutterstock)

Smart speaker devices are a common gadget in most households nowadays, used to make calls, play music, set alarms and ask questions, among other handy functions.

But as impressive as the technology might be, the devices do spark some privacy concerns - and recent reports revealed that snippets of conversations said to smart devices are heard and read by human reviewers.


Transcribed conversations

Owners of Alexa and Echo devices may be surprised to learn that fragments of their conversations are heard and read by thousands of Amazon employees.

Recent reports revealed that the technology company has an international team of employees who work to help the smart devices better understand commands and to develop new ways for the speakers to interact with users.

In order to do this, employees are required to listen to snippets of conversations which are said to the smart devices.

Snippets from conversations are transcribed and annotated by Amazon employees, with the transcripts then used to 'teach' the smart devices to recognise more commands.


In response, an Amazon spokesperson said: "We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience.

"For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

"We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow.

"While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can delete their voice recordings associated with their account at any time.”


How to protect your privacy

It is not possible to completely prevent snippets of your conversations with smart speaker devices from being used for annotation, but there are some settings which tighten your level of privacy.

If you own an Alexa device, you can turn off a feature that allows your recordings to be used in Amazon’s voice study.

To do this, follow these steps:

- Open the Alexa mobile app
- Select the 'Menu' button at the top left of the screen
- Go to Alexa Account > Alexa Privacy > Manage how your data improves Alexa
- Turn off 'Help develop new features' and 'Use messages to improve transcriptions' for all profiles on your account

Doing so will opt you out of some aspects of Amazon's voice study, although the company's team may still analyse some of your recordings 'by hand'.

This article originally appeared on our sister site, The Scotsman
