Amazon employs thousands of people around the world to improve Alexa, the assistant bundled with the Amazon Echo line of speakers. Those listening to your commands transcribe the audio, add annotations, and then feed the output back into the software in order to “eliminate gaps” in Alexa’s understanding, which in turn provides a better experience for users.
According to Bloomberg, which spoke to several people involved in the work, the teams consist of full-time employees and contractors based at outposts around the world, in places such as Boston, Costa Rica, India, and Romania. Workers in a nondescript building in Romania, who had signed non-disclosure agreements, said they work through around 1,000 clips per nine-hour shift.
Teams use an internal chat room whenever they struggle to decipher what is being said, although one worker based in Boston said the recordings are mostly monotonous requests for music. That said, Bloomberg alleged that employees do sometimes share amusing recordings in the group chat for the rest of the team to laugh at. At other times, workers say they’ve heard more distressing things, such as what they believed to be a sexual assault; in those cases, the chat room is utilised to relieve their stress.
Amazon claims that it has procedures in place for workers who hear distressing material, but two employees said they had previously been told that it’s not Amazon’s place to interfere. Commenting on these revelations, an Amazon spokesperson said:
“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.
We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it.”
Analysis of voice clips also occurs at Google and Apple. Google says that clips it records from Google Home devices and Android phones are not associated with identifiable information and that it distorts the audio to disguise customers’ voices. In its security policy, Apple says that it also doesn’t link recordings from HomePods, iPhones, or iPads to identifiable information, instead using a random ID number that is reset each time Siri is turned off.
Privacy options vary by company, but Google, for example, lets you head into your My Activity settings to find and delete any audio recordings that have been sent to the cloud. It’s also worth remembering that smart home speakers do not send information to the cloud until they hear their respective wake words. For those concerned about privacy, it may be worthwhile to disable features such as Continued Conversation, where the device continues to listen for a further ten seconds or so after you’ve asked your first query.