Amazon has repeatedly denied that Alexa, an increasingly common feature in smartphones, third-party speakers and other connected devices as well as the company's own Echo smart speaker range, is always listening to user conversations, reiterating that it only begins recording audio once it hears its default wake word, "Alexa".
According to Bloomberg, teams of employees listen to voice recordings captured by the Echo system and feed transcripts back into the software "to eliminate gaps in Alexa's understanding of human speech". Ex-Amazon workers also said that amusing clips are shared among employees worldwide in an internal Amazon chat room.
Even more concerning is the claim that these workers have been privy to recordings of "possibly criminal" acts.
But Amazon said it takes the "security and privacy of our customers' personal information seriously". These review teams have access to voice recordings from real customers using Alexa-powered devices in their homes and workplaces (only Echo speakers are directly mentioned in the report, though Alexa also runs on mobile phones and numerous third-party devices).
Amazon's Alexa voice system continually "listens" for a chosen trigger word, such as "Alexa", to initiate a request. Today's revelation at least partly confirms the validity of long-standing privacy concerns about such always-listening devices.
We've long known that smart devices can be triggered accidentally.
The employees include both full-time Amazon staff and contract workers located across the globe, from Boston to India, the news service notes. In one instance, two workers heard what sounded like a possible sexual assault but were told that it wouldn't be appropriate to intervene.
"We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system," the spokesperson said.
But the company insisted that "all information is treated with high confidentiality", using "multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it".
The moderators reportedly have a chatroom where they can ask for help, or just share "amusing" recordings. All companies say the clips lack personally identifiable information.
Bloomberg said that Alexa auditors don't have access to a customer's full name or address, but do have the device's serial number and the Amazon account number associated with the device.
Moreover, Amazon Echo users have the option of disabling the use of their voice recordings for training purposes; instructions are available on Amazon.com's Alexa privacy settings FAQ page. Apple has disclosed that it stores recorded information for six months, after which the audio is stripped of all identifying information but can still be used for machine learning.
The NSA also made thousands more requests for user information during that time period, and an I-Team review found the tech companies handed over the information requested by local and federal agencies in more than two-thirds of those cases.