Although customers theoretically have the final word over their digital assistants, they’re not given the option of blocking the recordings outright. The companies say the data is collected and reviewed to determine how accurately the artificial intelligence devices understand language and interpret requests — especially after the wake word such as “Hey, Siri” or “Alexa” or “Okay, Google” brings it to life.
California Assemblyman Jordan Cunningham (R), who represents San Luis Obispo County and northern Santa Barbara County, told The Washington Post that these companies are giving consumers a false choice between technology and privacy.
“We can protect people’s individual privacy and support the technologies and make sure they flourish and develop,” he said. “I think we can have both.”
Cunningham proposed an “anti-eavesdropping” bill earlier this year that would require the makers of smart speaker devices to get permission from users before recording and to remove identifying information from any personal data. It passed the state Assembly with bipartisan sponsorship and will be heard in the Senate in January. The Illinois Senate passed a similar bill, which is now under review in the state House.
The California lawmaker said his bill would set “a baseline of trust that I think has been breached by these companies” and would give consumers more control over their own data.
Cunningham said he and his wife have six Alexa devices in their home but didn’t know about the recording capability when they purchased them.
“I’m a former prosecutor,” he said. “We had to get warrants to record people’s conversations on their cellphones.”
On Thursday, a parent and child filed a class-action lawsuit in California accusing Apple of “unlawful and intentional recording of individuals’ confidential communications without their consent,” starting in October 2011.
In May 2018, a Portland, Ore., family notified Amazon after a work contact in Seattle told them he had received audio files of their recorded conversations via Alexa, Washington state’s KIRO 7 reported. (Amazon founder and CEO Jeff Bezos owns The Washington Post.)
“I felt invaded,” one of the family members told KIRO 7. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.’”
Shelby Lichliter, a PR manager for Alexa, told the news station that the Echo misheard the wake word.
All three companies allow users to manage their voice recordings after the fact, but each offers a different degree of control. Here’s how it’s done.
Alexa
Earlier this month, Amazon changed its privacy settings to allow users to opt out of its voice recording reviews. Amazon said its Alexa Data Services team reviews 1 percent of user voice recordings and that reviewers don’t have access to data that would link those recordings to user accounts.
Amazon processes voice recordings and sends them to its cloud, where they can be managed through the Alexa app or the user’s Amazon account. Hands-free devices such as the Amazon Echo light up blue or play an audio tone when activated and while recording, the company said (other wake-word options include “Amazon,” “Computer” and “Echo”). With the most recent app update, you can see what’s been recorded via the Alexa app:
- Click Settings > Alexa Privacy
- Select individual entries or all the recordings at once and click “Delete Voice Recordings” for any data you don’t want
You can also access all your Alexa products online and delete voice recordings for each. Click the “…” icon to the left of the device name and select “Delete voice recordings.”
In May, Amazon added a feature that allows users to delete their voice recordings by command: “Alexa, delete what I just said” or “Alexa, delete everything I said today.” To enable this, visit Settings > Alexa Privacy > Review Voice History in the Alexa app or online.
To opt out of Alexa sending voice recordings and data to Amazon:
- Visit the Alexa Privacy page
- Click “Manage how your data improves Alexa”
- Turn off “Help Improve Amazon Services and Develop New Features”
- Turn off “Use Messages to Improve Transcriptions”
Siri and dictation
On Aug. 2, Apple suspended its reviews of voice recordings by human contractors and said users could opt out with a future software update.
According to Apple’s terms for Siri and dictation, what users say and dictate is recorded and sent to Apple, along with other information such as the user’s name, contacts and their relationships to the user, HomeKit-enabled devices in the user’s home, and what other apps are installed on the device. Apple did not respond to requests for comment.
Right now, users can’t access or delete their voice recordings through Siri; they can either stop using Siri or delete their Apple account. However, Apple’s terms say that if both Siri and dictation are disabled, the company will delete user data and recent voice recordings. Older data that has been disassociated from the original user, including audio files, transcripts, the user’s location when the request was made and performance statistics, can still be used to improve Siri and dictation.
To disable Siri in iOS 11+ on Apple devices:
- Settings > Siri & Search
- Turn off “Listen for ‘Hey Siri’” and “Press Side Button for Siri”
- Confirm “Turn Off Siri”
Users can also disable dictation:
- Click Settings > General > Keyboard
- Turn off “Enable Dictation” and confirm
Google Assistant
In mid-July, Google suspended human reviews of Google Assistant voice recordings across the European Union for at least three months, and a German privacy regulator launched an investigation Aug. 1.
“Shortly after we learned about the leaking of confidential Dutch audio data, we paused language reviews of the Assistant to investigate,” a Google spokesperson told The Washington Post. “We are in touch with the Hamburg data protection authority and are assessing how we conduct audio reviews and help our users understand how data is used.”
David Monsees, a product manager, said the company only reviews about 0.2 percent of all voice recordings.
“Audio snippets are not associated with user accounts as part of the review process,” Monsees said.
Users can access their data history from any device tied to their Google account through the Google activity page. Google said voice recordings are disabled by default when users create a Google account and that users must opt in for recordings to be stored in their account.
- Scroll to “Voice & Audio Activity” (if it says “(paused),” it’s already disabled; if not, toggle off the blue slider)
- Click “Manage Activity”
- Use the search bar to filter the history by date or keyword, then delete any entries you don’t want